00:00:00.001 Started by upstream project "autotest-per-patch" build number 126246 00:00:00.001 originally caused by: 00:00:00.001 Started by user sys_sgci 00:00:00.022 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.023 The recommended git tool is: git 00:00:00.023 using credential 00000000-0000-0000-0000-000000000002 00:00:00.025 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.037 Fetching changes from the remote Git repository 00:00:00.040 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.059 Using shallow fetch with depth 1 00:00:00.059 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.059 > git --version # timeout=10 00:00:00.079 > git --version # 'git version 2.39.2' 00:00:00.079 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.107 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.107 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:02.914 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:02.924 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:02.934 Checking out Revision 7caca6989ac753a10259529aadac5754060382af (FETCH_HEAD) 00:00:02.935 > git config core.sparsecheckout # timeout=10 00:00:02.944 > git read-tree -mu HEAD # timeout=10 00:00:02.959 > git checkout -f 7caca6989ac753a10259529aadac5754060382af # timeout=5 00:00:02.976 Commit message: "jenkins/jjb-config: Purge centos leftovers" 00:00:02.976 > git rev-list --no-walk 
7caca6989ac753a10259529aadac5754060382af # timeout=10 00:00:03.054 [Pipeline] Start of Pipeline 00:00:03.066 [Pipeline] library 00:00:03.067 Loading library shm_lib@master 00:00:03.067 Library shm_lib@master is cached. Copying from home. 00:00:03.081 [Pipeline] node 00:00:03.099 Running on WFP50 in /var/jenkins/workspace/crypto-phy-autotest 00:00:03.100 [Pipeline] { 00:00:03.110 [Pipeline] catchError 00:00:03.111 [Pipeline] { 00:00:03.120 [Pipeline] wrap 00:00:03.126 [Pipeline] { 00:00:03.131 [Pipeline] stage 00:00:03.132 [Pipeline] { (Prologue) 00:00:03.315 [Pipeline] sh 00:00:03.593 + logger -p user.info -t JENKINS-CI 00:00:03.613 [Pipeline] echo 00:00:03.614 Node: WFP50 00:00:03.623 [Pipeline] sh 00:00:03.916 [Pipeline] setCustomBuildProperty 00:00:03.928 [Pipeline] echo 00:00:03.929 Cleanup processes 00:00:03.933 [Pipeline] sh 00:00:04.213 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:04.213 2533023 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:04.225 [Pipeline] sh 00:00:04.503 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:04.503 ++ grep -v 'sudo pgrep' 00:00:04.503 ++ awk '{print $1}' 00:00:04.503 + sudo kill -9 00:00:04.503 + true 00:00:04.514 [Pipeline] cleanWs 00:00:04.522 [WS-CLEANUP] Deleting project workspace... 00:00:04.522 [WS-CLEANUP] Deferred wipeout is used... 
00:00:04.527 [WS-CLEANUP] done 00:00:04.530 [Pipeline] setCustomBuildProperty 00:00:04.542 [Pipeline] sh 00:00:04.820 + sudo git config --global --replace-all safe.directory '*' 00:00:04.918 [Pipeline] httpRequest 00:00:04.934 [Pipeline] echo 00:00:04.935 Sorcerer 10.211.164.101 is alive 00:00:04.942 [Pipeline] httpRequest 00:00:04.946 HttpMethod: GET 00:00:04.946 URL: http://10.211.164.101/packages/jbp_7caca6989ac753a10259529aadac5754060382af.tar.gz 00:00:04.947 Sending request to url: http://10.211.164.101/packages/jbp_7caca6989ac753a10259529aadac5754060382af.tar.gz 00:00:04.953 Response Code: HTTP/1.1 200 OK 00:00:04.954 Success: Status code 200 is in the accepted range: 200,404 00:00:04.954 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_7caca6989ac753a10259529aadac5754060382af.tar.gz 00:00:06.977 [Pipeline] sh 00:00:07.259 + tar --no-same-owner -xf jbp_7caca6989ac753a10259529aadac5754060382af.tar.gz 00:00:07.531 [Pipeline] httpRequest 00:00:07.555 [Pipeline] echo 00:00:07.556 Sorcerer 10.211.164.101 is alive 00:00:07.564 [Pipeline] httpRequest 00:00:07.568 HttpMethod: GET 00:00:07.568 URL: http://10.211.164.101/packages/spdk_4903ec64931a45acf5574198c8bdcba451fd8d13.tar.gz 00:00:07.569 Sending request to url: http://10.211.164.101/packages/spdk_4903ec64931a45acf5574198c8bdcba451fd8d13.tar.gz 00:00:07.585 Response Code: HTTP/1.1 200 OK 00:00:07.585 Success: Status code 200 is in the accepted range: 200,404 00:00:07.586 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_4903ec64931a45acf5574198c8bdcba451fd8d13.tar.gz 00:00:28.027 [Pipeline] sh 00:00:28.311 + tar --no-same-owner -xf spdk_4903ec64931a45acf5574198c8bdcba451fd8d13.tar.gz 00:00:32.514 [Pipeline] sh 00:00:32.798 + git -C spdk log --oneline -n5 00:00:32.798 4903ec649 ublk: use spdk_read_sysfs_attribute_uint32 to get max ublks 00:00:32.798 94c9ab717 util: add spdk_read_sysfs_attribute_uint32 00:00:32.798 a940d3681 util: add spdk_read_sysfs_attribute 
00:00:32.798 f604975ba doc: fix deprecation.md typo 00:00:32.798 a95bbf233 blob: set parent_id properly on spdk_bs_blob_set_external_parent. 00:00:32.812 [Pipeline] } 00:00:32.829 [Pipeline] // stage 00:00:32.840 [Pipeline] stage 00:00:32.842 [Pipeline] { (Prepare) 00:00:32.862 [Pipeline] writeFile 00:00:32.882 [Pipeline] sh 00:00:33.164 + logger -p user.info -t JENKINS-CI 00:00:33.181 [Pipeline] sh 00:00:33.467 + logger -p user.info -t JENKINS-CI 00:00:33.479 [Pipeline] sh 00:00:33.759 + cat autorun-spdk.conf 00:00:33.759 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:33.759 SPDK_TEST_BLOCKDEV=1 00:00:33.759 SPDK_TEST_ISAL=1 00:00:33.759 SPDK_TEST_CRYPTO=1 00:00:33.759 SPDK_TEST_REDUCE=1 00:00:33.759 SPDK_TEST_VBDEV_COMPRESS=1 00:00:33.759 SPDK_RUN_UBSAN=1 00:00:33.766 RUN_NIGHTLY=0 00:00:33.773 [Pipeline] readFile 00:00:33.805 [Pipeline] withEnv 00:00:33.807 [Pipeline] { 00:00:33.849 [Pipeline] sh 00:00:34.132 + set -ex 00:00:34.132 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]] 00:00:34.132 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:00:34.132 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:34.132 ++ SPDK_TEST_BLOCKDEV=1 00:00:34.132 ++ SPDK_TEST_ISAL=1 00:00:34.132 ++ SPDK_TEST_CRYPTO=1 00:00:34.132 ++ SPDK_TEST_REDUCE=1 00:00:34.132 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:00:34.132 ++ SPDK_RUN_UBSAN=1 00:00:34.132 ++ RUN_NIGHTLY=0 00:00:34.132 + case $SPDK_TEST_NVMF_NICS in 00:00:34.132 + DRIVERS= 00:00:34.132 + [[ -n '' ]] 00:00:34.132 + exit 0 00:00:34.143 [Pipeline] } 00:00:34.163 [Pipeline] // withEnv 00:00:34.168 [Pipeline] } 00:00:34.185 [Pipeline] // stage 00:00:34.194 [Pipeline] catchError 00:00:34.196 [Pipeline] { 00:00:34.209 [Pipeline] timeout 00:00:34.209 Timeout set to expire in 40 min 00:00:34.210 [Pipeline] { 00:00:34.224 [Pipeline] stage 00:00:34.226 [Pipeline] { (Tests) 00:00:34.240 [Pipeline] sh 00:00:34.519 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest 00:00:34.519 ++ 
readlink -f /var/jenkins/workspace/crypto-phy-autotest 00:00:34.519 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest 00:00:34.519 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]] 00:00:34.519 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:34.519 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output 00:00:34.519 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]] 00:00:34.519 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:00:34.519 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output 00:00:34.519 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:00:34.519 + [[ crypto-phy-autotest == pkgdep-* ]] 00:00:34.519 + cd /var/jenkins/workspace/crypto-phy-autotest 00:00:34.519 + source /etc/os-release 00:00:34.519 ++ NAME='Fedora Linux' 00:00:34.519 ++ VERSION='38 (Cloud Edition)' 00:00:34.519 ++ ID=fedora 00:00:34.519 ++ VERSION_ID=38 00:00:34.519 ++ VERSION_CODENAME= 00:00:34.519 ++ PLATFORM_ID=platform:f38 00:00:34.519 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)' 00:00:34.519 ++ ANSI_COLOR='0;38;2;60;110;180' 00:00:34.519 ++ LOGO=fedora-logo-icon 00:00:34.519 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38 00:00:34.519 ++ HOME_URL=https://fedoraproject.org/ 00:00:34.519 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/ 00:00:34.519 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:00:34.519 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:00:34.519 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:00:34.519 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38 00:00:34.519 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:00:34.519 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38 00:00:34.519 ++ SUPPORT_END=2024-05-14 00:00:34.519 ++ VARIANT='Cloud Edition' 00:00:34.519 ++ VARIANT_ID=cloud 00:00:34.519 + uname -a 00:00:34.519 Linux spdk-wfp-50 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux 00:00:34.519 + sudo 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:00:37.800 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:00:37.800 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:00:37.800 Hugepages 00:00:37.800 node hugesize free / total 00:00:37.800 node0 1048576kB 0 / 0 00:00:37.800 node0 2048kB 0 / 0 00:00:37.800 node1 1048576kB 0 / 0 00:00:37.800 node1 2048kB 0 / 0 00:00:37.800 00:00:37.800 Type BDF Vendor Device NUMA Driver Device Block devices 00:00:37.800 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:00:37.800 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:00:37.800 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:00:37.800 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:00:37.800 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:00:37.800 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:00:37.800 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:00:37.800 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:00:37.800 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1 00:00:37.800 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:00:37.800 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:00:37.800 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:00:37.800 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:00:37.800 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:00:37.800 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:00:37.800 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:00:37.800 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:00:37.800 VMD 0000:85:05.5 8086 201d 1 vfio-pci - - 00:00:37.800 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - - 00:00:37.800 + rm -f /tmp/spdk-ld-path 00:00:37.800 + source autorun-spdk.conf 00:00:37.800 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:37.800 ++ SPDK_TEST_BLOCKDEV=1 00:00:37.800 ++ SPDK_TEST_ISAL=1 00:00:37.800 ++ SPDK_TEST_CRYPTO=1 00:00:37.800 ++ SPDK_TEST_REDUCE=1 00:00:37.800 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:00:37.800 ++ SPDK_RUN_UBSAN=1 00:00:37.800 ++ RUN_NIGHTLY=0 00:00:37.800 + (( 
SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:00:37.800 + [[ -n '' ]] 00:00:37.800 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:38.058 + for M in /var/spdk/build-*-manifest.txt 00:00:38.058 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:00:38.058 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:00:38.058 + for M in /var/spdk/build-*-manifest.txt 00:00:38.058 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:00:38.058 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:00:38.058 ++ uname 00:00:38.058 + [[ Linux == \L\i\n\u\x ]] 00:00:38.058 + sudo dmesg -T 00:00:38.058 + sudo dmesg --clear 00:00:38.058 + dmesg_pid=2533995 00:00:38.058 + [[ Fedora Linux == FreeBSD ]] 00:00:38.058 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:38.058 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:38.058 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:00:38.058 + [[ -x /usr/src/fio-static/fio ]] 00:00:38.058 + export FIO_BIN=/usr/src/fio-static/fio 00:00:38.058 + FIO_BIN=/usr/src/fio-static/fio 00:00:38.058 + sudo dmesg -Tw 00:00:38.058 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:00:38.058 + [[ ! 
-v VFIO_QEMU_BIN ]] 00:00:38.058 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:00:38.058 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:38.058 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:38.058 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:00:38.058 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:38.058 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:38.058 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:00:38.058 Test configuration: 00:00:38.058 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:38.058 SPDK_TEST_BLOCKDEV=1 00:00:38.058 SPDK_TEST_ISAL=1 00:00:38.058 SPDK_TEST_CRYPTO=1 00:00:38.058 SPDK_TEST_REDUCE=1 00:00:38.058 SPDK_TEST_VBDEV_COMPRESS=1 00:00:38.058 SPDK_RUN_UBSAN=1 00:00:38.058 RUN_NIGHTLY=0 22:29:22 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:00:38.058 22:29:22 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:00:38.058 22:29:22 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:00:38.058 22:29:22 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:00:38.058 22:29:22 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:38.058 22:29:22 -- paths/export.sh@3 -- $ 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:38.058 22:29:22 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:38.058 22:29:22 -- paths/export.sh@5 -- $ export PATH 00:00:38.058 22:29:22 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:38.058 22:29:22 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:00:38.058 22:29:22 -- common/autobuild_common.sh@444 -- $ date +%s 00:00:38.058 22:29:22 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1721075362.XXXXXX 00:00:38.058 22:29:22 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1721075362.7insUP 00:00:38.058 22:29:22 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]] 00:00:38.058 22:29:22 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']' 00:00:38.058 22:29:22 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:00:38.058 
22:29:22 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:00:38.058 22:29:22 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:00:38.058 22:29:22 -- common/autobuild_common.sh@460 -- $ get_config_params 00:00:38.058 22:29:22 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:00:38.058 22:29:22 -- common/autotest_common.sh@10 -- $ set +x 00:00:38.316 22:29:22 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:00:38.316 22:29:22 -- common/autobuild_common.sh@462 -- $ start_monitor_resources 00:00:38.316 22:29:22 -- pm/common@17 -- $ local monitor 00:00:38.316 22:29:22 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:38.316 22:29:22 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:38.316 22:29:22 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:38.316 22:29:22 -- pm/common@21 -- $ date +%s 00:00:38.316 22:29:22 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:38.316 22:29:22 -- pm/common@21 -- $ date +%s 00:00:38.316 22:29:22 -- pm/common@25 -- $ sleep 1 00:00:38.316 22:29:22 -- pm/common@21 -- $ date +%s 00:00:38.316 22:29:23 -- pm/common@21 -- $ date +%s 00:00:38.316 22:29:23 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721075363 00:00:38.316 22:29:23 -- pm/common@21 -- $ 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721075363 00:00:38.316 22:29:23 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721075363 00:00:38.316 22:29:23 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721075363 00:00:38.317 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721075363_collect-vmstat.pm.log 00:00:38.317 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721075363_collect-cpu-load.pm.log 00:00:38.317 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721075363_collect-cpu-temp.pm.log 00:00:38.317 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721075363_collect-bmc-pm.bmc.pm.log 00:00:39.252 22:29:24 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT 00:00:39.252 22:29:24 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:00:39.252 22:29:24 -- spdk/autobuild.sh@12 -- $ umask 022 00:00:39.252 22:29:24 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:39.252 22:29:24 -- spdk/autobuild.sh@16 -- $ date -u 00:00:39.252 Mon Jul 15 08:29:24 PM UTC 2024 00:00:39.252 22:29:24 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:00:39.252 v24.09-pre-213-g4903ec649 00:00:39.252 22:29:24 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:00:39.252 22:29:24 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:00:39.252 22:29:24 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using 
ubsan' 00:00:39.252 22:29:24 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:00:39.252 22:29:24 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:00:39.252 22:29:24 -- common/autotest_common.sh@10 -- $ set +x 00:00:39.252 ************************************ 00:00:39.252 START TEST ubsan 00:00:39.252 ************************************ 00:00:39.252 22:29:24 ubsan -- common/autotest_common.sh@1123 -- $ echo 'using ubsan' 00:00:39.252 using ubsan 00:00:39.252 00:00:39.252 real 0m0.001s 00:00:39.252 user 0m0.001s 00:00:39.252 sys 0m0.000s 00:00:39.252 22:29:24 ubsan -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:00:39.252 22:29:24 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:00:39.252 ************************************ 00:00:39.252 END TEST ubsan 00:00:39.252 ************************************ 00:00:39.252 22:29:24 -- common/autotest_common.sh@1142 -- $ return 0 00:00:39.252 22:29:24 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']' 00:00:39.252 22:29:24 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:00:39.252 22:29:24 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:00:39.253 22:29:24 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]] 00:00:39.253 22:29:24 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:00:39.253 22:29:24 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:00:39.253 22:29:24 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:00:39.253 22:29:24 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]] 00:00:39.253 22:29:24 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared 00:00:39.510 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:00:39.510 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 
00:00:39.769 Using 'verbs' RDMA provider 00:00:56.092 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done. 00:01:10.976 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:01:10.976 Creating mk/config.mk...done. 00:01:10.976 Creating mk/cc.flags.mk...done. 00:01:10.976 Type 'make' to build. 00:01:10.976 22:29:54 -- spdk/autobuild.sh@69 -- $ run_test make make -j72 00:01:10.976 22:29:54 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:01:10.976 22:29:54 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:01:10.976 22:29:54 -- common/autotest_common.sh@10 -- $ set +x 00:01:10.976 ************************************ 00:01:10.976 START TEST make 00:01:10.976 ************************************ 00:01:10.976 22:29:54 make -- common/autotest_common.sh@1123 -- $ make -j72 00:01:10.976 make[1]: Nothing to be done for 'all'. 00:01:57.835 The Meson build system 00:01:57.835 Version: 1.3.1 00:01:57.835 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk 00:01:57.835 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp 00:01:57.835 Build type: native build 00:01:57.835 Program cat found: YES (/usr/bin/cat) 00:01:57.835 Project name: DPDK 00:01:57.835 Project version: 24.03.0 00:01:57.835 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)") 00:01:57.835 C linker for the host machine: cc ld.bfd 2.39-16 00:01:57.835 Host machine cpu family: x86_64 00:01:57.835 Host machine cpu: x86_64 00:01:57.835 Message: ## Building in Developer Mode ## 00:01:57.835 Program pkg-config found: YES (/usr/bin/pkg-config) 00:01:57.835 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh) 00:01:57.835 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh) 
00:01:57.835 Program python3 found: YES (/usr/bin/python3) 00:01:57.835 Program cat found: YES (/usr/bin/cat) 00:01:57.835 Compiler for C supports arguments -march=native: YES 00:01:57.835 Checking for size of "void *" : 8 00:01:57.835 Checking for size of "void *" : 8 (cached) 00:01:57.835 Compiler for C supports link arguments -Wl,--undefined-version: NO 00:01:57.835 Library m found: YES 00:01:57.835 Library numa found: YES 00:01:57.835 Has header "numaif.h" : YES 00:01:57.835 Library fdt found: NO 00:01:57.835 Library execinfo found: NO 00:01:57.835 Has header "execinfo.h" : YES 00:01:57.835 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:01:57.835 Run-time dependency libarchive found: NO (tried pkgconfig) 00:01:57.835 Run-time dependency libbsd found: NO (tried pkgconfig) 00:01:57.835 Run-time dependency jansson found: NO (tried pkgconfig) 00:01:57.835 Run-time dependency openssl found: YES 3.0.9 00:01:57.835 Run-time dependency libpcap found: YES 1.10.4 00:01:57.835 Has header "pcap.h" with dependency libpcap: YES 00:01:57.835 Compiler for C supports arguments -Wcast-qual: YES 00:01:57.835 Compiler for C supports arguments -Wdeprecated: YES 00:01:57.835 Compiler for C supports arguments -Wformat: YES 00:01:57.835 Compiler for C supports arguments -Wformat-nonliteral: NO 00:01:57.835 Compiler for C supports arguments -Wformat-security: NO 00:01:57.835 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:57.835 Compiler for C supports arguments -Wmissing-prototypes: YES 00:01:57.835 Compiler for C supports arguments -Wnested-externs: YES 00:01:57.835 Compiler for C supports arguments -Wold-style-definition: YES 00:01:57.835 Compiler for C supports arguments -Wpointer-arith: YES 00:01:57.835 Compiler for C supports arguments -Wsign-compare: YES 00:01:57.835 Compiler for C supports arguments -Wstrict-prototypes: YES 00:01:57.835 Compiler for C supports arguments -Wundef: YES 00:01:57.835 Compiler for C supports arguments -Wwrite-strings: YES 
00:01:57.835 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:01:57.835 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:01:57.835 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:57.835 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:01:57.835 Program objdump found: YES (/usr/bin/objdump) 00:01:57.835 Compiler for C supports arguments -mavx512f: YES 00:01:57.835 Checking if "AVX512 checking" compiles: YES 00:01:57.835 Fetching value of define "__SSE4_2__" : 1 00:01:57.835 Fetching value of define "__AES__" : 1 00:01:57.835 Fetching value of define "__AVX__" : 1 00:01:57.835 Fetching value of define "__AVX2__" : 1 00:01:57.835 Fetching value of define "__AVX512BW__" : 1 00:01:57.835 Fetching value of define "__AVX512CD__" : 1 00:01:57.835 Fetching value of define "__AVX512DQ__" : 1 00:01:57.835 Fetching value of define "__AVX512F__" : 1 00:01:57.835 Fetching value of define "__AVX512VL__" : 1 00:01:57.835 Fetching value of define "__PCLMUL__" : 1 00:01:57.835 Fetching value of define "__RDRND__" : 1 00:01:57.835 Fetching value of define "__RDSEED__" : 1 00:01:57.835 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:01:57.835 Fetching value of define "__znver1__" : (undefined) 00:01:57.835 Fetching value of define "__znver2__" : (undefined) 00:01:57.835 Fetching value of define "__znver3__" : (undefined) 00:01:57.835 Fetching value of define "__znver4__" : (undefined) 00:01:57.835 Compiler for C supports arguments -Wno-format-truncation: YES 00:01:57.835 Message: lib/log: Defining dependency "log" 00:01:57.835 Message: lib/kvargs: Defining dependency "kvargs" 00:01:57.835 Message: lib/telemetry: Defining dependency "telemetry" 00:01:57.835 Checking for function "getentropy" : NO 00:01:57.835 Message: lib/eal: Defining dependency "eal" 00:01:57.835 Message: lib/ring: Defining dependency "ring" 00:01:57.835 Message: lib/rcu: Defining dependency "rcu" 00:01:57.835 Message: 
lib/mempool: Defining dependency "mempool" 00:01:57.835 Message: lib/mbuf: Defining dependency "mbuf" 00:01:57.835 Fetching value of define "__PCLMUL__" : 1 (cached) 00:01:57.835 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:57.835 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:57.835 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:57.835 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:57.835 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:01:57.835 Compiler for C supports arguments -mpclmul: YES 00:01:57.835 Compiler for C supports arguments -maes: YES 00:01:57.835 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:57.835 Compiler for C supports arguments -mavx512bw: YES 00:01:57.835 Compiler for C supports arguments -mavx512dq: YES 00:01:57.835 Compiler for C supports arguments -mavx512vl: YES 00:01:57.835 Compiler for C supports arguments -mvpclmulqdq: YES 00:01:57.835 Compiler for C supports arguments -mavx2: YES 00:01:57.835 Compiler for C supports arguments -mavx: YES 00:01:57.835 Message: lib/net: Defining dependency "net" 00:01:57.835 Message: lib/meter: Defining dependency "meter" 00:01:57.835 Message: lib/ethdev: Defining dependency "ethdev" 00:01:57.835 Message: lib/pci: Defining dependency "pci" 00:01:57.835 Message: lib/cmdline: Defining dependency "cmdline" 00:01:57.835 Message: lib/hash: Defining dependency "hash" 00:01:57.835 Message: lib/timer: Defining dependency "timer" 00:01:57.835 Message: lib/compressdev: Defining dependency "compressdev" 00:01:57.835 Message: lib/cryptodev: Defining dependency "cryptodev" 00:01:57.835 Message: lib/dmadev: Defining dependency "dmadev" 00:01:57.835 Compiler for C supports arguments -Wno-cast-qual: YES 00:01:57.835 Message: lib/power: Defining dependency "power" 00:01:57.835 Message: lib/reorder: Defining dependency "reorder" 00:01:57.835 Message: lib/security: Defining dependency "security" 00:01:57.835 Has header 
"linux/userfaultfd.h" : YES 00:01:57.835 Has header "linux/vduse.h" : YES 00:01:57.835 Message: lib/vhost: Defining dependency "vhost" 00:01:57.835 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:01:57.835 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary" 00:01:57.835 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:01:57.835 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:01:57.835 Compiler for C supports arguments -std=c11: YES 00:01:57.835 Compiler for C supports arguments -Wno-strict-prototypes: YES 00:01:57.835 Compiler for C supports arguments -D_BSD_SOURCE: YES 00:01:57.835 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES 00:01:57.835 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES 00:01:57.835 Run-time dependency libmlx5 found: YES 1.24.44.0 00:01:57.835 Run-time dependency libibverbs found: YES 1.14.44.0 00:01:57.835 Library mtcr_ul found: NO 00:01:57.835 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES 00:01:57.835 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES 00:01:57.835 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES 00:01:57.835 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES 00:01:57.835 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES 00:01:57.835 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES 00:01:57.835 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES 00:01:57.835 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES 00:01:57.835 Header 
"infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES 00:01:57.835 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES 00:01:57.835 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES 00:01:57.835 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO 00:01:57.835 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO 00:01:57.835 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES 00:01:57.835 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES 
00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO 00:01:58.774 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, 
libibverbs: YES 00:01:58.774 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" 
with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:01:58.774 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:01:58.774 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:01:58.775 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:01:58.775 Configuring mlx5_autoconf.h using configuration 00:01:58.775 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:01:58.775 Run-time dependency libcrypto found: YES 3.0.9 00:01:58.775 Library IPSec_MB found: YES 00:01:58.775 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:01:58.775 Message: drivers/common/qat: Defining dependency "common_qat" 00:01:58.775 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:58.775 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:01:58.775 Library IPSec_MB found: YES 00:01:58.775 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:01:58.775 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:01:58.775 Compiler for C supports 
arguments -std=c11: YES (cached) 00:01:58.775 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:01:58.775 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:01:58.775 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:01:58.775 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:01:58.775 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:01:58.775 Run-time dependency libisal found: NO (tried pkgconfig) 00:01:58.775 Library libisal found: NO 00:01:58.775 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:01:58.775 Compiler for C supports arguments -std=c11: YES (cached) 00:01:58.775 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:01:58.775 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:01:58.775 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:01:58.775 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:01:58.775 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:01:58.775 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:01:58.775 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:01:58.775 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:01:58.775 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:01:58.775 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:01:58.775 Program doxygen found: YES (/usr/bin/doxygen) 00:01:58.775 Configuring doxy-api-html.conf using configuration 00:01:58.775 Configuring doxy-api-man.conf using configuration 00:01:58.775 Program mandb found: YES (/usr/bin/mandb) 00:01:58.775 Program sphinx-build found: NO 00:01:58.775 Configuring rte_build_config.h using configuration 00:01:58.775 Message: 00:01:58.775 ================= 00:01:58.775 Applications Enabled 00:01:58.775 ================= 00:01:58.775 
00:01:58.775 apps:
00:01:58.775
00:01:58.775
00:01:58.775 Message:
00:01:58.775 =================
00:01:58.775 Libraries Enabled
00:01:58.775 =================
00:01:58.775
00:01:58.775 libs:
00:01:58.775 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf,
00:01:58.775 net, meter, ethdev, pci, cmdline, hash, timer, compressdev,
00:01:58.775 cryptodev, dmadev, power, reorder, security, vhost,
00:01:58.775
00:01:58.775 Message:
00:01:58.775 ===============
00:01:58.775 Drivers Enabled
00:01:58.775 ===============
00:01:58.775
00:01:58.775 common:
00:01:58.775 mlx5, qat,
00:01:58.775 bus:
00:01:58.775 auxiliary, pci, vdev,
00:01:58.775 mempool:
00:01:58.775 ring,
00:01:58.775 dma:
00:01:58.775
00:01:58.775 net:
00:01:58.775
00:01:58.775 crypto:
00:01:58.775 ipsec_mb, mlx5,
00:01:58.775 compress:
00:01:58.775 isal, mlx5,
00:01:58.775 vdpa:
00:01:58.775
00:01:58.775
00:01:58.775 Message:
00:01:58.775 =================
00:01:58.775 Content Skipped
00:01:58.775 =================
00:01:58.775
00:01:58.775 apps:
00:01:58.775 dumpcap: explicitly disabled via build config
00:01:58.775 graph: explicitly disabled via build config
00:01:58.775 pdump: explicitly disabled via build config
00:01:58.775 proc-info: explicitly disabled via build config
00:01:58.775 test-acl: explicitly disabled via build config
00:01:58.775 test-bbdev: explicitly disabled via build config
00:01:58.775 test-cmdline: explicitly disabled via build config
00:01:58.775 test-compress-perf: explicitly disabled via build config
00:01:58.775 test-crypto-perf: explicitly disabled via build config
00:01:58.775 test-dma-perf: explicitly disabled via build config
00:01:58.775 test-eventdev: explicitly disabled via build config
00:01:58.775 test-fib: explicitly disabled via build config
00:01:58.775 test-flow-perf: explicitly disabled via build config
00:01:58.775 test-gpudev: explicitly disabled via build config
00:01:58.775 test-mldev: explicitly disabled via build config
00:01:58.775 test-pipeline: explicitly disabled via build config
00:01:58.775 test-pmd: explicitly disabled via build config
00:01:58.775 test-regex: explicitly disabled via build config
00:01:58.775 test-sad: explicitly disabled via build config
00:01:58.775 test-security-perf: explicitly disabled via build config
00:01:58.775
00:01:58.775 libs:
00:01:58.775 argparse: explicitly disabled via build config
00:01:58.775 metrics: explicitly disabled via build config
00:01:58.775 acl: explicitly disabled via build config
00:01:58.775 bbdev: explicitly disabled via build config
00:01:58.775 bitratestats: explicitly disabled via build config
00:01:58.775 bpf: explicitly disabled via build config
00:01:58.775 cfgfile: explicitly disabled via build config
00:01:58.775 distributor: explicitly disabled via build config
00:01:58.775 efd: explicitly disabled via build config
00:01:58.775 eventdev: explicitly disabled via build config
00:01:58.775 dispatcher: explicitly disabled via build config
00:01:58.775 gpudev: explicitly disabled via build config
00:01:58.775 gro: explicitly disabled via build config
00:01:58.775 gso: explicitly disabled via build config
00:01:58.775 ip_frag: explicitly disabled via build config
00:01:58.775 jobstats: explicitly disabled via build config
00:01:58.775 latencystats: explicitly disabled via build config
00:01:58.775 lpm: explicitly disabled via build config
00:01:58.775 member: explicitly disabled via build config
00:01:58.775 pcapng: explicitly disabled via build config
00:01:58.775 rawdev: explicitly disabled via build config
00:01:58.775 regexdev: explicitly disabled via build config
00:01:58.775 mldev: explicitly disabled via build config
00:01:58.775 rib: explicitly disabled via build config
00:01:58.775 sched: explicitly disabled via build config
00:01:58.775 stack: explicitly disabled via build config
00:01:58.775 ipsec: explicitly disabled via build config
00:01:58.775 pdcp: explicitly disabled via build config
00:01:58.775 fib: explicitly disabled via build config
00:01:58.775 port: explicitly disabled via build config 00:01:58.775 pdump: explicitly disabled via build config 00:01:58.775 table: explicitly disabled via build config 00:01:58.775 pipeline: explicitly disabled via build config 00:01:58.775 graph: explicitly disabled via build config 00:01:58.775 node: explicitly disabled via build config 00:01:58.775 00:01:58.775 drivers: 00:01:58.775 common/cpt: not in enabled drivers build config 00:01:58.775 common/dpaax: not in enabled drivers build config 00:01:58.775 common/iavf: not in enabled drivers build config 00:01:58.775 common/idpf: not in enabled drivers build config 00:01:58.775 common/ionic: not in enabled drivers build config 00:01:58.775 common/mvep: not in enabled drivers build config 00:01:58.775 common/octeontx: not in enabled drivers build config 00:01:58.775 bus/cdx: not in enabled drivers build config 00:01:58.775 bus/dpaa: not in enabled drivers build config 00:01:58.775 bus/fslmc: not in enabled drivers build config 00:01:58.775 bus/ifpga: not in enabled drivers build config 00:01:58.775 bus/platform: not in enabled drivers build config 00:01:58.775 bus/uacce: not in enabled drivers build config 00:01:58.775 bus/vmbus: not in enabled drivers build config 00:01:58.775 common/cnxk: not in enabled drivers build config 00:01:58.775 common/nfp: not in enabled drivers build config 00:01:58.775 common/nitrox: not in enabled drivers build config 00:01:58.775 common/sfc_efx: not in enabled drivers build config 00:01:58.775 mempool/bucket: not in enabled drivers build config 00:01:58.775 mempool/cnxk: not in enabled drivers build config 00:01:58.775 mempool/dpaa: not in enabled drivers build config 00:01:58.775 mempool/dpaa2: not in enabled drivers build config 00:01:58.775 mempool/octeontx: not in enabled drivers build config 00:01:58.775 mempool/stack: not in enabled drivers build config 00:01:58.775 dma/cnxk: not in enabled drivers build config 00:01:58.775 dma/dpaa: not in enabled drivers build config 
00:01:58.775 dma/dpaa2: not in enabled drivers build config 00:01:58.775 dma/hisilicon: not in enabled drivers build config 00:01:58.775 dma/idxd: not in enabled drivers build config 00:01:58.775 dma/ioat: not in enabled drivers build config 00:01:58.775 dma/skeleton: not in enabled drivers build config 00:01:58.775 net/af_packet: not in enabled drivers build config 00:01:58.775 net/af_xdp: not in enabled drivers build config 00:01:58.775 net/ark: not in enabled drivers build config 00:01:58.775 net/atlantic: not in enabled drivers build config 00:01:58.775 net/avp: not in enabled drivers build config 00:01:58.775 net/axgbe: not in enabled drivers build config 00:01:58.775 net/bnx2x: not in enabled drivers build config 00:01:58.775 net/bnxt: not in enabled drivers build config 00:01:58.775 net/bonding: not in enabled drivers build config 00:01:58.775 net/cnxk: not in enabled drivers build config 00:01:58.775 net/cpfl: not in enabled drivers build config 00:01:58.775 net/cxgbe: not in enabled drivers build config 00:01:58.775 net/dpaa: not in enabled drivers build config 00:01:58.775 net/dpaa2: not in enabled drivers build config 00:01:58.775 net/e1000: not in enabled drivers build config 00:01:58.775 net/ena: not in enabled drivers build config 00:01:58.775 net/enetc: not in enabled drivers build config 00:01:58.775 net/enetfec: not in enabled drivers build config 00:01:58.775 net/enic: not in enabled drivers build config 00:01:58.775 net/failsafe: not in enabled drivers build config 00:01:58.775 net/fm10k: not in enabled drivers build config 00:01:58.775 net/gve: not in enabled drivers build config 00:01:58.775 net/hinic: not in enabled drivers build config 00:01:58.775 net/hns3: not in enabled drivers build config 00:01:58.775 net/i40e: not in enabled drivers build config 00:01:58.775 net/iavf: not in enabled drivers build config 00:01:58.775 net/ice: not in enabled drivers build config 00:01:58.775 net/idpf: not in enabled drivers build config 00:01:58.775 
net/igc: not in enabled drivers build config 00:01:58.775 net/ionic: not in enabled drivers build config 00:01:58.776 net/ipn3ke: not in enabled drivers build config 00:01:58.776 net/ixgbe: not in enabled drivers build config 00:01:58.776 net/mana: not in enabled drivers build config 00:01:58.776 net/memif: not in enabled drivers build config 00:01:58.776 net/mlx4: not in enabled drivers build config 00:01:58.776 net/mlx5: not in enabled drivers build config 00:01:58.776 net/mvneta: not in enabled drivers build config 00:01:58.776 net/mvpp2: not in enabled drivers build config 00:01:58.776 net/netvsc: not in enabled drivers build config 00:01:58.776 net/nfb: not in enabled drivers build config 00:01:58.776 net/nfp: not in enabled drivers build config 00:01:58.776 net/ngbe: not in enabled drivers build config 00:01:58.776 net/null: not in enabled drivers build config 00:01:58.776 net/octeontx: not in enabled drivers build config 00:01:58.776 net/octeon_ep: not in enabled drivers build config 00:01:58.776 net/pcap: not in enabled drivers build config 00:01:58.776 net/pfe: not in enabled drivers build config 00:01:58.776 net/qede: not in enabled drivers build config 00:01:58.776 net/ring: not in enabled drivers build config 00:01:58.776 net/sfc: not in enabled drivers build config 00:01:58.776 net/softnic: not in enabled drivers build config 00:01:58.776 net/tap: not in enabled drivers build config 00:01:58.776 net/thunderx: not in enabled drivers build config 00:01:58.776 net/txgbe: not in enabled drivers build config 00:01:58.776 net/vdev_netvsc: not in enabled drivers build config 00:01:58.776 net/vhost: not in enabled drivers build config 00:01:58.776 net/virtio: not in enabled drivers build config 00:01:58.776 net/vmxnet3: not in enabled drivers build config 00:01:58.776 raw/*: missing internal dependency, "rawdev" 00:01:58.776 crypto/armv8: not in enabled drivers build config 00:01:58.776 crypto/bcmfs: not in enabled drivers build config 00:01:58.776 
crypto/caam_jr: not in enabled drivers build config 00:01:58.776 crypto/ccp: not in enabled drivers build config 00:01:58.776 crypto/cnxk: not in enabled drivers build config 00:01:58.776 crypto/dpaa_sec: not in enabled drivers build config 00:01:58.776 crypto/dpaa2_sec: not in enabled drivers build config 00:01:58.776 crypto/mvsam: not in enabled drivers build config 00:01:58.776 crypto/nitrox: not in enabled drivers build config 00:01:58.776 crypto/null: not in enabled drivers build config 00:01:58.776 crypto/octeontx: not in enabled drivers build config 00:01:58.776 crypto/openssl: not in enabled drivers build config 00:01:58.776 crypto/scheduler: not in enabled drivers build config 00:01:58.776 crypto/uadk: not in enabled drivers build config 00:01:58.776 crypto/virtio: not in enabled drivers build config 00:01:58.776 compress/nitrox: not in enabled drivers build config 00:01:58.776 compress/octeontx: not in enabled drivers build config 00:01:58.776 compress/zlib: not in enabled drivers build config 00:01:58.776 regex/*: missing internal dependency, "regexdev" 00:01:58.776 ml/*: missing internal dependency, "mldev" 00:01:58.776 vdpa/ifc: not in enabled drivers build config 00:01:58.776 vdpa/mlx5: not in enabled drivers build config 00:01:58.776 vdpa/nfp: not in enabled drivers build config 00:01:58.776 vdpa/sfc: not in enabled drivers build config 00:01:58.776 event/*: missing internal dependency, "eventdev" 00:01:58.776 baseband/*: missing internal dependency, "bbdev" 00:01:58.776 gpu/*: missing internal dependency, "gpudev" 00:01:58.776 00:01:58.776 00:01:59.343 Build targets in project: 115 00:01:59.343 00:01:59.343 DPDK 24.03.0 00:01:59.343 00:01:59.343 User defined options 00:01:59.343 buildtype : debug 00:01:59.343 default_library : shared 00:01:59.343 libdir : lib 00:01:59.343 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:01:59.343 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds 
-I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:01:59.343 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:01:59.343 cpu_instruction_set: native 00:01:59.343 disable_apps : test-sad,test-acl,test-dma-perf,test-pipeline,test-compress-perf,test-fib,test-flow-perf,test-crypto-perf,test-bbdev,test-eventdev,pdump,test-mldev,test-cmdline,graph,test-security-perf,test-pmd,test,proc-info,test-regex,dumpcap,test-gpudev 00:01:59.343 disable_libs : port,sched,rib,node,ipsec,distributor,gro,eventdev,pdcp,acl,member,latencystats,efd,stack,regexdev,rawdev,bpf,metrics,gpudev,pipeline,pdump,table,fib,dispatcher,mldev,gso,cfgfile,bitratestats,ip_frag,graph,lpm,jobstats,argparse,pcapng,bbdev 00:01:59.343 enable_docs : false 00:01:59.343 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:01:59.343 enable_kmods : false 00:01:59.343 max_lcores : 128 00:01:59.343 tests : false 00:01:59.343 00:01:59.343 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:59.923 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:01:59.923 [1/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:00.188 [2/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:00.188 [3/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:00.188 [4/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:00.188 [5/378] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:00.188 [6/378] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:00.188 [7/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:00.188 [8/378] Linking static target lib/librte_kvargs.a 00:02:00.188 [9/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:00.188 [10/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:00.188 [11/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:00.188 [12/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:00.188 [13/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:00.188 [14/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:00.188 [15/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:00.188 [16/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:00.188 [17/378] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:02:00.188 [18/378] Compiling C object lib/librte_log.a.p/log_log.c.o 00:02:00.188 [19/378] Linking static target lib/librte_log.a 00:02:00.452 [20/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:00.452 [21/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:00.452 [22/378] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:00.452 [23/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:00.452 [24/378] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:00.452 [25/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:00.715 [26/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:00.715 [27/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:00.715 [28/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:00.715 [29/378] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:00.715 [30/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:00.715 [31/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:00.715 [32/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:00.715 [33/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:00.715 [34/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:00.715 [35/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:00.715 [36/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:00.715 [37/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:00.715 [38/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:00.715 [39/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:00.715 [40/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:00.715 [41/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:00.715 [42/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:00.715 [43/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:00.715 [44/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:00.715 [45/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:00.715 [46/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:00.715 [47/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:00.715 [48/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:00.715 [49/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:00.715 [50/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:00.715 [51/378] Compiling C object 
lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:00.715 [52/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:00.715 [53/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:00.715 [54/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:00.715 [55/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:00.715 [56/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:00.715 [57/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:00.715 [58/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:00.715 [59/378] Linking static target lib/librte_telemetry.a 00:02:00.715 [60/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:00.715 [61/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:00.715 [62/378] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:00.715 [63/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:00.715 [64/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:00.715 [65/378] Linking static target lib/librte_pci.a 00:02:00.715 [66/378] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:00.715 [67/378] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:00.715 [68/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:00.715 [69/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:00.715 [70/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:00.715 [71/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:00.715 [72/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:00.715 [73/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:00.715 [74/378] Linking static 
target lib/librte_ring.a 00:02:00.715 [75/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:00.715 [76/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:00.715 [77/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:00.715 [78/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:00.715 [79/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:00.715 [80/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:00.715 [81/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:00.715 [82/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:00.715 [83/378] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:02:00.980 [84/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:00.980 [85/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:00.980 [86/378] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:00.980 [87/378] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:00.980 [88/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:00.980 [89/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:00.980 [90/378] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:00.980 [91/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:00.980 [92/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:00.980 [93/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:00.980 [94/378] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:00.980 [95/378] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:00.980 [96/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:00.980 [97/378] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:00.980 [98/378] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:00.980 [99/378] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:00.980 [100/378] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:00.980 [101/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:00.980 [102/378] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:02:00.980 [103/378] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:00.980 [104/378] Linking static target lib/librte_mempool.a 00:02:00.980 [105/378] Linking static target lib/librte_rcu.a 00:02:00.980 [106/378] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:00.980 [107/378] Linking target lib/librte_log.so.24.1 00:02:00.980 [108/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:02:01.247 [109/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:01.247 [110/378] Linking static target lib/librte_mbuf.a 00:02:01.247 [111/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:01.247 [112/378] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:01.247 [113/378] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:01.247 [114/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:01.247 [115/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:01.247 [116/378] Linking static target lib/librte_meter.a 00:02:01.247 [117/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:01.247 [118/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:01.247 [119/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:01.247 [120/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 
00:02:01.247 [121/378] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.247 [122/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:01.247 [123/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:01.247 [124/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:01.247 [125/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:01.247 [126/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:02:01.247 [127/378] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:02:01.247 [128/378] Linking static target lib/librte_cmdline.a 00:02:01.509 [129/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:01.509 [130/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:01.509 [131/378] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.509 [132/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:01.509 [133/378] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:01.509 [134/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:02:01.509 [135/378] Linking target lib/librte_kvargs.so.24.1 00:02:01.510 [136/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:01.510 [137/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:01.510 [138/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:01.510 [139/378] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:01.510 [140/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:01.510 [141/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:01.510 [142/378] Linking static target lib/librte_timer.a 00:02:01.510 
[143/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:01.510 [144/378] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.510 [145/378] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:01.510 [146/378] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:01.510 [147/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:01.510 [148/378] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:01.510 [149/378] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:02:01.510 [150/378] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:01.510 [151/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:01.510 [152/378] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:02:01.510 [153/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:02:01.510 [154/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:02:01.510 [155/378] Linking static target lib/librte_eal.a 00:02:01.510 [156/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:01.510 [157/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:01.510 [158/378] Linking target lib/librte_telemetry.so.24.1 00:02:01.510 [159/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:01.510 [160/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:02:01.510 [161/378] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.510 [162/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:02:01.510 [163/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:01.510 [164/378] Compiling C object 
lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:01.510 [165/378] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:02:01.510 [166/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:01.772 [167/378] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:01.772 [168/378] Linking static target lib/librte_dmadev.a 00:02:01.772 [169/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:01.772 [170/378] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:02:01.772 [171/378] Linking static target lib/librte_net.a 00:02:01.772 [172/378] Linking static target lib/librte_compressdev.a 00:02:01.772 [173/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:01.772 [174/378] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:01.772 [175/378] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:01.772 [176/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:01.772 [177/378] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:01.772 [178/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:01.772 [179/378] Linking static target lib/librte_power.a 00:02:01.772 [180/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:01.772 [181/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:02:01.772 [182/378] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:01.772 [183/378] Linking static target lib/librte_reorder.a 00:02:01.772 [184/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:01.772 [185/378] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:01.772 [186/378] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:02:01.772 [187/378] Compiling C object 
drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:02:01.772 [188/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:01.772 [189/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:02.034 [190/378] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:02.034 [191/378] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:02.034 [192/378] Linking static target lib/librte_security.a 00:02:02.034 [193/378] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:02:02.034 [194/378] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:02.034 [195/378] Linking static target drivers/librte_bus_auxiliary.a 00:02:02.034 [196/378] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:02.034 [197/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:02:02.034 [198/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:02.034 [199/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:02:02.034 [200/378] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:02.034 [201/378] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:02.034 [202/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:02.034 [203/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:02:02.034 [204/378] Linking static target lib/librte_hash.a 00:02:02.034 [205/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:02:02.034 [206/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:02.034 [207/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:02:02.034 [208/378] Compiling C object 
drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:02.296 [209/378] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:02.296 [210/378] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.296 [211/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:02:02.296 [212/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:02:02.296 [213/378] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.296 [214/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:02:02.296 [215/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:02:02.296 [216/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:02:02.296 [217/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:02:02.296 [218/378] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.296 [219/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:02:02.296 [220/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:02:02.296 [221/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:02:02.296 [222/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:02:02.296 [223/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:02:02.296 [224/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:02:02.296 [225/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:02:02.296 [226/378] Compiling C object 
drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:02:02.296 [227/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:02:02.296 [228/378] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.296 [229/378] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:02.296 [230/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:02:02.296 [231/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:02:02.296 [232/378] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:02.296 [233/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:02:02.296 [234/378] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:02.296 [235/378] Linking static target drivers/librte_bus_vdev.a 00:02:02.296 [236/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:02:02.296 [237/378] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.297 [238/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:02.297 [239/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:02:02.297 [240/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:02:02.297 [241/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:02:02.297 [242/378] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.297 [243/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:02:02.297 [244/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:02.556 [245/378] Linking static target 
lib/librte_cryptodev.a 00:02:02.556 [246/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:02:02.556 [247/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:02:02.556 [248/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:02:02.556 [249/378] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.556 [250/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:02:02.556 [251/378] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.556 [252/378] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:02.556 [253/378] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:02.556 [254/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:02:02.556 [255/378] Linking static target drivers/librte_bus_pci.a 00:02:02.556 [256/378] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:02.556 [257/378] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:02.556 [258/378] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:02.556 [259/378] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:02.556 [260/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:02:02.556 [261/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:02:02.556 [262/378] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.556 [263/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:02:02.816 [264/378] Compiling C object 
drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:02:02.816 [265/378] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.816 [266/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:02:02.816 [267/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:02:02.816 [268/378] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.816 [269/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:02:02.816 [270/378] Linking static target drivers/libtmp_rte_common_mlx5.a 00:02:02.816 [271/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:02:02.816 [272/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:02:02.816 [273/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:02:02.816 [274/378] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:02.816 [275/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:02:02.816 [276/378] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:02:02.816 [277/378] Linking static target drivers/libtmp_rte_compress_isal.a 00:02:02.816 [278/378] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:02:02.816 [279/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:02:02.816 [280/378] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:02.816 [281/378] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:02.816 [282/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:02:02.816 
[283/378] Linking static target drivers/librte_mempool_ring.a 00:02:02.816 [284/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:02:02.816 [285/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:02:02.816 [286/378] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:02.816 [287/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:02:02.816 [288/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:02.816 [289/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:02:03.075 [290/378] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:02:03.075 [291/378] Linking static target lib/librte_ethdev.a 00:02:03.075 [292/378] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.075 [293/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:02:03.075 [294/378] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:02:03.075 [295/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:02:03.075 [296/378] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:02:03.075 [297/378] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:02:03.076 [298/378] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:03.076 [299/378] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:02:03.076 [300/378] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:03.076 [301/378] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:03.076 [302/378] Compiling C object 
drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:03.076 [303/378] Compiling C object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:03.076 [304/378] Linking static target drivers/librte_common_mlx5.a 00:02:03.076 [305/378] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:03.076 [306/378] Linking static target drivers/librte_compress_isal.a 00:02:03.076 [307/378] Linking static target drivers/librte_compress_mlx5.a 00:02:03.076 [308/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:03.076 [309/378] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:02:03.335 [310/378] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:03.335 [311/378] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:03.335 [312/378] Linking static target drivers/librte_crypto_mlx5.a 00:02:03.335 [313/378] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:02:03.335 [314/378] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:03.335 [315/378] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:03.335 [316/378] Linking static target drivers/librte_crypto_ipsec_mb.a 00:02:03.594 [317/378] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:03.853 [318/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:02:03.853 [319/378] Linking static target drivers/libtmp_rte_common_qat.a 00:02:04.112 [320/378] Generating drivers/rte_common_qat.pmd.c with a custom command 00:02:04.112 [321/378] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:04.112 [322/378] Compiling C object 
drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:04.112 [323/378] Linking static target drivers/librte_common_qat.a 00:02:04.373 [324/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:04.633 [325/378] Linking static target lib/librte_vhost.a 00:02:04.633 [326/378] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.170 [327/378] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.703 [328/378] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:02:12.989 [329/378] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:14.916 [330/378] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:14.916 [331/378] Linking target lib/librte_eal.so.24.1 00:02:15.174 [332/378] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:02:15.174 [333/378] Linking target lib/librte_pci.so.24.1 00:02:15.174 [334/378] Linking target drivers/librte_bus_auxiliary.so.24.1 00:02:15.174 [335/378] Linking target lib/librte_timer.so.24.1 00:02:15.174 [336/378] Linking target lib/librte_meter.so.24.1 00:02:15.174 [337/378] Linking target lib/librte_ring.so.24.1 00:02:15.174 [338/378] Linking target lib/librte_dmadev.so.24.1 00:02:15.174 [339/378] Linking target drivers/librte_bus_vdev.so.24.1 00:02:15.174 [340/378] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:02:15.174 [341/378] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:02:15.174 [342/378] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:02:15.174 [343/378] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:02:15.174 [344/378] Generating symbol file 
drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:02:15.174 [345/378] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:02:15.174 [346/378] Linking target drivers/librte_bus_pci.so.24.1 00:02:15.432 [347/378] Linking target lib/librte_mempool.so.24.1 00:02:15.432 [348/378] Linking target lib/librte_rcu.so.24.1 00:02:15.432 [349/378] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:02:15.432 [350/378] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:02:15.432 [351/378] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:02:15.432 [352/378] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:02:15.432 [353/378] Linking target drivers/librte_mempool_ring.so.24.1 00:02:15.432 [354/378] Linking target lib/librte_mbuf.so.24.1 00:02:15.691 [355/378] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:02:15.691 [356/378] Linking target lib/librte_reorder.so.24.1 00:02:15.691 [357/378] Linking target lib/librte_compressdev.so.24.1 00:02:15.691 [358/378] Linking target lib/librte_cryptodev.so.24.1 00:02:15.691 [359/378] Linking target lib/librte_net.so.24.1 00:02:15.950 [360/378] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:02:15.950 [361/378] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:02:15.950 [362/378] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:02:15.950 [363/378] Linking target drivers/librte_compress_isal.so.24.1 00:02:15.950 [364/378] Linking target lib/librte_hash.so.24.1 00:02:15.950 [365/378] Linking target lib/librte_security.so.24.1 00:02:15.950 [366/378] Linking target lib/librte_cmdline.so.24.1 00:02:16.209 [367/378] Linking target lib/librte_ethdev.so.24.1 00:02:16.209 [368/378] Generating symbol file 
lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:02:16.209 [369/378] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:02:16.209 [370/378] Linking target drivers/librte_common_mlx5.so.24.1 00:02:16.209 [371/378] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:02:16.468 [372/378] Linking target lib/librte_power.so.24.1 00:02:16.468 [373/378] Linking target lib/librte_vhost.so.24.1 00:02:16.468 [374/378] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:02:16.468 [375/378] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:02:16.468 [376/378] Linking target drivers/librte_common_qat.so.24.1 00:02:16.468 [377/378] Linking target drivers/librte_compress_mlx5.so.24.1 00:02:16.468 [378/378] Linking target drivers/librte_crypto_mlx5.so.24.1 00:02:16.468 INFO: autodetecting backend as ninja 00:02:16.468 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 72 00:02:19.023 CC lib/log/log.o 00:02:19.023 CC lib/log/log_flags.o 00:02:19.023 CC lib/log/log_deprecated.o 00:02:19.023 CC lib/ut/ut.o 00:02:19.023 CC lib/ut_mock/mock.o 00:02:19.023 LIB libspdk_log.a 00:02:19.023 LIB libspdk_ut_mock.a 00:02:19.023 LIB libspdk_ut.a 00:02:19.023 SO libspdk_ut_mock.so.6.0 00:02:19.023 SO libspdk_log.so.7.0 00:02:19.023 SO libspdk_ut.so.2.0 00:02:19.023 SYMLINK libspdk_ut_mock.so 00:02:19.023 SYMLINK libspdk_log.so 00:02:19.023 SYMLINK libspdk_ut.so 00:02:19.282 CC lib/dma/dma.o 00:02:19.282 CC lib/util/base64.o 00:02:19.282 CC lib/util/bit_array.o 00:02:19.282 CC lib/util/cpuset.o 00:02:19.282 CC lib/util/crc32.o 00:02:19.282 CC lib/ioat/ioat.o 00:02:19.282 CC lib/util/crc16.o 00:02:19.282 CC lib/util/crc32c.o 00:02:19.282 CC lib/util/crc32_ieee.o 00:02:19.282 CC lib/util/crc64.o 00:02:19.282 CC lib/util/dif.o 00:02:19.282 CC lib/util/fd.o 00:02:19.282 CC lib/util/file.o 
00:02:19.282 CC lib/util/hexlify.o 00:02:19.282 CC lib/util/iov.o 00:02:19.282 CC lib/util/math.o 00:02:19.282 CC lib/util/pipe.o 00:02:19.282 CC lib/util/string.o 00:02:19.282 CXX lib/trace_parser/trace.o 00:02:19.282 CC lib/util/strerror_tls.o 00:02:19.282 CC lib/util/uuid.o 00:02:19.282 CC lib/util/fd_group.o 00:02:19.282 CC lib/util/xor.o 00:02:19.282 CC lib/util/zipf.o 00:02:19.541 CC lib/vfio_user/host/vfio_user.o 00:02:19.541 CC lib/vfio_user/host/vfio_user_pci.o 00:02:19.541 LIB libspdk_dma.a 00:02:19.541 SO libspdk_dma.so.4.0 00:02:19.541 LIB libspdk_ioat.a 00:02:19.541 SYMLINK libspdk_dma.so 00:02:19.801 SO libspdk_ioat.so.7.0 00:02:19.801 SYMLINK libspdk_ioat.so 00:02:19.801 LIB libspdk_vfio_user.a 00:02:19.801 SO libspdk_vfio_user.so.5.0 00:02:19.801 LIB libspdk_util.a 00:02:19.801 SYMLINK libspdk_vfio_user.so 00:02:20.059 SO libspdk_util.so.9.1 00:02:20.059 LIB libspdk_trace_parser.a 00:02:20.059 SO libspdk_trace_parser.so.5.0 00:02:20.059 SYMLINK libspdk_util.so 00:02:20.318 SYMLINK libspdk_trace_parser.so 00:02:20.576 CC lib/rdma_provider/common.o 00:02:20.576 CC lib/rdma_provider/rdma_provider_verbs.o 00:02:20.576 CC lib/json/json_parse.o 00:02:20.576 CC lib/json/json_util.o 00:02:20.576 CC lib/json/json_write.o 00:02:20.577 CC lib/env_dpdk/env.o 00:02:20.577 CC lib/env_dpdk/memory.o 00:02:20.577 CC lib/vmd/vmd.o 00:02:20.577 CC lib/vmd/led.o 00:02:20.577 CC lib/conf/conf.o 00:02:20.577 CC lib/env_dpdk/init.o 00:02:20.577 CC lib/idxd/idxd.o 00:02:20.577 CC lib/env_dpdk/pci.o 00:02:20.577 CC lib/idxd/idxd_user.o 00:02:20.577 CC lib/idxd/idxd_kernel.o 00:02:20.577 CC lib/env_dpdk/threads.o 00:02:20.577 CC lib/rdma_utils/rdma_utils.o 00:02:20.577 CC lib/env_dpdk/pci_ioat.o 00:02:20.577 CC lib/reduce/reduce.o 00:02:20.577 CC lib/env_dpdk/pci_virtio.o 00:02:20.577 CC lib/env_dpdk/pci_vmd.o 00:02:20.577 CC lib/env_dpdk/pci_idxd.o 00:02:20.577 CC lib/env_dpdk/pci_event.o 00:02:20.577 CC lib/env_dpdk/sigbus_handler.o 00:02:20.577 CC lib/env_dpdk/pci_dpdk.o 
00:02:20.577 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:20.577 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:20.836 LIB libspdk_rdma_provider.a 00:02:20.836 SO libspdk_rdma_provider.so.6.0 00:02:20.836 LIB libspdk_conf.a 00:02:20.836 SO libspdk_conf.so.6.0 00:02:20.836 LIB libspdk_rdma_utils.a 00:02:20.836 SYMLINK libspdk_rdma_provider.so 00:02:20.836 SO libspdk_rdma_utils.so.1.0 00:02:20.836 SYMLINK libspdk_conf.so 00:02:21.095 SYMLINK libspdk_rdma_utils.so 00:02:21.095 LIB libspdk_idxd.a 00:02:21.095 SO libspdk_idxd.so.12.0 00:02:21.095 LIB libspdk_json.a 00:02:21.355 LIB libspdk_reduce.a 00:02:21.355 LIB libspdk_vmd.a 00:02:21.355 SYMLINK libspdk_idxd.so 00:02:21.355 SO libspdk_vmd.so.6.0 00:02:21.355 SO libspdk_json.so.6.0 00:02:21.355 SO libspdk_reduce.so.6.0 00:02:21.355 SYMLINK libspdk_reduce.so 00:02:21.355 SYMLINK libspdk_vmd.so 00:02:21.355 SYMLINK libspdk_json.so 00:02:21.614 CC lib/jsonrpc/jsonrpc_server.o 00:02:21.614 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:21.614 CC lib/jsonrpc/jsonrpc_client.o 00:02:21.614 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:21.874 LIB libspdk_env_dpdk.a 00:02:22.133 LIB libspdk_jsonrpc.a 00:02:22.133 SO libspdk_env_dpdk.so.14.1 00:02:22.133 SO libspdk_jsonrpc.so.6.0 00:02:22.133 SYMLINK libspdk_jsonrpc.so 00:02:22.133 SYMLINK libspdk_env_dpdk.so 00:02:22.392 CC lib/rpc/rpc.o 00:02:22.652 LIB libspdk_rpc.a 00:02:22.912 SO libspdk_rpc.so.6.0 00:02:22.912 SYMLINK libspdk_rpc.so 00:02:23.172 CC lib/trace/trace.o 00:02:23.172 CC lib/trace/trace_flags.o 00:02:23.172 CC lib/trace/trace_rpc.o 00:02:23.172 CC lib/notify/notify.o 00:02:23.172 CC lib/notify/notify_rpc.o 00:02:23.172 CC lib/keyring/keyring_rpc.o 00:02:23.172 CC lib/keyring/keyring.o 00:02:23.505 LIB libspdk_keyring.a 00:02:23.505 LIB libspdk_trace.a 00:02:23.505 SO libspdk_keyring.so.1.0 00:02:23.505 SO libspdk_trace.so.10.0 00:02:23.505 LIB libspdk_notify.a 00:02:23.764 SYMLINK libspdk_keyring.so 00:02:23.764 SO libspdk_notify.so.6.0 00:02:23.764 SYMLINK libspdk_trace.so 
00:02:23.764 SYMLINK libspdk_notify.so 00:02:24.023 CC lib/thread/thread.o 00:02:24.023 CC lib/sock/sock.o 00:02:24.023 CC lib/thread/iobuf.o 00:02:24.023 CC lib/sock/sock_rpc.o 00:02:24.963 LIB libspdk_sock.a 00:02:24.963 SO libspdk_sock.so.10.0 00:02:24.963 SYMLINK libspdk_sock.so 00:02:25.222 LIB libspdk_thread.a 00:02:25.222 SO libspdk_thread.so.10.1 00:02:25.222 SYMLINK libspdk_thread.so 00:02:25.480 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:25.480 CC lib/nvme/nvme_ctrlr.o 00:02:25.480 CC lib/nvme/nvme_ns_cmd.o 00:02:25.480 CC lib/nvme/nvme_fabric.o 00:02:25.480 CC lib/nvme/nvme_ns.o 00:02:25.480 CC lib/nvme/nvme_pcie_common.o 00:02:25.480 CC lib/nvme/nvme_pcie.o 00:02:25.480 CC lib/nvme/nvme_qpair.o 00:02:25.480 CC lib/nvme/nvme.o 00:02:25.480 CC lib/nvme/nvme_quirks.o 00:02:25.480 CC lib/nvme/nvme_transport.o 00:02:25.480 CC lib/nvme/nvme_discovery.o 00:02:25.480 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:25.480 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:25.480 CC lib/nvme/nvme_tcp.o 00:02:25.480 CC lib/nvme/nvme_opal.o 00:02:25.480 CC lib/nvme/nvme_io_msg.o 00:02:25.480 CC lib/nvme/nvme_poll_group.o 00:02:25.480 CC lib/nvme/nvme_zns.o 00:02:25.480 CC lib/nvme/nvme_stubs.o 00:02:25.480 CC lib/nvme/nvme_auth.o 00:02:25.480 CC lib/nvme/nvme_cuse.o 00:02:25.480 CC lib/nvme/nvme_rdma.o 00:02:25.738 CC lib/blob/blobstore.o 00:02:25.738 CC lib/accel/accel.o 00:02:25.738 CC lib/blob/request.o 00:02:25.738 CC lib/blob/blob_bs_dev.o 00:02:25.738 CC lib/accel/accel_rpc.o 00:02:25.738 CC lib/accel/accel_sw.o 00:02:25.738 CC lib/blob/zeroes.o 00:02:25.738 CC lib/init/json_config.o 00:02:25.738 CC lib/init/subsystem.o 00:02:25.738 CC lib/init/rpc.o 00:02:25.738 CC lib/init/subsystem_rpc.o 00:02:25.738 CC lib/virtio/virtio.o 00:02:25.738 CC lib/virtio/virtio_vhost_user.o 00:02:25.738 CC lib/virtio/virtio_vfio_user.o 00:02:25.738 CC lib/virtio/virtio_pci.o 00:02:25.996 LIB libspdk_init.a 00:02:25.996 LIB libspdk_virtio.a 00:02:25.996 SO libspdk_init.so.5.0 00:02:25.996 SO 
libspdk_virtio.so.7.0 00:02:26.254 SYMLINK libspdk_init.so 00:02:26.254 SYMLINK libspdk_virtio.so 00:02:26.512 CC lib/event/app.o 00:02:26.512 CC lib/event/reactor.o 00:02:26.512 CC lib/event/log_rpc.o 00:02:26.512 CC lib/event/app_rpc.o 00:02:26.512 CC lib/event/scheduler_static.o 00:02:26.769 LIB libspdk_accel.a 00:02:26.769 SO libspdk_accel.so.15.1 00:02:27.027 LIB libspdk_event.a 00:02:27.027 SYMLINK libspdk_accel.so 00:02:27.027 SO libspdk_event.so.14.0 00:02:27.027 SYMLINK libspdk_event.so 00:02:27.308 CC lib/bdev/bdev.o 00:02:27.308 CC lib/bdev/bdev_rpc.o 00:02:27.308 CC lib/bdev/bdev_zone.o 00:02:27.308 CC lib/bdev/part.o 00:02:27.308 CC lib/bdev/scsi_nvme.o 00:02:27.565 LIB libspdk_nvme.a 00:02:27.823 SO libspdk_nvme.so.13.1 00:02:28.081 SYMLINK libspdk_nvme.so 00:02:29.016 LIB libspdk_blob.a 00:02:29.274 SO libspdk_blob.so.11.0 00:02:29.274 SYMLINK libspdk_blob.so 00:02:29.532 CC lib/lvol/lvol.o 00:02:29.532 CC lib/blobfs/blobfs.o 00:02:29.532 CC lib/blobfs/tree.o 00:02:30.100 LIB libspdk_bdev.a 00:02:30.100 SO libspdk_bdev.so.15.1 00:02:30.100 SYMLINK libspdk_bdev.so 00:02:30.674 LIB libspdk_blobfs.a 00:02:30.674 CC lib/scsi/dev.o 00:02:30.674 CC lib/scsi/lun.o 00:02:30.674 CC lib/scsi/port.o 00:02:30.674 CC lib/scsi/scsi_bdev.o 00:02:30.674 CC lib/scsi/scsi.o 00:02:30.674 CC lib/scsi/scsi_pr.o 00:02:30.674 CC lib/scsi/scsi_rpc.o 00:02:30.674 CC lib/scsi/task.o 00:02:30.674 CC lib/ublk/ublk.o 00:02:30.674 CC lib/ublk/ublk_rpc.o 00:02:30.674 CC lib/ftl/ftl_core.o 00:02:30.674 CC lib/ftl/ftl_init.o 00:02:30.674 CC lib/ftl/ftl_layout.o 00:02:30.674 CC lib/ftl/ftl_debug.o 00:02:30.674 CC lib/nbd/nbd.o 00:02:30.674 CC lib/ftl/ftl_io.o 00:02:30.674 CC lib/ftl/ftl_sb.o 00:02:30.674 CC lib/nbd/nbd_rpc.o 00:02:30.674 CC lib/ftl/ftl_l2p.o 00:02:30.674 CC lib/ftl/ftl_l2p_flat.o 00:02:30.674 CC lib/nvmf/ctrlr.o 00:02:30.674 CC lib/nvmf/ctrlr_discovery.o 00:02:30.674 CC lib/ftl/ftl_nv_cache.o 00:02:30.674 CC lib/ftl/ftl_band.o 00:02:30.674 CC lib/nvmf/ctrlr_bdev.o 
00:02:30.674 CC lib/ftl/ftl_band_ops.o 00:02:30.674 CC lib/nvmf/nvmf.o 00:02:30.674 CC lib/nvmf/subsystem.o 00:02:30.674 CC lib/ftl/ftl_rq.o 00:02:30.674 CC lib/ftl/ftl_writer.o 00:02:30.674 SO libspdk_blobfs.so.10.0 00:02:30.674 CC lib/nvmf/nvmf_rpc.o 00:02:30.674 CC lib/nvmf/transport.o 00:02:30.674 CC lib/ftl/ftl_reloc.o 00:02:30.674 CC lib/ftl/ftl_p2l.o 00:02:30.674 CC lib/nvmf/stubs.o 00:02:30.674 CC lib/nvmf/tcp.o 00:02:30.674 CC lib/ftl/ftl_l2p_cache.o 00:02:30.674 CC lib/ftl/mngt/ftl_mngt.o 00:02:30.674 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:30.674 CC lib/nvmf/mdns_server.o 00:02:30.674 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:30.674 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:30.674 CC lib/nvmf/rdma.o 00:02:30.674 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:30.674 CC lib/nvmf/auth.o 00:02:30.674 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:30.674 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:30.674 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:30.674 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:30.674 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:30.675 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:30.675 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:30.675 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:30.675 CC lib/ftl/utils/ftl_conf.o 00:02:30.675 CC lib/ftl/utils/ftl_md.o 00:02:30.675 CC lib/ftl/utils/ftl_mempool.o 00:02:30.675 CC lib/ftl/utils/ftl_bitmap.o 00:02:30.675 CC lib/ftl/utils/ftl_property.o 00:02:30.675 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:30.675 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:30.675 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:30.675 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:30.675 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:30.675 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:30.675 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:30.675 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:30.675 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:02:30.675 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:30.675 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:30.675 CC lib/ftl/base/ftl_base_dev.o 00:02:30.675 LIB libspdk_lvol.a 
00:02:30.675 SO libspdk_lvol.so.10.0 00:02:30.675 SYMLINK libspdk_blobfs.so 00:02:30.675 CC lib/ftl/base/ftl_base_bdev.o 00:02:30.933 SYMLINK libspdk_lvol.so 00:02:30.933 CC lib/ftl/ftl_trace.o 00:02:31.192 LIB libspdk_scsi.a 00:02:31.192 LIB libspdk_nbd.a 00:02:31.450 SO libspdk_scsi.so.9.0 00:02:31.450 SO libspdk_nbd.so.7.0 00:02:31.450 SYMLINK libspdk_scsi.so 00:02:31.450 SYMLINK libspdk_nbd.so 00:02:31.450 LIB libspdk_ublk.a 00:02:31.450 SO libspdk_ublk.so.3.0 00:02:31.708 SYMLINK libspdk_ublk.so 00:02:31.708 LIB libspdk_ftl.a 00:02:31.708 CC lib/iscsi/conn.o 00:02:31.708 CC lib/iscsi/init_grp.o 00:02:31.709 CC lib/iscsi/iscsi.o 00:02:31.709 CC lib/iscsi/md5.o 00:02:31.709 CC lib/iscsi/param.o 00:02:31.709 CC lib/iscsi/portal_grp.o 00:02:31.709 CC lib/iscsi/tgt_node.o 00:02:31.709 CC lib/iscsi/iscsi_subsystem.o 00:02:31.709 CC lib/iscsi/iscsi_rpc.o 00:02:31.709 CC lib/iscsi/task.o 00:02:31.709 CC lib/vhost/vhost.o 00:02:31.709 CC lib/vhost/vhost_rpc.o 00:02:31.709 CC lib/vhost/vhost_scsi.o 00:02:31.709 CC lib/vhost/vhost_blk.o 00:02:31.709 CC lib/vhost/rte_vhost_user.o 00:02:31.967 SO libspdk_ftl.so.9.0 00:02:32.535 SYMLINK libspdk_ftl.so 00:02:32.794 LIB libspdk_iscsi.a 00:02:32.794 LIB libspdk_nvmf.a 00:02:32.794 SO libspdk_iscsi.so.8.0 00:02:33.054 LIB libspdk_vhost.a 00:02:33.054 SO libspdk_nvmf.so.19.0 00:02:33.054 SO libspdk_vhost.so.8.0 00:02:33.054 SYMLINK libspdk_vhost.so 00:02:33.314 SYMLINK libspdk_nvmf.so 00:02:33.314 SYMLINK libspdk_iscsi.so 00:02:33.884 CC module/env_dpdk/env_dpdk_rpc.o 00:02:34.143 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:34.143 CC module/keyring/file/keyring.o 00:02:34.143 CC module/keyring/file/keyring_rpc.o 00:02:34.143 CC module/accel/error/accel_error.o 00:02:34.143 CC module/accel/error/accel_error_rpc.o 00:02:34.143 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:02:34.143 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:02:34.143 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 
00:02:34.143 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:02:34.143 CC module/blob/bdev/blob_bdev.o 00:02:34.143 CC module/sock/posix/posix.o 00:02:34.143 CC module/accel/ioat/accel_ioat.o 00:02:34.143 CC module/accel/ioat/accel_ioat_rpc.o 00:02:34.143 CC module/scheduler/gscheduler/gscheduler.o 00:02:34.143 LIB libspdk_env_dpdk_rpc.a 00:02:34.143 CC module/keyring/linux/keyring.o 00:02:34.143 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:34.143 CC module/accel/dsa/accel_dsa.o 00:02:34.143 CC module/accel/dsa/accel_dsa_rpc.o 00:02:34.143 CC module/keyring/linux/keyring_rpc.o 00:02:34.143 CC module/accel/iaa/accel_iaa.o 00:02:34.143 CC module/accel/iaa/accel_iaa_rpc.o 00:02:34.143 SO libspdk_env_dpdk_rpc.so.6.0 00:02:34.143 LIB libspdk_keyring_file.a 00:02:34.143 SYMLINK libspdk_env_dpdk_rpc.so 00:02:34.143 LIB libspdk_scheduler_gscheduler.a 00:02:34.143 LIB libspdk_keyring_linux.a 00:02:34.143 SO libspdk_keyring_file.so.1.0 00:02:34.402 LIB libspdk_scheduler_dpdk_governor.a 00:02:34.402 LIB libspdk_accel_error.a 00:02:34.402 LIB libspdk_scheduler_dynamic.a 00:02:34.402 SO libspdk_scheduler_gscheduler.so.4.0 00:02:34.402 SO libspdk_keyring_linux.so.1.0 00:02:34.402 SO libspdk_scheduler_dpdk_governor.so.4.0 00:02:34.402 SO libspdk_scheduler_dynamic.so.4.0 00:02:34.402 SO libspdk_accel_error.so.2.0 00:02:34.402 SYMLINK libspdk_keyring_file.so 00:02:34.402 LIB libspdk_accel_ioat.a 00:02:34.402 LIB libspdk_accel_dsa.a 00:02:34.402 SYMLINK libspdk_scheduler_gscheduler.so 00:02:34.402 LIB libspdk_blob_bdev.a 00:02:34.402 LIB libspdk_accel_iaa.a 00:02:34.402 SYMLINK libspdk_keyring_linux.so 00:02:34.402 SYMLINK libspdk_scheduler_dpdk_governor.so 00:02:34.402 SYMLINK libspdk_scheduler_dynamic.so 00:02:34.402 SO libspdk_blob_bdev.so.11.0 00:02:34.402 SO libspdk_accel_ioat.so.6.0 00:02:34.402 SYMLINK libspdk_accel_error.so 00:02:34.402 SO libspdk_accel_dsa.so.5.0 00:02:34.402 SO libspdk_accel_iaa.so.3.0 00:02:34.402 SYMLINK libspdk_blob_bdev.so 
00:02:34.402 SYMLINK libspdk_accel_dsa.so 00:02:34.402 SYMLINK libspdk_accel_ioat.so 00:02:34.402 SYMLINK libspdk_accel_iaa.so 00:02:34.969 LIB libspdk_sock_posix.a 00:02:34.969 SO libspdk_sock_posix.so.6.0 00:02:34.969 CC module/bdev/aio/bdev_aio.o 00:02:34.969 CC module/bdev/error/vbdev_error_rpc.o 00:02:34.969 CC module/bdev/aio/bdev_aio_rpc.o 00:02:34.969 CC module/bdev/error/vbdev_error.o 00:02:34.969 CC module/bdev/malloc/bdev_malloc.o 00:02:34.969 CC module/bdev/delay/vbdev_delay.o 00:02:34.969 CC module/bdev/compress/vbdev_compress.o 00:02:34.969 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:34.969 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:34.969 CC module/bdev/compress/vbdev_compress_rpc.o 00:02:34.969 CC module/bdev/passthru/vbdev_passthru.o 00:02:34.969 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:34.969 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:34.970 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:34.970 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:34.970 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:34.970 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:34.970 CC module/bdev/lvol/vbdev_lvol.o 00:02:34.970 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:34.970 CC module/bdev/null/bdev_null.o 00:02:34.970 CC module/bdev/gpt/gpt.o 00:02:34.970 CC module/bdev/iscsi/bdev_iscsi.o 00:02:34.970 CC module/bdev/gpt/vbdev_gpt.o 00:02:34.970 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:34.970 CC module/bdev/null/bdev_null_rpc.o 00:02:34.970 CC module/bdev/split/vbdev_split.o 00:02:34.970 CC module/bdev/raid/bdev_raid_rpc.o 00:02:34.970 CC module/bdev/raid/bdev_raid.o 00:02:34.970 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:34.970 CC module/bdev/ftl/bdev_ftl.o 00:02:34.970 CC module/bdev/split/vbdev_split_rpc.o 00:02:34.970 CC module/bdev/raid/bdev_raid_sb.o 00:02:34.970 CC module/bdev/raid/raid0.o 00:02:34.970 CC module/bdev/raid/raid1.o 00:02:34.970 CC module/bdev/raid/concat.o 00:02:34.970 CC module/blobfs/bdev/blobfs_bdev.o 00:02:34.970 CC 
module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:34.970 CC module/bdev/nvme/bdev_nvme.o 00:02:34.970 CC module/bdev/nvme/nvme_rpc.o 00:02:34.970 CC module/bdev/nvme/bdev_mdns_client.o 00:02:34.970 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:34.970 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:34.970 CC module/bdev/nvme/vbdev_opal.o 00:02:34.970 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:34.970 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:02:34.970 CC module/bdev/crypto/vbdev_crypto.o 00:02:34.970 SYMLINK libspdk_sock_posix.so 00:02:35.228 LIB libspdk_accel_dpdk_compressdev.a 00:02:35.228 SO libspdk_accel_dpdk_compressdev.so.3.0 00:02:35.228 LIB libspdk_bdev_split.a 00:02:35.228 LIB libspdk_blobfs_bdev.a 00:02:35.228 LIB libspdk_bdev_gpt.a 00:02:35.228 SO libspdk_bdev_split.so.6.0 00:02:35.228 SO libspdk_blobfs_bdev.so.6.0 00:02:35.228 SYMLINK libspdk_accel_dpdk_compressdev.so 00:02:35.228 LIB libspdk_bdev_error.a 00:02:35.228 LIB libspdk_bdev_null.a 00:02:35.228 SO libspdk_bdev_gpt.so.6.0 00:02:35.228 LIB libspdk_bdev_aio.a 00:02:35.486 SO libspdk_bdev_null.so.6.0 00:02:35.486 SO libspdk_bdev_error.so.6.0 00:02:35.486 SYMLINK libspdk_bdev_split.so 00:02:35.486 LIB libspdk_bdev_zone_block.a 00:02:35.486 SO libspdk_bdev_aio.so.6.0 00:02:35.486 SYMLINK libspdk_blobfs_bdev.so 00:02:35.486 SYMLINK libspdk_bdev_gpt.so 00:02:35.486 SO libspdk_bdev_zone_block.so.6.0 00:02:35.486 SYMLINK libspdk_bdev_error.so 00:02:35.486 LIB libspdk_bdev_delay.a 00:02:35.486 SYMLINK libspdk_bdev_null.so 00:02:35.486 SO libspdk_bdev_delay.so.6.0 00:02:35.486 SYMLINK libspdk_bdev_aio.so 00:02:35.486 SYMLINK libspdk_bdev_zone_block.so 00:02:35.486 LIB libspdk_bdev_ftl.a 00:02:35.486 LIB libspdk_accel_dpdk_cryptodev.a 00:02:35.486 LIB libspdk_bdev_crypto.a 00:02:35.486 LIB libspdk_bdev_passthru.a 00:02:35.486 LIB libspdk_bdev_compress.a 00:02:35.486 SO libspdk_bdev_ftl.so.6.0 00:02:35.486 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:02:35.486 LIB libspdk_bdev_malloc.a 00:02:35.486 SO 
libspdk_bdev_crypto.so.6.0 00:02:35.486 SO libspdk_bdev_passthru.so.6.0 00:02:35.486 SYMLINK libspdk_bdev_delay.so 00:02:35.486 SO libspdk_bdev_compress.so.6.0 00:02:35.486 LIB libspdk_bdev_lvol.a 00:02:35.486 SO libspdk_bdev_malloc.so.6.0 00:02:35.486 SYMLINK libspdk_bdev_crypto.so 00:02:35.744 SYMLINK libspdk_bdev_ftl.so 00:02:35.744 SYMLINK libspdk_bdev_passthru.so 00:02:35.744 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:02:35.744 SO libspdk_bdev_lvol.so.6.0 00:02:35.744 SYMLINK libspdk_bdev_compress.so 00:02:35.744 SYMLINK libspdk_bdev_malloc.so 00:02:35.744 LIB libspdk_bdev_virtio.a 00:02:35.744 SYMLINK libspdk_bdev_lvol.so 00:02:35.744 SO libspdk_bdev_virtio.so.6.0 00:02:35.744 SYMLINK libspdk_bdev_virtio.so 00:02:35.744 LIB libspdk_bdev_iscsi.a 00:02:35.744 SO libspdk_bdev_iscsi.so.6.0 00:02:36.003 SYMLINK libspdk_bdev_iscsi.so 00:02:36.003 LIB libspdk_bdev_raid.a 00:02:36.262 SO libspdk_bdev_raid.so.6.0 00:02:36.262 SYMLINK libspdk_bdev_raid.so 00:02:37.641 LIB libspdk_bdev_nvme.a 00:02:37.641 SO libspdk_bdev_nvme.so.7.0 00:02:37.994 SYMLINK libspdk_bdev_nvme.so 00:02:38.562 CC module/event/subsystems/iobuf/iobuf.o 00:02:38.562 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:38.562 CC module/event/subsystems/sock/sock.o 00:02:38.562 CC module/event/subsystems/vmd/vmd.o 00:02:38.562 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:38.562 CC module/event/subsystems/keyring/keyring.o 00:02:38.562 CC module/event/subsystems/scheduler/scheduler.o 00:02:38.562 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:38.820 LIB libspdk_event_iobuf.a 00:02:38.820 LIB libspdk_event_sock.a 00:02:38.820 LIB libspdk_event_vmd.a 00:02:38.820 LIB libspdk_event_vhost_blk.a 00:02:38.820 LIB libspdk_event_keyring.a 00:02:38.820 SO libspdk_event_sock.so.5.0 00:02:38.820 SO libspdk_event_vmd.so.6.0 00:02:38.820 SO libspdk_event_iobuf.so.3.0 00:02:38.820 SO libspdk_event_keyring.so.1.0 00:02:38.820 SO libspdk_event_vhost_blk.so.3.0 00:02:38.820 SYMLINK libspdk_event_sock.so 
00:02:39.079 SYMLINK libspdk_event_iobuf.so 00:02:39.079 SYMLINK libspdk_event_vhost_blk.so 00:02:39.079 SYMLINK libspdk_event_vmd.so 00:02:39.079 LIB libspdk_event_scheduler.a 00:02:39.079 SYMLINK libspdk_event_keyring.so 00:02:39.079 SO libspdk_event_scheduler.so.4.0 00:02:39.079 SYMLINK libspdk_event_scheduler.so 00:02:39.339 CC module/event/subsystems/accel/accel.o 00:02:39.598 LIB libspdk_event_accel.a 00:02:39.598 SO libspdk_event_accel.so.6.0 00:02:39.598 SYMLINK libspdk_event_accel.so 00:02:39.858 CC module/event/subsystems/bdev/bdev.o 00:02:40.117 LIB libspdk_event_bdev.a 00:02:40.117 SO libspdk_event_bdev.so.6.0 00:02:40.376 SYMLINK libspdk_event_bdev.so 00:02:40.636 CC module/event/subsystems/nbd/nbd.o 00:02:40.636 CC module/event/subsystems/ublk/ublk.o 00:02:40.636 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:40.636 CC module/event/subsystems/scsi/scsi.o 00:02:40.636 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:40.895 LIB libspdk_event_nbd.a 00:02:40.895 LIB libspdk_event_ublk.a 00:02:40.895 SO libspdk_event_nbd.so.6.0 00:02:40.895 SO libspdk_event_ublk.so.3.0 00:02:40.895 LIB libspdk_event_nvmf.a 00:02:40.895 SYMLINK libspdk_event_nbd.so 00:02:40.895 SYMLINK libspdk_event_ublk.so 00:02:40.895 SO libspdk_event_nvmf.so.6.0 00:02:40.895 LIB libspdk_event_scsi.a 00:02:41.155 SO libspdk_event_scsi.so.6.0 00:02:41.155 SYMLINK libspdk_event_nvmf.so 00:02:41.155 SYMLINK libspdk_event_scsi.so 00:02:41.414 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:41.414 CC module/event/subsystems/iscsi/iscsi.o 00:02:41.673 LIB libspdk_event_vhost_scsi.a 00:02:41.673 SO libspdk_event_vhost_scsi.so.3.0 00:02:41.673 LIB libspdk_event_iscsi.a 00:02:41.673 SO libspdk_event_iscsi.so.6.0 00:02:41.673 SYMLINK libspdk_event_vhost_scsi.so 00:02:41.933 SYMLINK libspdk_event_iscsi.so 00:02:41.933 SO libspdk.so.6.0 00:02:41.933 SYMLINK libspdk.so 00:02:42.508 CC test/rpc_client/rpc_client_test.o 00:02:42.508 CC app/spdk_nvme_perf/perf.o 00:02:42.508 TEST_HEADER 
include/spdk/accel.h 00:02:42.508 TEST_HEADER include/spdk/accel_module.h 00:02:42.508 TEST_HEADER include/spdk/barrier.h 00:02:42.508 TEST_HEADER include/spdk/assert.h 00:02:42.508 TEST_HEADER include/spdk/base64.h 00:02:42.508 CC app/spdk_top/spdk_top.o 00:02:42.508 TEST_HEADER include/spdk/bdev_module.h 00:02:42.508 TEST_HEADER include/spdk/bdev.h 00:02:42.508 TEST_HEADER include/spdk/bdev_zone.h 00:02:42.508 TEST_HEADER include/spdk/bit_array.h 00:02:42.508 CXX app/trace/trace.o 00:02:42.508 TEST_HEADER include/spdk/bit_pool.h 00:02:42.508 CC app/spdk_nvme_discover/discovery_aer.o 00:02:42.508 TEST_HEADER include/spdk/blob_bdev.h 00:02:42.508 CC app/trace_record/trace_record.o 00:02:42.508 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:42.508 CC app/spdk_nvme_identify/identify.o 00:02:42.508 TEST_HEADER include/spdk/blobfs.h 00:02:42.508 TEST_HEADER include/spdk/blob.h 00:02:42.508 TEST_HEADER include/spdk/conf.h 00:02:42.508 TEST_HEADER include/spdk/config.h 00:02:42.508 CC app/spdk_lspci/spdk_lspci.o 00:02:42.508 TEST_HEADER include/spdk/cpuset.h 00:02:42.508 TEST_HEADER include/spdk/crc32.h 00:02:42.508 TEST_HEADER include/spdk/crc16.h 00:02:42.508 TEST_HEADER include/spdk/crc64.h 00:02:42.508 TEST_HEADER include/spdk/dma.h 00:02:42.508 TEST_HEADER include/spdk/endian.h 00:02:42.508 TEST_HEADER include/spdk/env_dpdk.h 00:02:42.508 TEST_HEADER include/spdk/dif.h 00:02:42.508 TEST_HEADER include/spdk/env.h 00:02:42.508 TEST_HEADER include/spdk/event.h 00:02:42.508 TEST_HEADER include/spdk/fd_group.h 00:02:42.508 TEST_HEADER include/spdk/fd.h 00:02:42.508 TEST_HEADER include/spdk/file.h 00:02:42.508 TEST_HEADER include/spdk/ftl.h 00:02:42.508 TEST_HEADER include/spdk/gpt_spec.h 00:02:42.508 TEST_HEADER include/spdk/hexlify.h 00:02:42.508 TEST_HEADER include/spdk/histogram_data.h 00:02:42.508 TEST_HEADER include/spdk/idxd.h 00:02:42.508 TEST_HEADER include/spdk/idxd_spec.h 00:02:42.508 TEST_HEADER include/spdk/init.h 00:02:42.508 TEST_HEADER include/spdk/ioat.h 
00:02:42.508 TEST_HEADER include/spdk/ioat_spec.h 00:02:42.508 TEST_HEADER include/spdk/iscsi_spec.h 00:02:42.508 TEST_HEADER include/spdk/json.h 00:02:42.508 TEST_HEADER include/spdk/jsonrpc.h 00:02:42.508 TEST_HEADER include/spdk/keyring.h 00:02:42.508 TEST_HEADER include/spdk/keyring_module.h 00:02:42.508 TEST_HEADER include/spdk/likely.h 00:02:42.508 TEST_HEADER include/spdk/log.h 00:02:42.508 TEST_HEADER include/spdk/lvol.h 00:02:42.508 TEST_HEADER include/spdk/memory.h 00:02:42.508 TEST_HEADER include/spdk/mmio.h 00:02:42.508 TEST_HEADER include/spdk/nbd.h 00:02:42.508 TEST_HEADER include/spdk/notify.h 00:02:42.508 TEST_HEADER include/spdk/nvme.h 00:02:42.508 TEST_HEADER include/spdk/nvme_intel.h 00:02:42.508 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:42.508 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:42.508 CC app/spdk_dd/spdk_dd.o 00:02:42.508 TEST_HEADER include/spdk/nvme_spec.h 00:02:42.508 TEST_HEADER include/spdk/nvme_zns.h 00:02:42.508 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:42.508 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:42.508 TEST_HEADER include/spdk/nvmf.h 00:02:42.508 TEST_HEADER include/spdk/nvmf_spec.h 00:02:42.508 TEST_HEADER include/spdk/nvmf_transport.h 00:02:42.508 TEST_HEADER include/spdk/opal.h 00:02:42.508 TEST_HEADER include/spdk/opal_spec.h 00:02:42.508 TEST_HEADER include/spdk/pci_ids.h 00:02:42.508 TEST_HEADER include/spdk/pipe.h 00:02:42.508 TEST_HEADER include/spdk/reduce.h 00:02:42.508 CC app/nvmf_tgt/nvmf_main.o 00:02:42.508 TEST_HEADER include/spdk/queue.h 00:02:42.508 TEST_HEADER include/spdk/rpc.h 00:02:42.508 TEST_HEADER include/spdk/scheduler.h 00:02:42.508 TEST_HEADER include/spdk/scsi.h 00:02:42.508 TEST_HEADER include/spdk/scsi_spec.h 00:02:42.508 TEST_HEADER include/spdk/sock.h 00:02:42.508 TEST_HEADER include/spdk/string.h 00:02:42.508 TEST_HEADER include/spdk/stdinc.h 00:02:42.508 TEST_HEADER include/spdk/thread.h 00:02:42.508 TEST_HEADER include/spdk/trace.h 00:02:42.508 TEST_HEADER 
include/spdk/trace_parser.h 00:02:42.508 TEST_HEADER include/spdk/tree.h 00:02:42.508 TEST_HEADER include/spdk/ublk.h 00:02:42.508 TEST_HEADER include/spdk/util.h 00:02:42.508 TEST_HEADER include/spdk/uuid.h 00:02:42.508 TEST_HEADER include/spdk/version.h 00:02:42.508 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:42.508 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:42.508 TEST_HEADER include/spdk/vhost.h 00:02:42.508 TEST_HEADER include/spdk/vmd.h 00:02:42.508 TEST_HEADER include/spdk/xor.h 00:02:42.508 TEST_HEADER include/spdk/zipf.h 00:02:42.508 CXX test/cpp_headers/accel.o 00:02:42.508 CXX test/cpp_headers/accel_module.o 00:02:42.508 CXX test/cpp_headers/assert.o 00:02:42.508 CXX test/cpp_headers/barrier.o 00:02:42.508 CXX test/cpp_headers/base64.o 00:02:42.508 CXX test/cpp_headers/bdev.o 00:02:42.508 CXX test/cpp_headers/bdev_module.o 00:02:42.508 CXX test/cpp_headers/bdev_zone.o 00:02:42.508 CXX test/cpp_headers/bit_array.o 00:02:42.508 CXX test/cpp_headers/bit_pool.o 00:02:42.508 CXX test/cpp_headers/blob_bdev.o 00:02:42.508 CXX test/cpp_headers/blobfs.o 00:02:42.508 CXX test/cpp_headers/blobfs_bdev.o 00:02:42.508 CXX test/cpp_headers/blob.o 00:02:42.508 CXX test/cpp_headers/conf.o 00:02:42.508 CXX test/cpp_headers/cpuset.o 00:02:42.508 CXX test/cpp_headers/crc16.o 00:02:42.509 CXX test/cpp_headers/config.o 00:02:42.509 CXX test/cpp_headers/crc32.o 00:02:42.509 CXX test/cpp_headers/crc64.o 00:02:42.509 CXX test/cpp_headers/dif.o 00:02:42.509 CXX test/cpp_headers/dma.o 00:02:42.509 CXX test/cpp_headers/endian.o 00:02:42.509 CXX test/cpp_headers/env_dpdk.o 00:02:42.509 CXX test/cpp_headers/env.o 00:02:42.509 CXX test/cpp_headers/event.o 00:02:42.509 CXX test/cpp_headers/fd_group.o 00:02:42.509 CXX test/cpp_headers/fd.o 00:02:42.509 CXX test/cpp_headers/file.o 00:02:42.509 CXX test/cpp_headers/ftl.o 00:02:42.509 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:42.509 CXX test/cpp_headers/histogram_data.o 00:02:42.509 CXX test/cpp_headers/gpt_spec.o 
00:02:42.509 CXX test/cpp_headers/idxd_spec.o 00:02:42.509 CXX test/cpp_headers/hexlify.o 00:02:42.509 CXX test/cpp_headers/idxd.o 00:02:42.509 CXX test/cpp_headers/init.o 00:02:42.509 CXX test/cpp_headers/ioat_spec.o 00:02:42.509 CXX test/cpp_headers/ioat.o 00:02:42.509 CXX test/cpp_headers/iscsi_spec.o 00:02:42.509 CXX test/cpp_headers/jsonrpc.o 00:02:42.509 CXX test/cpp_headers/json.o 00:02:42.509 CXX test/cpp_headers/keyring.o 00:02:42.509 CC app/spdk_tgt/spdk_tgt.o 00:02:42.509 CC test/env/vtophys/vtophys.o 00:02:42.509 CC test/env/memory/memory_ut.o 00:02:42.509 CC examples/util/zipf/zipf.o 00:02:42.509 CXX test/cpp_headers/keyring_module.o 00:02:42.509 CC test/thread/poller_perf/poller_perf.o 00:02:42.509 CC examples/ioat/perf/perf.o 00:02:42.509 CC test/app/stub/stub.o 00:02:42.509 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:42.509 CC test/app/histogram_perf/histogram_perf.o 00:02:42.509 CC app/iscsi_tgt/iscsi_tgt.o 00:02:42.509 CC examples/ioat/verify/verify.o 00:02:42.509 CC test/app/jsoncat/jsoncat.o 00:02:42.509 CC test/env/pci/pci_ut.o 00:02:42.509 CC app/fio/nvme/fio_plugin.o 00:02:42.773 CC test/dma/test_dma/test_dma.o 00:02:42.774 CC test/app/bdev_svc/bdev_svc.o 00:02:42.774 LINK spdk_lspci 00:02:42.774 CC app/fio/bdev/fio_plugin.o 00:02:42.774 LINK rpc_client_test 00:02:43.037 LINK spdk_trace_record 00:02:43.037 CC test/env/mem_callbacks/mem_callbacks.o 00:02:43.037 LINK spdk_nvme_discover 00:02:43.037 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:43.037 CXX test/cpp_headers/likely.o 00:02:43.037 LINK interrupt_tgt 00:02:43.037 CXX test/cpp_headers/log.o 00:02:43.037 CXX test/cpp_headers/lvol.o 00:02:43.037 CXX test/cpp_headers/memory.o 00:02:43.037 LINK poller_perf 00:02:43.037 CXX test/cpp_headers/mmio.o 00:02:43.037 CXX test/cpp_headers/nbd.o 00:02:43.037 CXX test/cpp_headers/notify.o 00:02:43.037 CXX test/cpp_headers/nvme.o 00:02:43.037 LINK histogram_perf 00:02:43.037 LINK stub 00:02:43.037 CXX test/cpp_headers/nvme_intel.o 
00:02:43.037 CXX test/cpp_headers/nvme_ocssd.o 00:02:43.037 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:43.037 CXX test/cpp_headers/nvme_spec.o 00:02:43.037 CXX test/cpp_headers/nvme_zns.o 00:02:43.037 LINK jsoncat 00:02:43.037 CXX test/cpp_headers/nvmf_cmd.o 00:02:43.037 CXX test/cpp_headers/nvmf.o 00:02:43.037 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:43.037 CXX test/cpp_headers/nvmf_spec.o 00:02:43.037 CXX test/cpp_headers/nvmf_transport.o 00:02:43.037 LINK spdk_tgt 00:02:43.037 CXX test/cpp_headers/opal.o 00:02:43.037 LINK vtophys 00:02:43.037 CXX test/cpp_headers/opal_spec.o 00:02:43.037 CXX test/cpp_headers/pci_ids.o 00:02:43.037 LINK nvmf_tgt 00:02:43.037 CXX test/cpp_headers/pipe.o 00:02:43.037 CXX test/cpp_headers/queue.o 00:02:43.037 CXX test/cpp_headers/reduce.o 00:02:43.037 LINK env_dpdk_post_init 00:02:43.037 CXX test/cpp_headers/rpc.o 00:02:43.305 CXX test/cpp_headers/scheduler.o 00:02:43.305 CXX test/cpp_headers/scsi.o 00:02:43.305 CXX test/cpp_headers/scsi_spec.o 00:02:43.305 CXX test/cpp_headers/sock.o 00:02:43.305 CXX test/cpp_headers/stdinc.o 00:02:43.305 CXX test/cpp_headers/string.o 00:02:43.305 CXX test/cpp_headers/trace.o 00:02:43.305 CXX test/cpp_headers/trace_parser.o 00:02:43.305 CXX test/cpp_headers/thread.o 00:02:43.305 CXX test/cpp_headers/tree.o 00:02:43.305 LINK iscsi_tgt 00:02:43.305 LINK ioat_perf 00:02:43.305 LINK verify 00:02:43.305 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:43.305 CXX test/cpp_headers/ublk.o 00:02:43.305 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:43.305 CXX test/cpp_headers/util.o 00:02:43.305 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:43.305 CXX test/cpp_headers/uuid.o 00:02:43.305 LINK bdev_svc 00:02:43.305 CXX test/cpp_headers/version.o 00:02:43.305 CXX test/cpp_headers/vfio_user_pci.o 00:02:43.305 CXX test/cpp_headers/vfio_user_spec.o 00:02:43.305 CXX test/cpp_headers/vhost.o 00:02:43.305 LINK spdk_dd 00:02:43.305 CXX test/cpp_headers/vmd.o 00:02:43.305 CXX test/cpp_headers/xor.o 00:02:43.305 
CXX test/cpp_headers/zipf.o 00:02:43.564 LINK zipf 00:02:43.564 LINK spdk_trace 00:02:43.564 LINK pci_ut 00:02:43.564 LINK test_dma 00:02:43.823 LINK spdk_nvme_identify 00:02:43.823 CC test/event/reactor_perf/reactor_perf.o 00:02:43.823 CC test/event/app_repeat/app_repeat.o 00:02:43.823 CC test/event/event_perf/event_perf.o 00:02:43.823 LINK mem_callbacks 00:02:43.823 CC test/event/reactor/reactor.o 00:02:43.823 LINK spdk_bdev 00:02:43.823 LINK nvme_fuzz 00:02:43.823 CC test/event/scheduler/scheduler.o 00:02:43.823 LINK spdk_nvme_perf 00:02:43.823 LINK spdk_nvme 00:02:43.823 LINK spdk_top 00:02:43.823 LINK reactor_perf 00:02:44.082 LINK app_repeat 00:02:44.082 LINK event_perf 00:02:44.082 CC app/vhost/vhost.o 00:02:44.082 LINK reactor 00:02:44.082 LINK vhost_fuzz 00:02:44.082 CC examples/idxd/perf/perf.o 00:02:44.082 CC examples/vmd/lsvmd/lsvmd.o 00:02:44.082 CC examples/vmd/led/led.o 00:02:44.082 CC examples/sock/hello_world/hello_sock.o 00:02:44.082 LINK scheduler 00:02:44.082 CC examples/thread/thread/thread_ex.o 00:02:44.342 CC test/nvme/reset/reset.o 00:02:44.342 CC test/nvme/compliance/nvme_compliance.o 00:02:44.342 CC test/nvme/startup/startup.o 00:02:44.342 CC test/nvme/fdp/fdp.o 00:02:44.342 CC test/nvme/e2edp/nvme_dp.o 00:02:44.342 CC test/nvme/connect_stress/connect_stress.o 00:02:44.342 CC test/nvme/sgl/sgl.o 00:02:44.342 LINK led 00:02:44.342 CC test/nvme/aer/aer.o 00:02:44.342 CC test/nvme/boot_partition/boot_partition.o 00:02:44.342 CC test/nvme/fused_ordering/fused_ordering.o 00:02:44.342 CC test/nvme/overhead/overhead.o 00:02:44.342 CC test/nvme/err_injection/err_injection.o 00:02:44.342 CC test/nvme/reserve/reserve.o 00:02:44.342 CC test/nvme/simple_copy/simple_copy.o 00:02:44.342 CC test/nvme/cuse/cuse.o 00:02:44.342 LINK lsvmd 00:02:44.342 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:44.342 CC test/accel/dif/dif.o 00:02:44.342 CC test/blobfs/mkfs/mkfs.o 00:02:44.342 LINK memory_ut 00:02:44.342 LINK hello_sock 00:02:44.342 CC 
test/lvol/esnap/esnap.o 00:02:44.342 LINK boot_partition 00:02:44.342 LINK vhost 00:02:44.342 LINK idxd_perf 00:02:44.601 LINK err_injection 00:02:44.601 LINK fused_ordering 00:02:44.601 LINK doorbell_aers 00:02:44.601 LINK reserve 00:02:44.601 LINK mkfs 00:02:44.601 LINK connect_stress 00:02:44.601 LINK simple_copy 00:02:44.601 LINK nvme_dp 00:02:44.601 LINK reset 00:02:44.601 LINK sgl 00:02:44.601 LINK overhead 00:02:44.601 LINK fdp 00:02:44.601 LINK nvme_compliance 00:02:44.601 LINK aer 00:02:44.601 LINK startup 00:02:44.601 LINK thread 00:02:44.860 LINK dif 00:02:45.119 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:45.119 CC examples/nvme/arbitration/arbitration.o 00:02:45.119 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:45.119 CC examples/nvme/hotplug/hotplug.o 00:02:45.119 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:45.119 CC examples/nvme/hello_world/hello_world.o 00:02:45.119 CC examples/nvme/reconnect/reconnect.o 00:02:45.119 CC examples/nvme/abort/abort.o 00:02:45.119 LINK iscsi_fuzz 00:02:45.377 CC examples/accel/perf/accel_perf.o 00:02:45.377 LINK pmr_persistence 00:02:45.377 CC examples/blob/cli/blobcli.o 00:02:45.377 CC test/bdev/bdevio/bdevio.o 00:02:45.377 LINK cmb_copy 00:02:45.377 CC examples/blob/hello_world/hello_blob.o 00:02:45.377 LINK hello_world 00:02:45.377 LINK hotplug 00:02:45.637 LINK arbitration 00:02:45.637 LINK reconnect 00:02:45.637 LINK cuse 00:02:45.637 LINK abort 00:02:45.896 LINK hello_blob 00:02:45.896 LINK bdevio 00:02:45.896 LINK accel_perf 00:02:45.896 LINK blobcli 00:02:46.155 LINK nvme_manage 00:02:46.414 CC examples/bdev/hello_world/hello_bdev.o 00:02:46.414 CC examples/bdev/bdevperf/bdevperf.o 00:02:46.673 LINK hello_bdev 00:02:47.239 LINK bdevperf 00:02:48.172 CC examples/nvmf/nvmf/nvmf.o 00:02:48.172 LINK nvmf 00:02:49.545 LINK esnap 00:02:50.113 00:02:50.113 real 1m39.897s 00:02:50.113 user 18m12.726s 00:02:50.113 sys 4m24.149s 00:02:50.113 22:31:34 make -- common/autotest_common.sh@1124 -- $ 
xtrace_disable 00:02:50.113 22:31:34 make -- common/autotest_common.sh@10 -- $ set +x 00:02:50.113 ************************************ 00:02:50.113 END TEST make 00:02:50.113 ************************************ 00:02:50.113 22:31:34 -- common/autotest_common.sh@1142 -- $ return 0 00:02:50.113 22:31:34 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:02:50.113 22:31:34 -- pm/common@29 -- $ signal_monitor_resources TERM 00:02:50.113 22:31:34 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:02:50.113 22:31:34 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:50.113 22:31:34 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:02:50.113 22:31:34 -- pm/common@44 -- $ pid=2534041 00:02:50.113 22:31:34 -- pm/common@50 -- $ kill -TERM 2534041 00:02:50.113 22:31:34 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:50.113 22:31:34 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:02:50.113 22:31:34 -- pm/common@44 -- $ pid=2534042 00:02:50.113 22:31:34 -- pm/common@50 -- $ kill -TERM 2534042 00:02:50.113 22:31:34 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:50.113 22:31:34 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:02:50.113 22:31:34 -- pm/common@44 -- $ pid=2534045 00:02:50.113 22:31:34 -- pm/common@50 -- $ kill -TERM 2534045 00:02:50.113 22:31:34 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:50.113 22:31:34 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:02:50.113 22:31:34 -- pm/common@44 -- $ pid=2534069 00:02:50.113 22:31:34 -- pm/common@50 -- $ sudo -E kill -TERM 2534069 00:02:50.113 22:31:34 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 
00:02:50.113 22:31:34 -- nvmf/common.sh@7 -- # uname -s 00:02:50.113 22:31:34 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:50.113 22:31:34 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:02:50.113 22:31:34 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:50.113 22:31:34 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:50.113 22:31:34 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:50.113 22:31:34 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:02:50.113 22:31:34 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:50.113 22:31:34 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:02:50.113 22:31:34 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:50.113 22:31:34 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:50.113 22:31:34 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:02:50.113 22:31:34 -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:02:50.113 22:31:34 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:50.113 22:31:34 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:50.113 22:31:34 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:50.113 22:31:34 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:02:50.113 22:31:34 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:02:50.113 22:31:34 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:50.113 22:31:34 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:50.113 22:31:34 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:50.113 22:31:34 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:50.113 22:31:34 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:50.113 22:31:34 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:50.113 22:31:34 -- paths/export.sh@5 -- # export PATH 00:02:50.113 22:31:34 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:50.113 22:31:34 -- nvmf/common.sh@47 -- # : 0 00:02:50.113 22:31:34 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:02:50.113 22:31:34 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:02:50.113 22:31:34 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:02:50.113 22:31:34 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:02:50.113 22:31:34 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:50.113 22:31:34 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:02:50.113 22:31:34 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:02:50.113 22:31:34 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:02:50.113 22:31:34 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:02:50.113 22:31:34 -- spdk/autotest.sh@32 -- # 
uname -s 00:02:50.113 22:31:34 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:50.113 22:31:34 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:50.113 22:31:34 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:02:50.113 22:31:34 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:50.113 22:31:34 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:02:50.113 22:31:34 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:50.113 22:31:34 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:50.113 22:31:34 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:50.113 22:31:34 -- spdk/autotest.sh@48 -- # udevadm_pid=2602126 00:02:50.113 22:31:34 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:02:50.113 22:31:34 -- pm/common@17 -- # local monitor 00:02:50.113 22:31:34 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:50.113 22:31:34 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:50.113 22:31:34 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:50.113 22:31:34 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:50.113 22:31:34 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:50.113 22:31:34 -- pm/common@21 -- # date +%s 00:02:50.113 22:31:34 -- pm/common@25 -- # sleep 1 00:02:50.113 22:31:34 -- pm/common@21 -- # date +%s 00:02:50.113 22:31:34 -- pm/common@21 -- # date +%s 00:02:50.113 22:31:34 -- pm/common@21 -- # date +%s 00:02:50.113 22:31:34 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721075494 00:02:50.113 22:31:34 -- pm/common@21 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721075494 00:02:50.113 22:31:34 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721075494 00:02:50.113 22:31:34 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721075494 00:02:50.113 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721075494_collect-vmstat.pm.log 00:02:50.113 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721075494_collect-cpu-load.pm.log 00:02:50.113 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721075494_collect-cpu-temp.pm.log 00:02:50.372 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721075494_collect-bmc-pm.bmc.pm.log 00:02:51.309 22:31:35 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:51.309 22:31:35 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:02:51.309 22:31:35 -- common/autotest_common.sh@722 -- # xtrace_disable 00:02:51.309 22:31:35 -- common/autotest_common.sh@10 -- # set +x 00:02:51.309 22:31:35 -- spdk/autotest.sh@59 -- # create_test_list 00:02:51.309 22:31:35 -- common/autotest_common.sh@746 -- # xtrace_disable 00:02:51.309 22:31:35 -- common/autotest_common.sh@10 -- # set +x 00:02:51.309 22:31:36 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:02:51.309 22:31:36 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 
00:02:51.309 22:31:36 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:51.309 22:31:36 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:02:51.309 22:31:36 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:51.309 22:31:36 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:02:51.309 22:31:36 -- common/autotest_common.sh@1455 -- # uname 00:02:51.309 22:31:36 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:02:51.309 22:31:36 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:02:51.309 22:31:36 -- common/autotest_common.sh@1475 -- # uname 00:02:51.309 22:31:36 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:02:51.309 22:31:36 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:02:51.309 22:31:36 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:02:51.309 22:31:36 -- spdk/autotest.sh@72 -- # hash lcov 00:02:51.309 22:31:36 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:02:51.309 22:31:36 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:02:51.309 --rc lcov_branch_coverage=1 00:02:51.309 --rc lcov_function_coverage=1 00:02:51.309 --rc genhtml_branch_coverage=1 00:02:51.309 --rc genhtml_function_coverage=1 00:02:51.309 --rc genhtml_legend=1 00:02:51.309 --rc geninfo_all_blocks=1 00:02:51.309 ' 00:02:51.309 22:31:36 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:02:51.309 --rc lcov_branch_coverage=1 00:02:51.309 --rc lcov_function_coverage=1 00:02:51.309 --rc genhtml_branch_coverage=1 00:02:51.309 --rc genhtml_function_coverage=1 00:02:51.309 --rc genhtml_legend=1 00:02:51.309 --rc geninfo_all_blocks=1 00:02:51.309 ' 00:02:51.309 22:31:36 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:02:51.309 --rc lcov_branch_coverage=1 00:02:51.309 --rc lcov_function_coverage=1 00:02:51.309 --rc genhtml_branch_coverage=1 00:02:51.309 --rc genhtml_function_coverage=1 00:02:51.309 --rc genhtml_legend=1 
00:02:51.309 --rc geninfo_all_blocks=1 00:02:51.309 --no-external' 00:02:51.309 22:31:36 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:02:51.309 --rc lcov_branch_coverage=1 00:02:51.309 --rc lcov_function_coverage=1 00:02:51.309 --rc genhtml_branch_coverage=1 00:02:51.309 --rc genhtml_function_coverage=1 00:02:51.309 --rc genhtml_legend=1 00:02:51.309 --rc geninfo_all_blocks=1 00:02:51.309 --no-external' 00:02:51.309 22:31:36 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:02:51.309 lcov: LCOV version 1.14 00:02:51.309 22:31:36 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:03:23.423 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:03:23.423 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:03:50.065 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:03:50.065 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:03:50.065 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:03:50.065 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:03:50.065 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:03:50.065 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:03:50.065 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:03:50.065 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:03:50.065 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:03:50.065 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:03:50.065 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:03:50.065 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:03:50.065 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:03:50.065 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:03:50.065 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:03:50.065 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:03:50.065 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:03:50.065 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:03:50.065 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:03:50.065 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:03:50.065 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:03:50.065 geninfo: WARNING: GCOV 
did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:03:50.065 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:03:50.066 geninfo: 
WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:03:50.066 geninfo: 
WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 
00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:03:50.066 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 
00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:03:50.066 geninfo: WARNING: 
GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:03:50.066 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:03:50.066 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:03:50.067 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:03:50.067 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:03:50.067 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:03:50.067 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:03:50.067 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:03:50.067 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:03:50.067 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:03:50.067 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:03:50.067 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:03:50.067 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:03:50.067 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:03:50.067 geninfo: 
WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:03:50.067 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:03:50.067 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:03:50.067 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:03:50.067 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:03:50.067 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:03:50.067 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:03:50.067 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:03:50.067 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:03:50.067 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:03:50.067 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:03:50.067 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:03:50.067 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:03:50.067 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:03:50.067 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:03:50.067 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:03:50.067 
geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:03:50.067 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:03:50.067 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:03:50.067 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:03:50.067 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:03:50.067 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:03:50.067 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:03:50.067 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:03:50.067 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:03:50.067 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:03:50.067 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:03:50.067 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:03:50.067 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:04:00.045 22:32:43 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:04:00.046 22:32:43 -- common/autotest_common.sh@722 -- # xtrace_disable 00:04:00.046 22:32:43 -- common/autotest_common.sh@10 -- # set +x 00:04:00.046 22:32:43 -- spdk/autotest.sh@91 -- # rm -f 00:04:00.046 22:32:43 -- spdk/autotest.sh@94 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:03.333 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:03.333 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:03.333 0000:5e:00.0 (8086 0b60): Already using the nvme driver 00:04:03.333 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:04:03.333 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:04:03.333 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:04:03.333 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:04:03.333 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:04:03.333 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:04:03.333 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:04:03.333 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:04:03.333 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:04:03.333 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:04:03.334 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:04:03.334 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:04:03.334 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:04:03.334 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:04:03.334 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:04:03.334 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:04:03.334 22:32:48 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:04:03.334 22:32:48 -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:04:03.334 22:32:48 -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:04:03.334 22:32:48 -- common/autotest_common.sh@1670 -- # local nvme bdf 00:04:03.334 22:32:48 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:04:03.334 22:32:48 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:04:03.334 22:32:48 -- 
common/autotest_common.sh@1662 -- # local device=nvme0n1 00:04:03.334 22:32:48 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:03.334 22:32:48 -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:04:03.334 22:32:48 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:04:03.334 22:32:48 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:04:03.334 22:32:48 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:04:03.334 22:32:48 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:04:03.334 22:32:48 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:04:03.334 22:32:48 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:04:03.591 No valid GPT data, bailing 00:04:03.591 22:32:48 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:03.591 22:32:48 -- scripts/common.sh@391 -- # pt= 00:04:03.591 22:32:48 -- scripts/common.sh@392 -- # return 1 00:04:03.591 22:32:48 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:04:03.591 1+0 records in 00:04:03.591 1+0 records out 00:04:03.591 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00312911 s, 335 MB/s 00:04:03.591 22:32:48 -- spdk/autotest.sh@118 -- # sync 00:04:03.591 22:32:48 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:04:03.591 22:32:48 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:04:03.591 22:32:48 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:04:08.870 22:32:53 -- spdk/autotest.sh@124 -- # uname -s 00:04:08.870 22:32:53 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:04:08.870 22:32:53 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:04:08.870 22:32:53 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:08.870 22:32:53 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:08.870 22:32:53 -- 
common/autotest_common.sh@10 -- # set +x 00:04:08.870 ************************************ 00:04:08.870 START TEST setup.sh 00:04:08.870 ************************************ 00:04:08.870 22:32:53 setup.sh -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:04:08.870 * Looking for test storage... 00:04:08.870 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:08.870 22:32:53 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:04:08.870 22:32:53 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:04:08.870 22:32:53 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:04:08.870 22:32:53 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:08.870 22:32:53 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:08.870 22:32:53 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:08.870 ************************************ 00:04:08.870 START TEST acl 00:04:08.870 ************************************ 00:04:08.870 22:32:53 setup.sh.acl -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:04:08.870 * Looking for test storage... 
00:04:08.870 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:08.870 22:32:53 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:04:08.870 22:32:53 setup.sh.acl -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:04:08.870 22:32:53 setup.sh.acl -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:04:08.870 22:32:53 setup.sh.acl -- common/autotest_common.sh@1670 -- # local nvme bdf 00:04:08.870 22:32:53 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:04:08.870 22:32:53 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:04:08.870 22:32:53 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:04:08.870 22:32:53 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:08.870 22:32:53 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:04:08.870 22:32:53 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:04:08.870 22:32:53 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:04:08.870 22:32:53 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:04:08.870 22:32:53 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:04:08.870 22:32:53 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:04:08.870 22:32:53 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:08.870 22:32:53 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:13.065 22:32:57 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:04:13.065 22:32:57 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:04:13.065 22:32:57 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.065 22:32:57 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:04:13.065 22:32:57 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:04:13.065 22:32:57 setup.sh.acl -- setup/common.sh@10 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.397 Hugepages 00:04:16.397 node hugesize free / total 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.397 00:04:16.397 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.397 
22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- 
# [[ 0000:00:04.7 == *:*:*.* ]] 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:16.397 22:33:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.680 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5e:00.0 == *:*:*.* ]] 00:04:16.680 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:04:16.680 22:33:01 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:04:16.680 22:33:01 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:16.680 22:33:01 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:16.680 22:33:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.680 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:04:16.680 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:16.680 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:16.680 22:33:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.680 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:04:16.680 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:16.680 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:16.680 22:33:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.680 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:04:16.680 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 
00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:85:05.5 == *:*:*.* ]] 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d7:05.5 == *:*:*.* ]] 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:16.681 22:33:01 setup.sh.acl -- 
setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:16.681 22:33:01 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:04:16.681 22:33:01 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:16.681 22:33:01 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:16.681 22:33:01 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:16.681 ************************************ 00:04:16.681 START TEST denied 00:04:16.681 ************************************ 00:04:16.681 22:33:01 setup.sh.acl.denied -- common/autotest_common.sh@1123 -- # denied 00:04:16.681 22:33:01 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:5e:00.0' 00:04:16.681 22:33:01 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:04:16.681 22:33:01 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:5e:00.0' 00:04:16.681 22:33:01 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:04:16.681 22:33:01 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:20.877 0000:5e:00.0 (8086 0b60): Skipping denied controller at 0000:5e:00.0 00:04:20.877 22:33:05 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:5e:00.0 00:04:20.877 22:33:05 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:04:20.877 22:33:05 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:04:20.877 22:33:05 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:5e:00.0 ]] 00:04:20.877 22:33:05 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver 00:04:20.877 22:33:05 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:04:20.877 22:33:05 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:20.877 22:33:05 setup.sh.acl.denied 
-- setup/acl.sh@41 -- # setup reset 00:04:20.877 22:33:05 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:20.877 22:33:05 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:26.149 00:04:26.150 real 0m9.375s 00:04:26.150 user 0m3.024s 00:04:26.150 sys 0m5.661s 00:04:26.150 22:33:10 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:26.150 22:33:10 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:04:26.150 ************************************ 00:04:26.150 END TEST denied 00:04:26.150 ************************************ 00:04:26.150 22:33:10 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:04:26.150 22:33:10 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:26.150 22:33:10 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:26.150 22:33:10 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:26.150 22:33:10 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:26.150 ************************************ 00:04:26.150 START TEST allowed 00:04:26.150 ************************************ 00:04:26.150 22:33:10 setup.sh.acl.allowed -- common/autotest_common.sh@1123 -- # allowed 00:04:26.150 22:33:10 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:5e:00.0 00:04:26.150 22:33:10 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:04:26.150 22:33:10 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:04:26.150 22:33:10 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:26.150 22:33:10 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:5e:00.0 .*: nvme -> .*' 00:04:32.723 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:04:32.723 22:33:17 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:04:32.723 22:33:17 
setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:04:32.723 22:33:17 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:04:32.723 22:33:17 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:32.723 22:33:17 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:36.914 00:04:36.914 real 0m10.568s 00:04:36.914 user 0m2.763s 00:04:36.914 sys 0m5.321s 00:04:36.914 22:33:21 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:36.914 22:33:21 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:04:36.914 ************************************ 00:04:36.914 END TEST allowed 00:04:36.914 ************************************ 00:04:36.914 22:33:21 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:04:36.914 00:04:36.914 real 0m28.119s 00:04:36.914 user 0m8.731s 00:04:36.914 sys 0m16.499s 00:04:36.914 22:33:21 setup.sh.acl -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:36.914 22:33:21 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:36.914 ************************************ 00:04:36.914 END TEST acl 00:04:36.914 ************************************ 00:04:36.914 22:33:21 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:36.914 22:33:21 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:04:36.914 22:33:21 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:36.914 22:33:21 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:36.914 22:33:21 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:36.914 ************************************ 00:04:36.914 START TEST hugepages 00:04:36.914 ************************************ 00:04:36.914 22:33:21 setup.sh.hugepages -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:04:36.914 * Looking for test storage... 00:04:36.914 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 76611724 kB' 'MemAvailable: 79912144 kB' 'Buffers: 12176 kB' 'Cached: 9591716 kB' 'SwapCached: 0 kB' 'Active: 6659436 kB' 'Inactive: 3456260 
kB' 'Active(anon): 6265852 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 515080 kB' 'Mapped: 168380 kB' 'Shmem: 5754048 kB' 'KReclaimable: 208440 kB' 'Slab: 542864 kB' 'SReclaimable: 208440 kB' 'SUnreclaim: 334424 kB' 'KernelStack: 16128 kB' 'PageTables: 8164 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52438188 kB' 'Committed_AS: 7681584 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200920 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.914 
22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.914 22:33:21 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.914 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 
22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # 
continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- 
setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 
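The xtrace above is setup/common.sh scanning /proc/meminfo one "Key: value" line at a time, hitting `continue` for every key that is not `Hugepagesize`. A minimal standalone sketch of that same read-loop pattern, with hypothetical function and file names (not the actual SPDK helpers), run against a small fake meminfo file so it is deterministic:

```shell
# Hypothetical sketch of the get_meminfo scan traced above: split each
# "Key: value [unit]" line on ': ' and return the value for one key.
get_meminfo_value() {
    local get=$1 file=$2 var val _
    while IFS=': ' read -r var val _; do
        # Every non-matching key is skipped, as the 'continue' trace shows.
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < "$file"
    return 1
}

# Fake meminfo input (values taken from the snapshot later in this log).
cat > /tmp/fake_meminfo <<'EOF'
MemTotal: 92293472 kB
HugePages_Total: 1024
Hugepagesize: 2048 kB
EOF

get_meminfo_value Hugepagesize /tmp/fake_meminfo   # -> 2048
```

The `_` in `read -r var val _` swallows the trailing "kB" unit, which is why the trace's final `echo 2048` carries a bare number.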
00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:36.915 22:33:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@17 -- # 
default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:36.916 22:33:21 setup.sh.hugepages -- 
setup/hugepages.sh@41 -- # echo 0 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:36.916 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:37.175 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:37.175 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:37.175 22:33:21 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:04:37.175 22:33:21 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:37.175 22:33:21 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:37.175 22:33:21 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:37.175 ************************************ 00:04:37.175 START TEST default_setup 00:04:37.175 ************************************ 00:04:37.175 22:33:21 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1123 -- # default_setup 00:04:37.175 22:33:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:04:37.175 22:33:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:04:37.175 22:33:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:37.175 22:33:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:04:37.175 22:33:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:37.175 22:33:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 
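The `clear_hp` trace above (setup/hugepages.sh@37-41) walks every NUMA node's hugepage-size directories and writes 0 into each `nr_hugepages`. A hedged sketch of that flow, using a fake sysfs tree so it runs without root (the path layout mirrors `/sys/devices/system/node/node*/hugepages/hugepages-*` but the names here are stand-ins):

```shell
# Hypothetical sketch of the clear_hp loop traced above: zero the
# per-node, per-size hugepage pools.
clear_hp() {
    local sysfs=$1 node hp
    for node in "$sysfs"/node*; do
        for hp in "$node"/hugepages/hugepages-*; do
            echo 0 > "$hp/nr_hugepages"
        done
    done
}

# Build a fake two-node layout (2 MiB and 1 GiB pools) for the demo.
mkdir -p /tmp/fake_sys/node0/hugepages/hugepages-2048kB
mkdir -p /tmp/fake_sys/node1/hugepages/hugepages-1048576kB
echo 1024 > /tmp/fake_sys/node0/hugepages/hugepages-2048kB/nr_hugepages
echo 4    > /tmp/fake_sys/node1/hugepages/hugepages-1048576kB/nr_hugepages

clear_hp /tmp/fake_sys
cat /tmp/fake_sys/node0/hugepages/hugepages-2048kB/nr_hugepages   # -> 0
```

Against the real sysfs this needs root, and writing 0 releases the reserved pages back to the kernel before the next test allocates its own pool.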
00:04:37.175 22:33:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:37.175 22:33:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:37.175 22:33:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:37.175 22:33:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:37.175 22:33:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:04:37.175 22:33:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:37.175 22:33:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:37.175 22:33:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:37.175 22:33:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:37.175 22:33:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:37.175 22:33:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:37.175 22:33:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:37.175 22:33:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:04:37.175 22:33:21 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:04:37.175 22:33:21 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:04:37.175 22:33:21 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:41.370 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:41.370 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:41.370 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:41.370 0000:00:04.6 
(8086 2021): ioatdma -> vfio-pci 00:04:41.370 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:41.370 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:41.370 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:41.370 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:41.370 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:41.370 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:41.370 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:41.370 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:41.370 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:41.370 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:41.370 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:41.370 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:41.370 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:41.370 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:43.913 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:43.913 22:33:28 
setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78724788 kB' 'MemAvailable: 82025192 kB' 'Buffers: 12176 kB' 'Cached: 9591836 kB' 'SwapCached: 0 kB' 'Active: 6677264 kB' 'Inactive: 3456260 kB' 'Active(anon): 6283680 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532740 kB' 'Mapped: 168448 kB' 'Shmem: 5754168 kB' 'KReclaimable: 208408 kB' 'Slab: 541964 kB' 'SReclaimable: 208408 kB' 'SUnreclaim: 333556 kB' 'KernelStack: 16704 kB' 'PageTables: 9336 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7698944 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201080 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 
kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.913 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 
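The trace above (setup/common.sh@18-23) shows `get_meminfo` defaulting `mem_f=/proc/meminfo` and only switching to a per-node file when a node argument is given and `/sys/devices/system/node/node$node/meminfo` exists; with `node=` empty, the `[[ -e .../node/meminfo ]]` test fails and the global file is used. A small sketch of that selection logic under hypothetical names:

```shell
# Hypothetical sketch of the mem_f selection traced above: prefer the
# per-node meminfo file when a node id is supplied and present in sysfs,
# otherwise fall back to the global /proc/meminfo.
pick_meminfo_file() {
    local node=$1 mem_f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    echo "$mem_f"
}

pick_meminfo_file ""     # no node given -> /proc/meminfo
pick_meminfo_file 999    # nonexistent node -> /proc/meminfo
```

This is why the trace prints the odd-looking `node/meminfo` path: the empty `$node` collapses `node$node` to just `node`, the existence check fails, and the function reads the system-wide file instead.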
00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 
-- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 
22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.914 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.915 22:33:28 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # 
[[ -e /sys/devices/system/node/node/meminfo ]] 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78724108 kB' 'MemAvailable: 82024512 kB' 'Buffers: 12176 kB' 'Cached: 9591840 kB' 'SwapCached: 0 kB' 'Active: 6677760 kB' 'Inactive: 3456260 kB' 'Active(anon): 6284176 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533260 kB' 'Mapped: 168476 kB' 'Shmem: 5754172 kB' 'KReclaimable: 208408 kB' 'Slab: 541964 kB' 'SReclaimable: 208408 kB' 'SUnreclaim: 333556 kB' 'KernelStack: 16688 kB' 'PageTables: 9080 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7698728 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200904 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ 
MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue
00:04:43.915 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31-32 -- # (scanned and skipped every remaining /proc/meminfo key, MemFree through HugePages_Rsvd -- none matched HugePages_Surp)
00:04:43.917 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:43.917 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0
00:04:43.917 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0
00:04:43.917 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0
00:04:43.917 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:43.917 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:43.917 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=
00:04:43.917 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val
00:04:43.917 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local
mem_f mem 00:04:43.917 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:43.917 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:43.917 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:43.917 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:43.917 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:43.917 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.917 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.917 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78725356 kB' 'MemAvailable: 82025760 kB' 'Buffers: 12176 kB' 'Cached: 9591856 kB' 'SwapCached: 0 kB' 'Active: 6676564 kB' 'Inactive: 3456260 kB' 'Active(anon): 6282980 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532032 kB' 'Mapped: 168476 kB' 'Shmem: 5754188 kB' 'KReclaimable: 208408 kB' 'Slab: 541952 kB' 'SReclaimable: 208408 kB' 'SUnreclaim: 333544 kB' 'KernelStack: 16192 kB' 'PageTables: 7852 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7698984 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200840 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 
'Hugetlb: 2097152 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB'
00:04:43.917 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:43.917 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue
00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31-32 -- # (scanning and skipping the remaining /proc/meminfo keys, MemFree through PageTables -- none matched HugePages_Rsvd)
00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- #
IFS=': ' 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.918 
22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.918 22:33:28 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.918 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r 
var val _ 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:43.919 nr_hugepages=1024 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:43.919 resv_hugepages=0 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:43.919 surplus_hugepages=0 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:43.919 anon_hugepages=0 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- 
setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.919 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78724936 kB' 'MemAvailable: 82025340 kB' 'Buffers: 12176 kB' 'Cached: 9591856 kB' 'SwapCached: 0 kB' 'Active: 6676936 kB' 'Inactive: 3456260 kB' 'Active(anon): 6283352 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532428 kB' 'Mapped: 168476 kB' 'Shmem: 5754188 kB' 'KReclaimable: 208408 kB' 'Slab: 541952 kB' 'SReclaimable: 208408 kB' 'SUnreclaim: 333544 kB' 'KernelStack: 16336 kB' 'PageTables: 8048 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7699008 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB'
[... get_meminfo HugePages_Total: the identical setup/common.sh@31/@32 continue/read trace repeats once per non-matching /proc/meminfo key (MemTotal through Unaccepted); repeated trace lines elided ...]
00:04:43.920 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:43.920 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( 
no_nodes > 0 )) 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 36543604 kB' 'MemUsed: 11573336 kB' 'SwapCached: 0 kB' 'Active: 5352260 kB' 'Inactive: 3372048 kB' 'Active(anon): 5194356 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8459484 kB' 'Mapped: 74872 kB' 'AnonPages: 267916 kB' 
'Shmem: 4929532 kB' 'KernelStack: 8760 kB' 'PageTables: 4696 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124220 kB' 'Slab: 334380 kB' 'SReclaimable: 124220 kB' 'SUnreclaim: 210160 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.921 
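The node0 snapshot printed above can be sanity-checked: `MemUsed` in the per-node meminfo is `MemTotal - MemFree`, and the three values the trace reports are mutually consistent. A quick check with the numbers copied from the trace:

```shell
#!/usr/bin/env bash
# Verify the node0 figures from the trace above (all values in kB):
# MemUsed should equal MemTotal - MemFree.
mem_total=48116940
mem_free=36543604
mem_used=$(( mem_total - mem_free ))   # expect 11573336, as printed
```

So the 11573336 kB `MemUsed` figure in the snapshot is exactly the difference of the other two, as expected for the sysfs per-node meminfo.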
22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.921 22:33:28 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.921 22:33:28 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.921 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.921 22:33:28 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup 
-- setup/common.sh@31 -- # read -r var val _ 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # IFS=': ' 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- 
setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:43.922 node0=1024 expecting 1024 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:43.922 00:04:43.922 real 0m6.655s 00:04:43.922 user 0m1.688s 00:04:43.922 sys 0m2.709s 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:43.922 22:33:28 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:04:43.922 ************************************ 00:04:43.922 END TEST default_setup 00:04:43.922 ************************************ 00:04:43.922 22:33:28 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:43.922 22:33:28 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:04:43.922 22:33:28 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:43.922 22:33:28 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:43.922 22:33:28 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:43.922 ************************************ 00:04:43.922 START TEST per_node_1G_alloc 00:04:43.922 ************************************ 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1123 -- # per_node_1G_alloc 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:43.922 22:33:28 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:43.922 22:33:28 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:43.922 22:33:28 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:47.214 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:47.214 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:47.214 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:47.214 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:47.214 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:47.214 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:47.214 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:47.214 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:47.214 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:47.214 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:47.214 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:47.214 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:47.214 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:47.214 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:47.214 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:47.214 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:47.214 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:47.214 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 
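The `get_test_nr_hugepages 1048576 0 1` trace above converts a 1 GiB request into a per-node page count: 1048576 kB divided by the 2048 kB default hugepage size gives 512 pages, and each node the caller listed (0 and 1) gets that count, which is why the test ends up exporting `NRHUGE=512 HUGENODE=0,1`. A hedged sketch of that arithmetic (an illustration of the traced logic, not the real setup/hugepages.sh):

```shell
#!/usr/bin/env bash
# Sketch of the per-node hugepage sizing traced above: a 1 GiB request
# with 2048 kB default hugepages becomes 512 pages on each listed node.
size_kb=1048576          # requested size, from the trace
default_hugepage_kb=2048 # default hugepage size, from the trace
nr_hugepages=$(( size_kb / default_hugepage_kb ))

# Assign the count to every node the caller asked for (nodes 0 and 1),
# mirroring the nodes_test[_no_nodes]=512 assignments in the trace.
nodes_test=()
for node in 0 1; do
    nodes_test[node]=$nr_hugepages
done
```

The subsequent `verify_nr_hugepages` stage then reads the kernel's actual `HugePages_Total` per node back out of sysfs and compares it against these expected values.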
00:04:47.214 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 
00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78717808 kB' 'MemAvailable: 82018216 kB' 'Buffers: 12176 kB' 'Cached: 9591968 kB' 'SwapCached: 0 kB' 'Active: 6670500 kB' 'Inactive: 3456260 kB' 'Active(anon): 6276916 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525892 kB' 'Mapped: 167504 kB' 'Shmem: 5754300 kB' 'KReclaimable: 208416 kB' 'Slab: 542300 kB' 'SReclaimable: 208416 kB' 'SUnreclaim: 333884 kB' 'KernelStack: 16208 kB' 'PageTables: 7996 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7691596 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200984 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.482 22:33:32 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.482 
22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.482 22:33:32 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.482 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 22:33:32 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.483 22:33:32 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 
22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e 
/sys/devices/system/node/node/meminfo ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78718172 kB' 'MemAvailable: 82018580 kB' 'Buffers: 12176 kB' 'Cached: 9591972 kB' 'SwapCached: 0 kB' 'Active: 6669816 kB' 'Inactive: 3456260 kB' 'Active(anon): 6276232 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525188 kB' 'Mapped: 167444 kB' 'Shmem: 5754304 kB' 'KReclaimable: 208416 kB' 'Slab: 542300 kB' 'SReclaimable: 208416 kB' 'SUnreclaim: 333884 kB' 'KernelStack: 16192 kB' 'PageTables: 7928 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7691616 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200968 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.483 22:33:32 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.483 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 
22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:47.484 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[
HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78718172 kB' 'MemAvailable: 82018580 kB' 'Buffers: 12176 kB' 'Cached: 9591972 kB' 'SwapCached: 0 kB' 'Active: 6669492 kB' 'Inactive: 3456260 kB' 'Active(anon): 6275908 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 
kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 524860 kB' 'Mapped: 167444 kB' 'Shmem: 5754304 kB' 'KReclaimable: 208416 kB' 'Slab: 542300 kB' 'SReclaimable: 208416 kB' 'SUnreclaim: 333884 kB' 'KernelStack: 16192 kB' 'PageTables: 7928 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7691640 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200968 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:47.485 22:33:32 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.485 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.487 22:33:32
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:47.487 nr_hugepages=1024 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:47.487 resv_hugepages=0 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:47.487 surplus_hugepages=0 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:47.487 anon_hugepages=0 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 
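The trace above is `get_meminfo` scanning a meminfo file key by key (`IFS=': '` split, `read -r var val _`, compare against the requested key, `continue` on mismatch) and echoing the matched value, or 0 when the key is absent. A minimal re-creation of that helper is sketched below; this is a hypothetical reconstruction (the real implementation lives in the SPDK test scripts' setup/common.sh), and the `MEMINFO_FILE` override is an assumption added here so the sketch can be exercised against a fixture file.

```shell
# Hypothetical reconstruction of the get_meminfo helper exercised by the
# trace above. Scans a meminfo-style file line by line and echoes the value
# for the requested key; echoes 0 when the key is absent.
get_meminfo() {
    local get=$1 node=${2:-}
    # MEMINFO_FILE is an assumption added for testability; the real helper
    # hardcodes /proc/meminfo or the per-node sysfs path.
    local mem_f=${MEMINFO_FILE:-/proc/meminfo}
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    local line var val _
    while read -r line; do
        # Per-node meminfo lines carry a "Node <N> " prefix; drop it.
        [[ -n $node ]] && line=${line#"Node $node "}
        IFS=': ' read -r var val _ <<<"$line"
        if [[ $var == "$get" ]]; then
            echo "${val:-0}"
            return 0
        fi
    done <"$mem_f"
    echo 0
}
```

With this sketch, the values the trace derives (`surp=0`, `resv=0`, a total of 1024 2 MB hugepages) correspond to `get_meminfo HugePages_Surp`, `get_meminfo HugePages_Rsvd`, and `get_meminfo HugePages_Total` against the dumped meminfo contents.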
00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78718172 kB' 'MemAvailable: 82018580 kB' 'Buffers: 12176 kB' 'Cached: 9592032 kB' 'SwapCached: 0 kB' 'Active: 6669864 kB' 'Inactive: 3456260 kB' 'Active(anon): 6276280 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525132 kB' 'Mapped: 167444 kB' 'Shmem: 5754364 kB' 'KReclaimable: 208416 kB' 'Slab: 542300 kB' 'SReclaimable: 208416 kB' 'SUnreclaim: 333884 kB' 'KernelStack: 16192 kB' 'PageTables: 7928 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7691660 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200984 kB' 'VmallocChunk: 0 
kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.487 22:33:32 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.487 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
continue 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.488 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.489 22:33:32 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- 
setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # 
mem=("${mem[@]#Node +([0-9]) }") 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37585180 kB' 'MemUsed: 10531760 kB' 'SwapCached: 0 kB' 'Active: 5349400 kB' 'Inactive: 3372048 kB' 'Active(anon): 5191496 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8459484 kB' 'Mapped: 74648 kB' 'AnonPages: 265100 kB' 'Shmem: 4929532 kB' 'KernelStack: 8696 kB' 'PageTables: 4480 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124220 kB' 'Slab: 334856 kB' 'SReclaimable: 124220 kB' 'SUnreclaim: 210636 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
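The `get_nodes` step traced above enumerates NUMA nodes by globbing `/sys/devices/system/node/node+([0-9])` with extglob and keying an array by the numeric suffix of each directory name. A runnable sketch of that enumeration, pointed at a temporary fake sysfs tree so it works off-target (directory layout and variable names are assumptions mirroring the trace):

```shell
#!/usr/bin/env bash
# Sketch of get_nodes-style NUMA enumeration against a fake node tree.
shopt -s extglob nullglob    # extglob for +([0-9]); nullglob so zero nodes -> empty loop
fake_sys=$(mktemp -d)
mkdir -p "$fake_sys"/node0 "$fake_sys"/node1

declare -A nodes_sys
for node in "$fake_sys"/node+([0-9]); do
    # ${node##*node} strips everything through the last "node",
    # leaving just the numeric node id as the array key.
    nodes_sys[${node##*node}]=512
done
no_nodes=${#nodes_sys[@]}
echo "no_nodes=$no_nodes ids=${!nodes_sys[*]}"

rm -rf "$fake_sys"
```

With two fake node directories this reports `no_nodes=2`, matching the `no_nodes=2` and the two `nodes_sys[...]=512` assignments in the trace; on a real system the glob would run over `/sys/devices/system/node/` instead.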
00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.489 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
[xtrace condensed: the setup/common.sh@31-32 read/continue loop repeats identically for each remaining node0 meminfo field (Dirty through FilePmdMapped and Unaccepted); none match HugePages_Surp]
00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.490 22:33:32
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 41135348 kB' 'MemUsed: 3041184 kB' 'SwapCached: 0 kB' 'Active: 1321028 kB' 'Inactive: 84212 kB' 'Active(anon): 1085348 kB' 'Inactive(anon): 0 kB' 'Active(file): 235680 kB' 'Inactive(file): 84212 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1144744 kB' 'Mapped: 92796 kB' 'AnonPages: 260692 kB' 'Shmem: 824852 kB' 'KernelStack: 7496 kB' 'PageTables: 3444 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 84196 kB' 'Slab: 207444 kB' 'SReclaimable: 84196 kB' 'SUnreclaim: 123248 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:47.490 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:47.491 22:33:32 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
[xtrace condensed: the setup/common.sh@31-32 read/continue loop repeats identically for each node1 meminfo field (MemFree through Unaccepted, plus HugePages_Total and HugePages_Free); none match HugePages_Surp]
00:04:47.492 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:47.492 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:47.492 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:47.492 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:47.492 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:47.492 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:47.492 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- #
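The get_meminfo calls traced above scan a meminfo listing field by field with `IFS=': '` until the requested key is found, then echo its value. A minimal standalone sketch of that technique follows; it assumes a plain `Key: value [kB]` format, whereas the real setup/common.sh additionally strips the `Node N ` prefix from per-node meminfo files via mapfile.

```shell
# Minimal sketch of the get_meminfo technique traced above: walk a
# meminfo-style "Key: value [kB]" listing with IFS=': ' and echo the
# value of the requested field. (The real setup/common.sh also strips
# the "Node N " prefix from per-node meminfo files; omitted here.)
get_meminfo_sketch() {
    local get=$1 mem_f=${2:-/proc/meminfo}
    local var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # skip non-matching fields
        echo "$val"
        return 0
    done < "$mem_f"
    return 1   # requested field not present
}
```

For example, `get_meminfo_sketch HugePages_Surp /sys/devices/system/node/node1/meminfo`-style input with `HugePages_Surp: 0` prints `0`, matching the `echo 0` seen in the trace.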
sorted_s[nodes_sys[node]]=1 00:04:47.492 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:47.492 node0=512 expecting 512 00:04:47.492 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:47.492 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:47.492 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:47.492 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:47.492 node1=512 expecting 512 00:04:47.492 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:47.492 00:04:47.492 real 0m3.738s 00:04:47.492 user 0m1.429s 00:04:47.492 sys 0m2.398s 00:04:47.492 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:47.492 22:33:32 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:47.492 ************************************ 00:04:47.492 END TEST per_node_1G_alloc 00:04:47.492 ************************************ 00:04:47.844 22:33:32 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:47.844 22:33:32 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:04:47.844 22:33:32 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:47.844 22:33:32 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:47.844 22:33:32 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:47.844 ************************************ 00:04:47.844 START TEST even_2G_alloc 00:04:47.844 ************************************ 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1123 -- # even_2G_alloc 00:04:47.844 
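The get_test_nr_hugepages_per_node trace that follows counts `_no_nodes` down and fills `nodes_test[_no_nodes - 1]` from the back. A hypothetical sketch of that even split, reusing the `nodes_test`/`_no_nodes` names from the trace (the helper name and the division step are assumptions for illustration; the trace assigns the per-node value directly):

```shell
# Hypothetical sketch of the even per-node hugepage split seen in the
# hugepages.sh trace: divide the budget by the node count, then fill
# nodes_test from the last node down while _no_nodes counts to zero.
split_hugepages_per_node() {
    local _nr_hugepages=$1 _no_nodes=$2
    local -a nodes_test=()
    local per_node=$((_nr_hugepages / _no_nodes))
    while (( _no_nodes > 0 )); do
        nodes_test[_no_nodes - 1]=$per_node   # fill from the back
        (( _no_nodes-- ))
    done
    echo "${nodes_test[@]}"
}
```

With 1024 pages over 2 nodes this yields 512 per node, matching the `node0=512 expecting 512` / `node1=512 expecting 512` results above.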
22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:47.844 22:33:32 
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:47.844 22:33:32 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:51.134 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:51.134 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:51.134 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:51.134 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:51.134 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:51.134 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:51.134 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:51.134 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:51.134 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:51.134 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:51.134 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:51.134 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:51.134 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:51.134 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 
00:04:51.134 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:51.134 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:51.134 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:51.396 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:51.396 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@25 -- # [[ -n '' ]] 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78721128 kB' 'MemAvailable: 82021536 kB' 'Buffers: 12176 kB' 'Cached: 9592116 kB' 'SwapCached: 0 kB' 'Active: 6671172 kB' 'Inactive: 3456260 kB' 'Active(anon): 6277588 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525832 kB' 'Mapped: 167592 kB' 'Shmem: 5754448 kB' 'KReclaimable: 208416 kB' 'Slab: 542376 kB' 'SReclaimable: 208416 kB' 'SUnreclaim: 333960 kB' 'KernelStack: 16192 kB' 'PageTables: 8004 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7692008 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201032 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.396 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 
22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.397 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc 
-- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78721076 kB' 'MemAvailable: 82021484 kB' 'Buffers: 12176 kB' 'Cached: 9592120 kB' 'SwapCached: 0 kB' 'Active: 6670384 kB' 'Inactive: 3456260 kB' 'Active(anon): 6276800 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525548 kB' 'Mapped: 167480 kB' 'Shmem: 5754452 kB' 'KReclaimable: 208416 kB' 'Slab: 542320 kB' 'SReclaimable: 208416 kB' 'SUnreclaim: 333904 kB' 'KernelStack: 16176 kB' 'PageTables: 7932 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7692024 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201016 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc 
-- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.398 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 
22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
[[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.399 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:51.400 22:33:36 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78720688 kB' 'MemAvailable: 82021096 kB' 'Buffers: 12176 kB' 'Cached: 9592136 kB' 'SwapCached: 0 kB' 'Active: 6670420 kB' 'Inactive: 3456260 kB' 'Active(anon): 6276836 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525548 kB' 'Mapped: 167480 kB' 'Shmem: 5754468 kB' 'KReclaimable: 208416 kB' 'Slab: 542320 kB' 'SReclaimable: 208416 kB' 'SUnreclaim: 333904 kB' 'KernelStack: 16176 kB' 'PageTables: 7932 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7692044 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201016 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc 
-- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val 
_ 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.400 22:33:36 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.400 22:33:36 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.400 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 
22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:51.401 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:51.401 nr_hugepages=1024 00:04:51.402 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:51.402 resv_hugepages=0 00:04:51.402 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:51.402 surplus_hugepages=0 00:04:51.402 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:51.402 anon_hugepages=0 00:04:51.402 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:51.402 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:51.402 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:51.402 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:51.402 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:51.402 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:51.402 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:51.402 22:33:36 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:51.402 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:51.402 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:51.402 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:51.402 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78722052 kB' 'MemAvailable: 82022460 kB' 'Buffers: 12176 kB' 'Cached: 9592160 kB' 'SwapCached: 0 kB' 'Active: 6670720 kB' 'Inactive: 3456260 kB' 'Active(anon): 6277136 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526328 kB' 'Mapped: 167984 kB' 'Shmem: 5754492 kB' 'KReclaimable: 208416 kB' 'Slab: 542320 kB' 'SReclaimable: 208416 kB' 'SUnreclaim: 333904 kB' 'KernelStack: 16112 kB' 'PageTables: 7704 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7693456 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 
'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.663 22:33:36 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.663 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val 
_ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.664 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@18 -- # local node=0 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37587160 kB' 'MemUsed: 10529780 kB' 'SwapCached: 0 kB' 'Active: 5355452 kB' 'Inactive: 3372048 kB' 'Active(anon): 5197548 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8459556 kB' 'Mapped: 75384 kB' 'AnonPages: 271152 kB' 'Shmem: 4929604 kB' 'KernelStack: 8712 kB' 'PageTables: 4556 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124220 kB' 'Slab: 334864 kB' 'SReclaimable: 124220 kB' 'SUnreclaim: 210644 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.665 22:33:36 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.665 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.665 22:33:36 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.665 [... 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31-32: remaining node0 /proc/meminfo fields (Unevictable through HugePages_Free) compared against HugePages_Surp and skipped via continue; repeated xtrace lines elided ...] 00:04:51.667 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.667 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:51.667 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:51.667 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:51.667 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:51.667 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:51.667 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:51.667 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:51.667 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:04:51.667 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:51.667 22:33:36 setup.sh.hugepages.even_2G_alloc --
setup/common.sh@20 -- # local mem_f mem 00:04:51.667 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:51.667 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:51.667 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:51.667 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:51.667 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:51.667 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 41137920 kB' 'MemUsed: 3038612 kB' 'SwapCached: 0 kB' 'Active: 1320496 kB' 'Inactive: 84212 kB' 'Active(anon): 1084816 kB' 'Inactive(anon): 0 kB' 'Active(file): 235680 kB' 'Inactive(file): 84212 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1144800 kB' 'Mapped: 92948 kB' 'AnonPages: 259924 kB' 'Shmem: 824908 kB' 'KernelStack: 7416 kB' 'PageTables: 3220 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 84196 kB' 'Slab: 207456 kB' 'SReclaimable: 84196 kB' 'SUnreclaim: 123260 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:51.667 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:51.667 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:51.667 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.667 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:51.667 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # 
IFS=': ' [... 00:04:51.667-00:04:51.668 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31-32: node1 /proc/meminfo fields (MemFree through HugePages_Free) compared against HugePages_Surp and skipped via continue; repeated xtrace lines elided ...] 00:04:51.668 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:51.668 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:51.668 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:51.668 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:51.668 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:51.668 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:51.668 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:51.668 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:51.668 node0=512 expecting 512 00:04:51.668 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:51.668 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:51.668 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:51.668 22:33:36
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:51.668 node1=512 expecting 512 00:04:51.668 22:33:36 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:51.668 00:04:51.668 real 0m3.968s 00:04:51.668 user 0m1.494s 00:04:51.668 sys 0m2.577s 00:04:51.668 22:33:36 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:51.668 22:33:36 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:51.668 ************************************ 00:04:51.668 END TEST even_2G_alloc 00:04:51.668 ************************************ 00:04:51.668 22:33:36 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:51.668 22:33:36 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:04:51.668 22:33:36 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:51.668 22:33:36 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:51.668 22:33:36 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:51.668 ************************************ 00:04:51.668 START TEST odd_alloc 00:04:51.668 ************************************ 00:04:51.668 22:33:36 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1123 -- # odd_alloc 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # 
get_test_nr_hugepages_per_node 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:04:51.669 22:33:36 
setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:51.669 22:33:36 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:55.872 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:55.872 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:55.872 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:55.872 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:55.872 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:55.872 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:55.872 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:55.872 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:55.872 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:55.872 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:55.872 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:55.872 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:55.872 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:55.872 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:55.872 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:55.872 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:55.872 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:55.872 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:55.872 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:55.872 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- 
setup/hugepages.sh@91 -- # local sorted_s 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78701840 kB' 'MemAvailable: 82002248 kB' 'Buffers: 12176 kB' 'Cached: 9592280 kB' 'SwapCached: 0 kB' 'Active: 6671104 kB' 'Inactive: 3456260 kB' 'Active(anon): 6277520 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 
3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526276 kB' 'Mapped: 167608 kB' 'Shmem: 5754612 kB' 'KReclaimable: 208416 kB' 'Slab: 542492 kB' 'SReclaimable: 208416 kB' 'SUnreclaim: 334076 kB' 'KernelStack: 16208 kB' 'PageTables: 8060 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7692972 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201016 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.873 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.873 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.873 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:55.874 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78701988 kB' 'MemAvailable: 82002396 kB' 'Buffers: 12176 kB' 'Cached: 9592284 kB' 'SwapCached: 0 kB' 'Active: 6671212 kB' 'Inactive: 3456260 kB' 'Active(anon): 6277628 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526328 kB' 'Mapped: 167492 kB' 'Shmem: 5754616 kB' 'KReclaimable: 208416 kB' 'Slab: 542476 kB' 'SReclaimable: 208416 kB' 'SUnreclaim: 334060 kB' 'KernelStack: 16208 kB' 'PageTables: 8044 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7692988 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- 
# continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.874 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 
22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.875 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78703092 kB' 'MemAvailable: 82003500 kB' 'Buffers: 12176 kB' 'Cached: 9592300 kB' 'SwapCached: 0 kB' 'Active: 6671220 kB' 'Inactive: 
3456260 kB' 'Active(anon): 6277636 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526300 kB' 'Mapped: 167492 kB' 'Shmem: 5754632 kB' 'KReclaimable: 208416 kB' 'Slab: 542476 kB' 'SReclaimable: 208416 kB' 'SUnreclaim: 334060 kB' 'KernelStack: 16192 kB' 'PageTables: 7992 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7693008 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.876 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d 
]] 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 
22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.877 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.878 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.878 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:04:55.878 nr_hugepages=1025 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:55.878 resv_hugepages=0 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:55.878 surplus_hugepages=0 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:55.878 anon_hugepages=0 00:04:55.878 
22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78704272 kB' 'MemAvailable: 82004680 kB' 'Buffers: 12176 kB' 'Cached: 9592320 kB' 'SwapCached: 0 kB' 'Active: 6671256 kB' 'Inactive: 3456260 kB' 'Active(anon): 6277672 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526300 kB' 'Mapped: 167492 kB' 'Shmem: 5754652 kB' 
'KReclaimable: 208416 kB' 'Slab: 542476 kB' 'SReclaimable: 208416 kB' 'SUnreclaim: 334060 kB' 'KernelStack: 16192 kB' 'PageTables: 7992 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7693028 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 
22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 
22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.878 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local 
mem_f mem 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37556624 kB' 'MemUsed: 10560316 kB' 'SwapCached: 0 kB' 'Active: 5349140 kB' 'Inactive: 3372048 kB' 'Active(anon): 5191236 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8459612 kB' 'Mapped: 74696 kB' 'AnonPages: 264724 kB' 'Shmem: 4929660 kB' 'KernelStack: 8680 kB' 'PageTables: 4436 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124220 kB' 'Slab: 334844 kB' 'SReclaimable: 124220 kB' 'SUnreclaim: 210624 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.879 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.880 
22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.880 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.880 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.880 22:33:40 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 41147648 kB' 'MemUsed: 3028884 
kB' 'SwapCached: 0 kB' 'Active: 1322108 kB' 'Inactive: 84212 kB' 'Active(anon): 1086428 kB' 'Inactive(anon): 0 kB' 'Active(file): 235680 kB' 'Inactive(file): 84212 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1144928 kB' 'Mapped: 92796 kB' 'AnonPages: 261464 kB' 'Shmem: 825036 kB' 'KernelStack: 7496 kB' 'PageTables: 3504 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 84196 kB' 'Slab: 207632 kB' 'SReclaimable: 84196 kB' 'SUnreclaim: 123436 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ 
SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.880 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:55.881 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:55.882 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:55.882 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:04:55.882 node0=512 expecting 
513 00:04:55.882 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:55.882 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:55.882 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:55.882 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:04:55.882 node1=513 expecting 512 00:04:55.882 22:33:40 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:04:55.882 00:04:55.882 real 0m3.959s 00:04:55.882 user 0m1.476s 00:04:55.882 sys 0m2.583s 00:04:55.882 22:33:40 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:55.882 22:33:40 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:55.882 ************************************ 00:04:55.882 END TEST odd_alloc 00:04:55.882 ************************************ 00:04:55.882 22:33:40 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:55.882 22:33:40 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:04:55.882 22:33:40 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:55.882 22:33:40 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:55.882 22:33:40 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:55.882 ************************************ 00:04:55.882 START TEST custom_alloc 00:04:55.882 ************************************ 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1123 -- # custom_alloc 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@170 -- # nodes_hp=() 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@83 -- # : 256 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:55.882 22:33:40 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:55.882 22:33:40 
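The custom_alloc trace above is building per-node hugepage counts: when no explicit per-node list is given, `get_test_nr_hugepages_per_node` spreads a total count across the NUMA nodes. A minimal sketch of that distribution logic, assuming a plain even split with the remainder going to the first node(s) (the function and variable names here are illustrative, not SPDK's own helpers):

```shell
#!/usr/bin/env bash
# Illustrative sketch, not setup/hugepages.sh itself: divide a total
# hugepage count across NUMA nodes, first node(s) absorbing any
# remainder, and print the resulting per-node counts.
split_hugepages_per_node() {
  local total=$1 nodes=$2 i
  local -a per_node
  local base=$((total / nodes)) rem=$((total % nodes))
  for ((i = 0; i < nodes; i++)); do
    # Nodes with index < remainder get one extra page.
    per_node[i]=$((base + (i < rem ? 1 : 0)))
  done
  echo "${per_node[*]}"
}

split_hugepages_per_node 1024 2   # even total over two nodes
split_hugepages_per_node 513 2    # odd total: one node gets the extra page
```

An odd total is exactly what the preceding odd_alloc test exercises: with 513 pages over two nodes, one node necessarily ends up with one more page than the other, which is what the `node0=... expecting ...` checks verify.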
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:55.882 22:33:40 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:59.174 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:59.174 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:59.434 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:59.434 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:04:59.434 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:59.434 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:59.434 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:59.434 0000:00:04.3 (8086 2021): Already using 
the vfio-pci driver 00:04:59.434 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:59.434 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:59.434 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:59.434 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:59.434 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:59.434 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:59.434 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:59.434 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:59.434 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:59.434 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:59.434 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:59.434 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:04:59.434 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:04:59.434 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:04:59.434 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:59.434 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:59.434 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:59.434 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:59.434 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:59.434 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:59.434 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:59.434 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:59.434 22:33:44 
setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:59.434 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:59.434 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:59.434 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:59.434 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:59.434 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:59.434 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:59.434 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:59.434 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.434 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.435 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77674608 kB' 'MemAvailable: 80975000 kB' 'Buffers: 12176 kB' 'Cached: 9592432 kB' 'SwapCached: 0 kB' 'Active: 6672432 kB' 'Inactive: 3456260 kB' 'Active(anon): 6278848 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526920 kB' 'Mapped: 167672 kB' 'Shmem: 5754764 kB' 'KReclaimable: 208384 kB' 'Slab: 542492 kB' 'SReclaimable: 208384 kB' 'SUnreclaim: 334108 kB' 'KernelStack: 16192 kB' 'PageTables: 7948 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7693504 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 
'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:04:59.435
[repetitive xtrace elided: setup/common.sh@32 compared each remaining /proc/meminfo field name against AnonHugePages and continued on every non-match]
22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:59.436 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:59.436 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:59.436 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:59.436 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:59.436 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:59.436 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:59.436 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:59.436 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:59.436 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:59.436 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:59.436 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:59.436 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:59.436 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:59.436 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.436 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.436 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77675792 kB' 'MemAvailable: 80976184 kB' 'Buffers: 12176 kB' 'Cached: 9592436 kB' 'SwapCached: 0 kB' 'Active: 6671600 kB' 'Inactive: 3456260 kB' 'Active(anon): 6278016 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526612 kB' 'Mapped: 167508 kB' 'Shmem: 5754768 kB' 'KReclaimable: 208384 kB' 'Slab: 542480 kB' 'SReclaimable: 208384 kB' 'SUnreclaim: 334096 kB' 'KernelStack: 16176 kB' 'PageTables: 7936 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7693524 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200968 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:04:59.436 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.436 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.436 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.436 22:33:44 
[repetitive xtrace elided: setup/common.sh@32 compared each /proc/meminfo field name against HugePages_Surp, continuing on every non-match; the excerpt ends mid-scan]
continue 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.701 22:33:44 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 
00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.701 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77676256 kB' 'MemAvailable: 80976648 kB' 'Buffers: 12176 kB' 'Cached: 9592468 kB' 'SwapCached: 0 kB' 'Active: 6671656 kB' 'Inactive: 3456260 kB' 'Active(anon): 6278072 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526624 kB' 'Mapped: 167508 kB' 'Shmem: 5754800 kB' 'KReclaimable: 208384 kB' 'Slab: 542480 kB' 'SReclaimable: 208384 kB' 'SUnreclaim: 334096 kB' 'KernelStack: 16176 kB' 'PageTables: 7936 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7693544 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.702 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.703 22:33:44 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # 
continue 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.703 22:33:44 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:04:59.703 nr_hugepages=1536 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:59.703 resv_hugepages=0 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:59.703 surplus_hugepages=0 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:59.703 anon_hugepages=0 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:59.703 22:33:44 
setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.703 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77676256 kB' 'MemAvailable: 80976648 kB' 'Buffers: 12176 kB' 'Cached: 9592476 kB' 'SwapCached: 0 kB' 'Active: 6671656 kB' 'Inactive: 3456260 kB' 'Active(anon): 6278072 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526616 kB' 'Mapped: 167508 kB' 'Shmem: 5754808 kB' 'KReclaimable: 208384 kB' 'Slab: 542480 kB' 'SReclaimable: 208384 kB' 'SUnreclaim: 334096 kB' 'KernelStack: 16176 kB' 'PageTables: 7936 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7693568 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 
1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.704 
22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.704 22:33:44 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.704 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': 
' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total 
== \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:59.705 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:59.706 22:33:44 
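An aside on the `get_nodes` trace above: `setup/hugepages.sh@29` globs `/sys/devices/system/node/node+([0-9])` (an extglob pattern) and derives each numeric node id with `${node##*node}`, ending up with `no_nodes=2` here. A minimal self-contained sketch of that idiom, assuming only standard bash; a temp directory stands in for sysfs and `512` is a placeholder per-node hugepage count taken from the trace:

```shell
#!/usr/bin/env bash
# Sketch of the node-enumeration idiom from setup/hugepages.sh@29-32.
# A mktemp directory substitutes for /sys/devices/system/node.
set -eu
shopt -s extglob

sysfs=$(mktemp -d)
mkdir -p "$sysfs/node0" "$sysfs/node1"

declare -A nodes_sys
for node in "$sysfs"/node+([0-9]); do
    # ${node##*node} strips everything through the last "node",
    # leaving the numeric id: .../node1 -> 1
    nodes_sys[${node##*node}]=512   # placeholder count, as in the trace
done

echo "no_nodes=${#nodes_sys[@]}"    # -> no_nodes=2
rm -rf "$sysfs"
```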
setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37571600 kB' 'MemUsed: 10545340 kB' 'SwapCached: 0 kB' 'Active: 5350408 kB' 'Inactive: 3372048 kB' 'Active(anon): 5192504 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8459644 kB' 'Mapped: 74712 kB' 'AnonPages: 266052 kB' 'Shmem: 4929692 kB' 'KernelStack: 8712 kB' 'PageTables: 4628 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124220 kB' 'Slab: 334948 kB' 'SReclaimable: 124220 kB' 'SUnreclaim: 210728 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@31 -- # read -r var val _ 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.706 
22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.706 22:33:44 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # 
continue 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.706 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ [... xtrace repeats the @32 compare / @32 continue pair for each remaining node0 meminfo field: AnonPages, Shmem, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, KReclaimable, Slab, SReclaimable, SUnreclaim, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, Unaccepted, HugePages_Total, HugePages_Free ...] 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:59.707 22:33:44
setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 40104788 kB' 'MemUsed: 4071744 kB' 'SwapCached: 0 kB' 'Active: 1321480 kB' 'Inactive: 84212 kB' 'Active(anon): 1085800 kB' 'Inactive(anon): 0 kB' 'Active(file): 235680 kB' 'Inactive(file): 84212 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1145044 kB' 'Mapped: 92796 kB' 'AnonPages: 260764 kB' 'Shmem: 825152 kB' 'KernelStack: 7480 kB' 'PageTables: 3404 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 84164 kB' 'Slab: 207532 kB' 'SReclaimable: 84164 kB' 'SUnreclaim: 123368 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:04:59.707 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.707 
22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ [... xtrace repeats the @32 compare / @32 continue pair for each node1 meminfo field before the match: MemUsed, SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked, Dirty, Writeback, FilePages, Mapped, AnonPages, Shmem, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, KReclaimable, Slab, SReclaimable, SUnreclaim, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, Unaccepted, HugePages_Total, HugePages_Free ...] 00:04:59.708 22:33:44 setup.sh.hugepages.custom_alloc --
setup/common.sh@31 -- # read -r var val _ 00:04:59.708 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.708 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:04:59.708 22:33:44 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:04:59.708 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:59.709 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:59.709 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:59.709 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:59.709 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:59.709 node0=512 expecting 512 00:04:59.709 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:59.709 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:59.709 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:59.709 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:04:59.709 node1=1024 expecting 1024 00:04:59.709 22:33:44 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:04:59.709 00:04:59.709 real 0m3.977s 00:04:59.709 user 0m1.576s 00:04:59.709 sys 0m2.507s 00:04:59.709 22:33:44 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:59.709 22:33:44 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:59.709 ************************************ 00:04:59.709 END TEST custom_alloc 00:04:59.709 
************************************ 00:04:59.709 22:33:44 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:59.709 22:33:44 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:04:59.709 22:33:44 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:59.709 22:33:44 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:59.709 22:33:44 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:59.968 ************************************ 00:04:59.968 START TEST no_shrink_alloc 00:04:59.968 ************************************ 00:04:59.968 22:33:44 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1123 -- # no_shrink_alloc 00:04:59.968 22:33:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:04:59.968 22:33:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:59.968 22:33:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:59.969 22:33:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift 00:04:59.969 22:33:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:59.969 22:33:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:04:59.969 22:33:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:59.969 22:33:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:59.969 22:33:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:59.969 22:33:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:59.969 22:33:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:59.969 22:33:44 setup.sh.hugepages.no_shrink_alloc 
-- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:59.969 22:33:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:59.969 22:33:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:59.969 22:33:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:59.969 22:33:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:59.969 22:33:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:59.969 22:33:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:59.969 22:33:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 00:04:59.969 22:33:44 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 00:04:59.969 22:33:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:59.969 22:33:44 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:03.258 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:03.258 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:03.517 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:03.517 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:05:03.517 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:03.517 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:03.517 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:03.517 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:03.517 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:03.517 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:03.517 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 
00:05:03.517 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:03.517 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:03.517 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:03.517 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:03.517 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:03.517 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:03.517 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:03.517 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:03.517 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:05:03.517 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:05:03.517 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:03.517 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:03.517 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:03.517 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:03.517 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:03.517 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:03.517 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:03.517 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:03.517 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:03.517 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:03.517 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78714448 kB' 'MemAvailable: 82014840 kB' 'Buffers: 12176 kB' 'Cached: 9592592 kB' 'SwapCached: 0 kB' 'Active: 6672556 kB' 'Inactive: 3456260 kB' 'Active(anon): 6278972 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527384 kB' 'Mapped: 167580 kB' 'Shmem: 5754924 kB' 'KReclaimable: 208384 kB' 'Slab: 542612 kB' 'SReclaimable: 208384 kB' 'SUnreclaim: 334228 kB' 'KernelStack: 16160 kB' 'PageTables: 7880 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7694212 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200968 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1013156 kB' 
'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:05:03.518 [... xtrace repeats the @32 compare / @32 continue pair for each meminfo field preceding AnonHugePages: MemTotal, MemFree, MemAvailable, Buffers, Cached, SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked ...] 00:05:03.518 22:33:48
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.518 22:33:48 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.518 22:33:48 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.518 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.519 22:33:48 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 
22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.519 22:33:48 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78715020 kB' 'MemAvailable: 82015412 kB' 'Buffers: 12176 kB' 'Cached: 9592600 kB' 'SwapCached: 0 kB' 'Active: 6673184 kB' 'Inactive: 3456260 kB' 'Active(anon): 6279600 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528040 kB' 'Mapped: 167580 kB' 'Shmem: 5754932 kB' 'KReclaimable: 208384 kB' 'Slab: 542692 kB' 'SReclaimable: 208384 kB' 'SUnreclaim: 334308 kB' 'KernelStack: 16176 kB' 'PageTables: 7952 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7694232 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:05:03.519 
22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 
22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.519 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.520 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.782 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.783 22:33:48 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:03.783 22:33:48 
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.783 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78715616 kB' 'MemAvailable: 82016008 kB' 'Buffers: 12176 kB' 'Cached: 9592612 kB' 'SwapCached: 0 kB' 'Active: 6673192 kB' 'Inactive: 3456260 kB' 'Active(anon): 6279608 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528040 kB' 'Mapped: 167580 kB' 'Shmem: 5754944 kB' 'KReclaimable: 208384 kB' 'Slab: 542692 kB' 'SReclaimable: 208384 kB' 'SUnreclaim: 334308 kB' 'KernelStack: 16176 kB' 'PageTables: 7952 kB' 'SecPageTables: 0 kB' 
'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7694252 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.784 22:33:48 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.784 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.785 22:33:48 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.785 
22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages 
== \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.785 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@33 -- # return 0 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:03.786 nr_hugepages=1024 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:03.786 resv_hugepages=0 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:03.786 surplus_hugepages=0 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:03.786 anon_hugepages=0 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node 
+([0-9]) }") 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78717524 kB' 'MemAvailable: 82017916 kB' 'Buffers: 12176 kB' 'Cached: 9592636 kB' 'SwapCached: 0 kB' 'Active: 6672364 kB' 'Inactive: 3456260 kB' 'Active(anon): 6278780 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 527120 kB' 'Mapped: 167520 kB' 'Shmem: 5754968 kB' 'KReclaimable: 208384 kB' 'Slab: 542692 kB' 'SReclaimable: 208384 kB' 'SUnreclaim: 334308 kB' 'KernelStack: 16176 kB' 'PageTables: 7944 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7694276 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:03.786 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [... identical IFS=': ' read / compare / continue iterations elided for the remaining meminfo keys, MemFree through Unaccepted; none match HugePages_Total ...] 00:05:03.788 22:33:48
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.788 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:03.788 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:03.788 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:03.788 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:03.788 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:05:03.788 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:03.788 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:03.788 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:03.788 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:05:03.788 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:03.788 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:03.789 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:03.789 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:03.789 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:03.789 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:03.789 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:03.789 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var 
val 00:05:03.789 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:03.789 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.789 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:03.789 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:03.789 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.789 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.789 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.789 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.789 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 36523324 kB' 'MemUsed: 11593616 kB' 'SwapCached: 0 kB' 'Active: 5351992 kB' 'Inactive: 3372048 kB' 'Active(anon): 5194088 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8459676 kB' 'Mapped: 74724 kB' 'AnonPages: 267676 kB' 'Shmem: 4929724 kB' 'KernelStack: 8728 kB' 'PageTables: 4640 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124220 kB' 'Slab: 335096 kB' 'SReclaimable: 124220 kB' 'SUnreclaim: 210876 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:03.789 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.789 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # continue 00:05:03.789 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [... identical compare / continue iterations elided for the remaining node0 meminfo keys; none match HugePages_Surp ...] 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31
-- # read -r var val _ 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.790 22:33:48 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:05:03.790 node0=1024 expecting 1024 00:05:03.790 22:33:48 
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:03.790 22:33:48 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:07.982 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:07.982 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:07.982 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:07.982 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:05:07.982 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:07.982 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:07.982 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:07.982 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:07.982 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:07.982 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:07.982 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:07.982 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:07.982 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:07.982 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:07.982 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:07.982 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:07.982 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:07.982 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 
00:05:07.982 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:07.982 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78731476 kB' 'MemAvailable: 82031868 kB' 'Buffers: 12176 kB' 'Cached: 9592720 kB' 'SwapCached: 0 kB' 'Active: 6674644 kB' 'Inactive: 3456260 kB' 'Active(anon): 6281060 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528812 kB' 'Mapped: 167548 kB' 'Shmem: 5755052 kB' 'KReclaimable: 208384 kB' 'Slab: 542544 kB' 'SReclaimable: 208384 kB' 'SUnreclaim: 334160 kB' 'KernelStack: 16192 kB' 'PageTables: 7972 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7694548 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200984 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- 
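[Editor's note] The trace above is the test's meminfo parser walking /proc/meminfo key by key with `IFS=': ' read -r var val _` until it hits the requested field. As a hedged sketch of that pattern (simplified from the traced setup/common.sh loop, not SPDK's actual function; `get_meminfo_value` and the sample input are hypothetical stand-ins):

```shell
# Sketch of the traced parsing loop: split each "Key: value [unit]" line
# on ': ' and return the value for the requested key, defaulting to 0,
# mirroring the log's `IFS=': ' read -r var val _` / `echo 0` sequence.
get_meminfo_value() {
  local get=$1 var val _
  while IFS=': ' read -r var val _; do
    if [[ $var == "$get" ]]; then
      echo "$val"    # matched key: emit its value and stop
      return 0
    fi
  done
  echo 0             # key absent: same fallback the trace shows
}

# Hypothetical stand-in for /proc/meminfo:
sample='HugePages_Total: 1024
HugePages_Free: 1024
HugePages_Surp: 0'

printf '%s\n' "$sample" | get_meminfo_value HugePages_Free   # prints 1024
```

Because the value comes before any unit field, the trailing `_` in `read` absorbs suffixes such as `kB`, which is why the same loop handles both `MemTotal: 92293472 kB` and unit-less entries like `HugePages_Surp: 0`.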
setup/common.sh@31 -- # read -r var val _ 00:05:07.982 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # [... trimmed: remaining /proc/meminfo keys (MemFree through HardwareCorrupted) each compared against AnonHugePages and skipped via continue ...] 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78731224 kB' 'MemAvailable: 82031616 kB' 'Buffers: 12176 kB' 'Cached: 9592720 kB' 'SwapCached: 0 kB' 'Active: 6673844 kB' 'Inactive: 3456260 kB' 'Active(anon): 6280260 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528512 kB' 'Mapped: 167532 kB' 'Shmem: 5755052 kB' 'KReclaimable: 208384 kB' 'Slab: 542636 kB' 'SReclaimable: 208384 kB' 'SUnreclaim: 334252 kB' 'KernelStack: 16192 kB' 'PageTables: 7984 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7694564 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200968 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # [... trimmed: remaining keys (MemAvailable through SwapTotal) each compared against HugePages_Surp and skipped via continue ...] 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.984 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:07.985 22:33:52 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:07.985 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.986 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78732452 kB' 'MemAvailable: 82032844 kB' 'Buffers: 12176 kB' 'Cached: 9592740 kB' 'SwapCached: 0 kB' 'Active: 6673860 kB' 'Inactive: 3456260 kB' 'Active(anon): 6280276 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528500 kB' 'Mapped: 167532 kB' 'Shmem: 5755072 kB' 'KReclaimable: 208384 kB' 'Slab: 542636 kB' 'SReclaimable: 208384 kB' 'SUnreclaim: 334252 kB' 'KernelStack: 16192 kB' 'PageTables: 7984 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7694588 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200968 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:05:07.986 
22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.986 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.986 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
[trace condensed: the same IFS=': ' / read -r var val _ / [[ $var == HugePages_Rsvd ]] / continue cycle repeats once per /proc/meminfo key (MemFree, MemAvailable, ... through VmallocChunk); no match within this span]
00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.987 22:33:52 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.987 22:33:52 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:07.987 nr_hugepages=1024 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:07.987 resv_hugepages=0 00:05:07.987 22:33:52 
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:07.987 surplus_hugepages=0 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:07.987 anon_hugepages=0 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:07.987 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:07.988 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.988 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78733112 kB' 'MemAvailable: 82033504 kB' 'Buffers: 12176 kB' 'Cached: 9592760 kB' 'SwapCached: 0 
kB' 'Active: 6673764 kB' 'Inactive: 3456260 kB' 'Active(anon): 6280180 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 528360 kB' 'Mapped: 167532 kB' 'Shmem: 5755092 kB' 'KReclaimable: 208384 kB' 'Slab: 542636 kB' 'SReclaimable: 208384 kB' 'SUnreclaim: 334252 kB' 'KernelStack: 16176 kB' 'PageTables: 7932 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7694608 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200968 kB' 'VmallocChunk: 0 kB' 'Percpu: 56960 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 1013156 kB' 'DirectMap2M: 17537024 kB' 'DirectMap1G: 82837504 kB' 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.020 22:33:52 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.020 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.020 
22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
[[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.021 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.022 22:33:52 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.022 22:33:52 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:08.022 
22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # 
mem_f=/sys/devices/system/node/node0/meminfo 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 36530296 kB' 'MemUsed: 11586644 kB' 'SwapCached: 0 kB' 'Active: 5352140 kB' 'Inactive: 3372048 kB' 'Active(anon): 5194236 kB' 'Inactive(anon): 0 kB' 'Active(file): 157904 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8459700 kB' 'Mapped: 74736 kB' 'AnonPages: 267632 kB' 'Shmem: 4929748 kB' 'KernelStack: 8680 kB' 'PageTables: 4476 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124220 kB' 'Slab: 334896 kB' 'SReclaimable: 124220 kB' 'SUnreclaim: 210676 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.022 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
[[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.023 22:33:52 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:08.024 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.024 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.024 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.024 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.024 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.024 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.024 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.024 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.024 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.024 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.024 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.024 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.024 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.024 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.024 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.024 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.024 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.025 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.025 22:33:52 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.025 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.025 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.025 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.025 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.025 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.025 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.025 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:08.025 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:08.025 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:08.025 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:08.025 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:08.025 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:08.025 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:05:08.025 node0=1024 expecting 1024 00:05:08.025 22:33:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:05:08.025 00:05:08.025 real 0m7.853s 00:05:08.025 user 0m3.066s 00:05:08.025 sys 0m4.994s 00:05:08.025 22:33:52 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:08.025 22:33:52 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:08.025 
************************************ 00:05:08.025 END TEST no_shrink_alloc 00:05:08.025 ************************************ 00:05:08.025 22:33:52 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:05:08.025 22:33:52 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:05:08.025 22:33:52 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:05:08.025 22:33:52 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:08.025 22:33:52 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:08.025 22:33:52 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:08.025 22:33:52 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:08.025 22:33:52 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:08.025 22:33:52 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:08.025 22:33:52 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:08.025 22:33:52 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:08.025 22:33:52 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:08.025 22:33:52 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:08.025 22:33:52 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:05:08.025 22:33:52 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:05:08.025 00:05:08.025 real 0m30.860s 00:05:08.025 user 0m11.034s 00:05:08.025 sys 0m18.227s 00:05:08.025 22:33:52 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:08.025 22:33:52 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:08.025 ************************************ 00:05:08.025 END TEST hugepages 
00:05:08.025 ************************************ 00:05:08.025 22:33:52 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:08.025 22:33:52 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:05:08.025 22:33:52 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:08.025 22:33:52 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:08.025 22:33:52 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:08.025 ************************************ 00:05:08.025 START TEST driver 00:05:08.025 ************************************ 00:05:08.025 22:33:52 setup.sh.driver -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:05:08.025 * Looking for test storage... 00:05:08.025 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:08.025 22:33:52 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:05:08.025 22:33:52 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:08.025 22:33:52 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:13.295 22:33:57 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:05:13.295 22:33:57 setup.sh.driver -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:13.295 22:33:57 setup.sh.driver -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:13.295 22:33:57 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:13.295 ************************************ 00:05:13.295 START TEST guess_driver 00:05:13.295 ************************************ 00:05:13.295 22:33:57 setup.sh.driver.guess_driver -- common/autotest_common.sh@1123 -- # guess_driver 00:05:13.295 22:33:57 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:05:13.295 22:33:57 
setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:05:13.295 22:33:57 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:05:13.295 22:33:57 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:05:13.295 22:33:57 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_groups 00:05:13.295 22:33:57 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:05:13.295 22:33:57 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:05:13.295 22:33:57 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:05:13.295 22:33:57 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:05:13.295 22:33:57 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 216 > 0 )) 00:05:13.295 22:33:57 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:05:13.295 22:33:57 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:05:13.295 22:33:57 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:05:13.295 22:33:57 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:05:13.295 22:33:57 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:05:13.295 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:13.295 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:13.295 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:13.295 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:13.295 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:05:13.295 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:05:13.296 
insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:05:13.296 22:33:57 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:05:13.296 22:33:57 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:05:13.296 22:33:57 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:05:13.296 22:33:57 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:05:13.296 22:33:57 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:05:13.296 Looking for driver=vfio-pci 00:05:13.296 22:33:57 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:13.296 22:33:57 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config 00:05:13.296 22:33:57 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:05:13.296 22:33:57 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ not == \-\> ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read 
-r _ _ _ _ marker setup_driver 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> 
== \-\> ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 
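The guess_driver trace above boils down to two checks before `vfio-pci` is selected: the kernel exposes at least one IOMMU group under `/sys/kernel/iommu_groups`, and `modprobe --show-depends vfio_pci` resolves to loadable `.ko` files (matched in the log with `== *\.\k\o*`) rather than reporting the module as built-in or missing. A hedged sketch of that decision (simplified; the real setup/driver.sh also inspects `enable_unsafe_noiommu_mode`, and "No valid driver found" is the failure value the log tests against):

```shell
#!/usr/bin/env bash
shopt -s nullglob  # an empty iommu_groups dir should yield a zero-length array

# True when a `modprobe --show-depends` listing names real .ko files,
# mirroring the `== *\.\k\o*` test in the trace above.
is_driver_loadable() {
    [[ $1 == *.ko* ]]
}

# Simplified sketch of pick_driver: vfio-pci is chosen only if at least
# one IOMMU group exists and its module dependencies resolve.
pick_driver() {
    local iommu_groups=(/sys/kernel/iommu_groups/*)
    if (( ${#iommu_groups[@]} > 0 )) &&
        is_driver_loadable "$(modprobe --show-depends vfio_pci 2>/dev/null)"; then
        echo vfio-pci
    else
        echo 'No valid driver found'
    fi
}
```

On the node in this run the group count was 216, so the first condition passes, and the insmod listing printed above satisfies the second.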
00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:17.526 22:34:01 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:20.063 22:34:04 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:20.063 22:34:04 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:20.063 22:34:04 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:20.063 22:34:04 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:05:20.063 22:34:04 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:05:20.063 22:34:04 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:20.063 22:34:04 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:25.336 00:05:25.336 real 0m11.687s 00:05:25.336 user 0m3.106s 00:05:25.336 sys 0m5.499s 00:05:25.336 22:34:09 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:25.336 22:34:09 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:05:25.336 ************************************ 00:05:25.336 END TEST guess_driver 00:05:25.336 ************************************ 00:05:25.336 22:34:09 setup.sh.driver -- common/autotest_common.sh@1142 -- # 
return 0 00:05:25.336 00:05:25.336 real 0m17.063s 00:05:25.336 user 0m4.699s 00:05:25.336 sys 0m8.486s 00:05:25.336 22:34:09 setup.sh.driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:25.336 22:34:09 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:25.336 ************************************ 00:05:25.336 END TEST driver 00:05:25.336 ************************************ 00:05:25.336 22:34:09 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:25.336 22:34:09 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:05:25.336 22:34:09 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:25.336 22:34:09 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:25.336 22:34:09 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:25.336 ************************************ 00:05:25.336 START TEST devices 00:05:25.336 ************************************ 00:05:25.336 22:34:09 setup.sh.devices -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:05:25.336 * Looking for test storage... 
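The guess_driver loop traced above (setup/driver.sh@57-61) repeatedly reads a status line, checks for a "->" marker field, and compares the bound driver against vfio-pci. A minimal sketch of that logic, assuming the input format "<bdf> (<vendor> <device>): <old> -> <driver>" implied by the discarded fields in the trace:

```shell
# Sketch of the setup/driver.sh@57-61 check loop seen in the log above.
# The exact input line format is an assumption based on the six fields
# that `read -r _ _ _ _ marker setup_driver` consumes.
guess_driver_check() {
    local expected=$1 fail=0 marker setup_driver
    while read -r _ _ _ _ marker setup_driver; do
        [[ $marker == "->" ]] || continue        # only rebind result lines
        [[ $setup_driver == "$expected" ]] || fail=1
    done
    (( fail == 0 ))                              # mirrors setup/driver.sh@64
}

printf '%s\n' \
    '0000:5e:00.0 (8086 0b60): nvme -> vfio-pci' \
    '0000:00:04.0 (8086 2021): ioatdma -> vfio-pci' \
  | guess_driver_check vfio-pci && echo "driver check passed"
```

When every rebind line names the expected driver, the final `(( fail == 0 ))` succeeds and the test passes, which is the state the log reaches before `setup reset`.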
00:05:25.336 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:25.336 22:34:09 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:25.336 22:34:09 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:05:25.336 22:34:09 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:25.336 22:34:09 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:29.529 22:34:14 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:05:29.529 22:34:14 setup.sh.devices -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:05:29.529 22:34:14 setup.sh.devices -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:05:29.529 22:34:14 setup.sh.devices -- common/autotest_common.sh@1670 -- # local nvme bdf 00:05:29.529 22:34:14 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:05:29.529 22:34:14 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:05:29.529 22:34:14 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:05:29.529 22:34:14 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:29.529 22:34:14 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:05:29.529 22:34:14 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:05:29.529 22:34:14 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:05:29.529 22:34:14 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:29.529 22:34:14 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:29.529 22:34:14 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:29.529 22:34:14 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:29.529 22:34:14 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 
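The device scan declared above sets min_disk_size=3221225472 (3 GiB, setup/devices.sh@198) and only admits block devices at least that large. A minimal sketch of the size gate, using the 7681501126656-byte value the trace echoes for nvme0n1:

```shell
# Sketch of the setup/devices.sh@198/@204 size gate from the log above.
min_disk_size=3221225472    # 3 GiB threshold set at setup/devices.sh@198
disk_size=7681501126656     # byte size echoed for nvme0n1 in the trace

# A device smaller than the threshold is skipped; nvme0n1 clears it easily.
if (( disk_size >= min_disk_size )); then
    echo "nvme0n1 is large enough"   # prints: nvme0n1 is large enough
fi
```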
00:05:29.529 22:34:14 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:29.529 22:34:14 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5e:00.0 00:05:29.529 22:34:14 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:05:29.529 22:34:14 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:29.529 22:34:14 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:05:29.529 22:34:14 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:29.529 No valid GPT data, bailing 00:05:29.529 22:34:14 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:29.529 22:34:14 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:05:29.529 22:34:14 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:05:29.529 22:34:14 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:29.529 22:34:14 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:29.529 22:34:14 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:29.529 22:34:14 setup.sh.devices -- setup/common.sh@80 -- # echo 7681501126656 00:05:29.529 22:34:14 setup.sh.devices -- setup/devices.sh@204 -- # (( 7681501126656 >= min_disk_size )) 00:05:29.529 22:34:14 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:29.529 22:34:14 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:5e:00.0 00:05:29.529 22:34:14 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:29.529 22:34:14 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:29.529 22:34:14 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:29.529 22:34:14 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:29.529 22:34:14 setup.sh.devices -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:05:29.529 22:34:14 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:29.529 ************************************ 00:05:29.529 START TEST nvme_mount 00:05:29.529 ************************************ 00:05:29.529 22:34:14 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1123 -- # nvme_mount 00:05:29.529 22:34:14 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:29.529 22:34:14 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:29.529 22:34:14 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:29.529 22:34:14 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:29.529 22:34:14 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:29.529 22:34:14 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:29.529 22:34:14 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:05:29.529 22:34:14 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:29.529 22:34:14 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:29.529 22:34:14 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:05:29.529 22:34:14 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:05:29.529 22:34:14 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:29.529 22:34:14 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:29.529 22:34:14 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:29.529 22:34:14 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:29.529 22:34:14 
setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:29.529 22:34:14 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:29.529 22:34:14 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:29.529 22:34:14 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:30.467 Creating new GPT entries in memory. 00:05:30.467 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:30.467 other utilities. 00:05:30.467 22:34:15 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:30.467 22:34:15 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:30.467 22:34:15 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:30.467 22:34:15 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:30.467 22:34:15 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:31.404 Creating new GPT entries in memory. 00:05:31.404 The operation has completed successfully. 
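The sgdisk call above (`--new=1:2048:2099199`) follows from the partition arithmetic in setup/common.sh@51/@58-60: a 1 GiB partition size converted to 512-byte sectors, with the first partition starting at sector 2048. The same arithmetic, reproduced:

```shell
# Reproduces the partition-bound arithmetic from setup/common.sh in the log.
size=1073741824            # 1 GiB in bytes (setup/common.sh@41)
(( size /= 512 ))          # convert to 512-byte sectors: 2097152
part_start=2048            # first partition starts at sector 2048
(( part_end = part_start + size - 1 ))

echo "--new=1:${part_start}:${part_end}"   # prints: --new=1:2048:2099199
```

The computed end sector 2099199 matches the `sgdisk /dev/nvme0n1 --new=1:2048:2099199` invocation logged above.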
00:05:31.404 22:34:16 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:31.404 22:34:16 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:31.404 22:34:16 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 2640232 00:05:31.404 22:34:16 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:31.404 22:34:16 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:31.404 22:34:16 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:31.404 22:34:16 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:31.404 22:34:16 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:31.404 22:34:16 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:31.662 22:34:16 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:31.662 22:34:16 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:31.662 22:34:16 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:31.662 22:34:16 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:31.662 22:34:16 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:31.662 
22:34:16 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:31.662 22:34:16 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:31.662 22:34:16 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:31.662 22:34:16 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:31.662 22:34:16 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:31.662 22:34:16 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:31.662 22:34:16 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:31.662 22:34:16 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:31.662 22:34:16 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.952 22:34:19 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r 
pci _ _ status 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount 
-- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:34.952 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:34.952 22:34:19 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:35.213 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:35.213 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:05:35.213 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:35.213 
/dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:35.213 22:34:20 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:05:35.213 22:34:20 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:05:35.213 22:34:20 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:35.213 22:34:20 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:05:35.213 22:34:20 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:05:35.213 22:34:20 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:35.473 22:34:20 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:35.473 22:34:20 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:35.473 22:34:20 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:05:35.473 22:34:20 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:35.473 22:34:20 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:35.473 22:34:20 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:35.473 22:34:20 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:35.473 22:34:20 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:35.473 22:34:20 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:35.473 22:34:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:35.473 22:34:20 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:35.473 22:34:20 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:35.473 22:34:20 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:35.473 22:34:20 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r 
pci _ _ status 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount 
-- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.783 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.784 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.784 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:38.784 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.784 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:38.784 22:34:23 
setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:38.784 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:39.042 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:39.042 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:39.042 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:39.042 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:5e:00.0 data@nvme0n1 '' '' 00:05:39.042 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:39.042 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:39.042 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:39.042 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:05:39.042 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:39.042 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:39.042 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:39.042 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.042 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:39.042 22:34:23 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:39.042 22:34:23 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:39.042 22:34:23 
setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
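The verify loop traced at setup/devices.sh@60-63 scans the `setup output config` lines, matching each PCI address against the allowed BDF and checking that its status names the expected active device. A sketch under the same assumptions, with input lines modeled on the fields the trace shows:

```shell
# Sketch of the setup/devices.sh@60-63 verify loop from the log above.
# The sample input lines are assumptions modeled on the traced fields.
target=0000:5e:00.0
expected='data@nvme0n1'
found=0
while read -r pci _ _ status; do
    [[ $pci == "$target" ]] || continue                       # devices.sh@62
    [[ $status == *"Active devices: "*"$expected"* ]] && found=1   # @62-63
done <<'EOF'
0000:5e:00.0 (8086 0b60): Active devices: data@nvme0n1, so not binding PCI dev
0000:00:04.7 (8086 2021): no driver -> ignored
EOF

(( found == 1 )) && echo "device verified"   # mirrors devices.sh@66
```

Only the target BDF's line is inspected; every other address falls through the first test, exactly as the long run of non-matching `[[ ... == \0\0\0\0\:\5\e\:\0\0\.\0 ]]` checks in the log shows.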
00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all 
/dev/nvme0n1 00:05:43.230 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:43.230 00:05:43.230 real 0m13.332s 00:05:43.230 user 0m3.830s 00:05:43.230 sys 0m7.483s 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:43.230 22:34:27 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:05:43.230 ************************************ 00:05:43.230 END TEST nvme_mount 00:05:43.230 ************************************ 00:05:43.230 22:34:27 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:05:43.230 22:34:27 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:05:43.230 22:34:27 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:43.230 22:34:27 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:43.230 22:34:27 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:43.230 ************************************ 00:05:43.230 START TEST dm_mount 00:05:43.230 ************************************ 00:05:43.230 22:34:27 setup.sh.devices.dm_mount -- common/autotest_common.sh@1123 -- # dm_mount 00:05:43.230 22:34:27 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:05:43.230 22:34:27 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:05:43.230 22:34:27 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:05:43.230 22:34:27 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:05:43.230 22:34:27 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:43.230 22:34:27 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:05:43.230 22:34:27 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:43.230 22:34:27 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:43.230 22:34:27 
setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:05:43.230 22:34:27 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:05:43.230 22:34:27 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:43.230 22:34:27 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:43.230 22:34:27 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:43.230 22:34:27 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:43.230 22:34:27 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:43.230 22:34:27 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:43.230 22:34:27 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:43.230 22:34:27 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:43.230 22:34:27 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:43.230 22:34:27 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:43.230 22:34:27 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:05:43.796 Creating new GPT entries in memory. 00:05:43.796 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:43.796 other utilities. 00:05:43.796 22:34:28 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:43.796 22:34:28 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:43.796 22:34:28 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 
2048 : part_end + 1 )) 00:05:43.796 22:34:28 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:43.796 22:34:28 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:44.805 Creating new GPT entries in memory. 00:05:44.805 The operation has completed successfully. 00:05:44.805 22:34:29 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:44.805 22:34:29 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:44.805 22:34:29 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:44.805 22:34:29 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:44.805 22:34:29 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:45.744 The operation has completed successfully. 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 2644377 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e 
/dev/mapper/nvme_dm_test ]] 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:5e:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- 
setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:46.004 22:34:30 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:5e:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local 
found=0 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:50.199 22:34:34 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:53.490 22:34:38 
setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:53.490 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:53.490 00:05:53.490 real 0m10.664s 00:05:53.490 user 0m2.780s 00:05:53.490 sys 0m5.020s 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:53.490 22:34:38 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:05:53.490 ************************************ 00:05:53.490 END TEST dm_mount 00:05:53.490 ************************************ 00:05:53.490 22:34:38 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:05:53.490 22:34:38 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:05:53.490 22:34:38 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:05:53.490 22:34:38 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:53.490 22:34:38 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:53.490 22:34:38 setup.sh.devices -- setup/devices.sh@25 -- # wipefs 
--all /dev/nvme0n1p1 00:05:53.490 22:34:38 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:53.490 22:34:38 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:53.749 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:53.749 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:05:53.749 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:53.749 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:53.749 22:34:38 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:05:53.749 22:34:38 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:53.749 22:34:38 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:53.749 22:34:38 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:53.749 22:34:38 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:53.749 22:34:38 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:53.749 22:34:38 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:53.749 00:05:53.749 real 0m28.829s 00:05:53.749 user 0m8.320s 00:05:53.749 sys 0m15.563s 00:05:53.749 22:34:38 setup.sh.devices -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:53.749 22:34:38 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:53.749 ************************************ 00:05:53.749 END TEST devices 00:05:53.749 ************************************ 00:05:53.749 22:34:38 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:53.749 00:05:53.749 real 1m45.330s 00:05:53.749 user 0m32.964s 00:05:53.749 sys 0m59.093s 00:05:53.749 22:34:38 setup.sh -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:53.749 22:34:38 setup.sh -- common/autotest_common.sh@10 -- # set +x 
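The wipefs lines in the cleanup above report the raw signature bytes erased: `45 46 49 20 50 41 52 54` is ASCII for "EFI PART" (the GPT header signature, found at both ends of the disk) and `55 aa` is the protective-MBR boot signature. A quick sketch that decodes those space-separated hex bytes back to ASCII — the `hex_to_ascii` helper name is invented for illustration:

```shell
#!/usr/bin/env bash
# Decode the space-separated hex bytes wipefs prints back into ASCII.
hex_to_ascii() {
  local out='' c b
  for b in "$@"; do
    printf -v c "\\x$b"   # printf -v keeps a literal space (\x20) that $(...) would strip
    out+=$c
  done
  printf '%s\n' "$out"
}

hex_to_ascii 45 46 49 20 50 41 52 54   # -> EFI PART
```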
00:05:53.749 ************************************ 00:05:53.749 END TEST setup.sh 00:05:53.749 ************************************ 00:05:54.009 22:34:38 -- common/autotest_common.sh@1142 -- # return 0 00:05:54.009 22:34:38 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:05:58.205 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:58.205 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:58.205 Hugepages 00:05:58.205 node hugesize free / total 00:05:58.205 node0 1048576kB 0 / 0 00:05:58.205 node0 2048kB 1024 / 1024 00:05:58.205 node1 1048576kB 0 / 0 00:05:58.205 node1 2048kB 1024 / 1024 00:05:58.205 00:05:58.205 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:58.205 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:58.205 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:58.205 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:58.205 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:58.205 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:58.205 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:58.205 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:58.205 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:58.205 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1 00:05:58.205 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:58.205 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:58.205 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:58.205 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:58.205 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:58.205 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:58.205 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:58.205 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:58.205 VMD 0000:85:05.5 8086 201d 1 vfio-pci - - 00:05:58.205 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - - 00:05:58.205 22:34:42 -- spdk/autotest.sh@130 -- # uname -s 00:05:58.205 22:34:42 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 
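The `setup.sh status` dump above lists per-NUMA-node hugepage pools in the form "node hugesize free / total". A sketch that totals the free 2048kB pages from that layout; the here-doc copies the figures from this log rather than reading live sysfs state:

```shell
#!/usr/bin/env bash
# Sum the "free" column (field 3) of every 2048kB hugepage pool line.
free_2m=$(awk '$2 == "2048kB" { sum += $3 } END { print sum + 0 }' <<'EOF'
node0 1048576kB 0 / 0
node0 2048kB 1024 / 1024
node1 1048576kB 0 / 0
node1 2048kB 1024 / 1024
EOF
)
echo "free 2MB hugepages: $free_2m"
```

With the two nodes above at 1024 free pages each, the total is 2048 pages, i.e. 4 GiB of 2 MB hugepages available to DPDK/SPDK.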
00:05:58.205 22:34:42 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:05:58.205 22:34:42 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:06:01.496 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:06:01.496 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:06:01.496 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:01.496 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:01.496 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:01.496 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:01.496 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:01.496 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:01.496 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:01.496 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:01.496 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:01.756 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:01.756 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:01.756 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:01.756 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:01.756 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:01.756 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:01.756 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:04.293 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:06:04.293 22:34:49 -- common/autotest_common.sh@1532 -- # sleep 1 00:06:05.229 22:34:50 -- common/autotest_common.sh@1533 -- # bdfs=() 00:06:05.229 22:34:50 -- common/autotest_common.sh@1533 -- # local bdfs 00:06:05.229 22:34:50 -- common/autotest_common.sh@1534 -- # bdfs=($(get_nvme_bdfs)) 00:06:05.229 22:34:50 -- common/autotest_common.sh@1534 -- # get_nvme_bdfs 00:06:05.229 22:34:50 -- common/autotest_common.sh@1513 -- # bdfs=() 00:06:05.229 22:34:50 -- common/autotest_common.sh@1513 -- # local bdfs 00:06:05.229 22:34:50 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r 
'.config[].params.traddr')) 00:06:05.229 22:34:50 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:05.229 22:34:50 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:06:05.487 22:34:50 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:06:05.487 22:34:50 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:06:05.487 22:34:50 -- common/autotest_common.sh@1536 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:06:08.772 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:06:08.772 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:06:09.031 Waiting for block devices as requested 00:06:09.031 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:06:09.031 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:09.290 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:09.290 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:09.290 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:09.549 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:09.549 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:09.549 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:09.808 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:09.808 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:09.808 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:10.068 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:10.068 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:10.068 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:10.328 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:10.328 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:10.328 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:10.608 22:34:55 -- common/autotest_common.sh@1538 -- # for bdf in "${bdfs[@]}" 00:06:10.608 22:34:55 -- common/autotest_common.sh@1539 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0 00:06:10.608 22:34:55 -- 
common/autotest_common.sh@1502 -- # readlink -f /sys/class/nvme/nvme0 00:06:10.608 22:34:55 -- common/autotest_common.sh@1502 -- # grep 0000:5e:00.0/nvme/nvme 00:06:10.608 22:34:55 -- common/autotest_common.sh@1502 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:06:10.608 22:34:55 -- common/autotest_common.sh@1503 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 ]] 00:06:10.608 22:34:55 -- common/autotest_common.sh@1507 -- # basename /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:06:10.608 22:34:55 -- common/autotest_common.sh@1507 -- # printf '%s\n' nvme0 00:06:10.608 22:34:55 -- common/autotest_common.sh@1539 -- # nvme_ctrlr=/dev/nvme0 00:06:10.608 22:34:55 -- common/autotest_common.sh@1540 -- # [[ -z /dev/nvme0 ]] 00:06:10.608 22:34:55 -- common/autotest_common.sh@1545 -- # nvme id-ctrl /dev/nvme0 00:06:10.608 22:34:55 -- common/autotest_common.sh@1545 -- # grep oacs 00:06:10.608 22:34:55 -- common/autotest_common.sh@1545 -- # cut -d: -f2 00:06:10.608 22:34:55 -- common/autotest_common.sh@1545 -- # oacs=' 0x3f' 00:06:10.608 22:34:55 -- common/autotest_common.sh@1546 -- # oacs_ns_manage=8 00:06:10.608 22:34:55 -- common/autotest_common.sh@1548 -- # [[ 8 -ne 0 ]] 00:06:10.608 22:34:55 -- common/autotest_common.sh@1554 -- # nvme id-ctrl /dev/nvme0 00:06:10.608 22:34:55 -- common/autotest_common.sh@1554 -- # grep unvmcap 00:06:10.608 22:34:55 -- common/autotest_common.sh@1554 -- # cut -d: -f2 00:06:10.608 22:34:55 -- common/autotest_common.sh@1554 -- # unvmcap=' 0' 00:06:10.608 22:34:55 -- common/autotest_common.sh@1555 -- # [[ 0 -eq 0 ]] 00:06:10.608 22:34:55 -- common/autotest_common.sh@1557 -- # continue 00:06:10.608 22:34:55 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:06:10.608 22:34:55 -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:10.608 22:34:55 -- common/autotest_common.sh@10 -- # set +x 00:06:10.608 22:34:55 -- spdk/autotest.sh@138 -- # timing_enter afterboot 
00:06:10.608 22:34:55 -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:10.608 22:34:55 -- common/autotest_common.sh@10 -- # set +x 00:06:10.608 22:34:55 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:06:14.833 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:06:14.833 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:06:14.833 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:14.833 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:14.833 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:14.833 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:14.833 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:14.833 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:14.833 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:14.833 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:14.833 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:14.833 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:14.833 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:14.833 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:14.833 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:14.833 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:14.833 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:14.833 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:17.368 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:06:17.368 22:35:01 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:06:17.368 22:35:01 -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:17.368 22:35:01 -- common/autotest_common.sh@10 -- # set +x 00:06:17.368 22:35:01 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:06:17.368 22:35:01 -- common/autotest_common.sh@1591 -- # mapfile -t bdfs 00:06:17.368 22:35:01 -- common/autotest_common.sh@1591 -- # get_nvme_bdfs_by_id 0x0a54 00:06:17.368 22:35:01 -- common/autotest_common.sh@1577 -- # bdfs=() 00:06:17.368 22:35:01 -- 
common/autotest_common.sh@1577 -- # local bdfs 00:06:17.368 22:35:01 -- common/autotest_common.sh@1579 -- # get_nvme_bdfs 00:06:17.368 22:35:01 -- common/autotest_common.sh@1513 -- # bdfs=() 00:06:17.368 22:35:01 -- common/autotest_common.sh@1513 -- # local bdfs 00:06:17.368 22:35:01 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:17.368 22:35:01 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:17.368 22:35:01 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:06:17.368 22:35:02 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:06:17.368 22:35:02 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:06:17.368 22:35:02 -- common/autotest_common.sh@1579 -- # for bdf in $(get_nvme_bdfs) 00:06:17.368 22:35:02 -- common/autotest_common.sh@1580 -- # cat /sys/bus/pci/devices/0000:5e:00.0/device 00:06:17.368 22:35:02 -- common/autotest_common.sh@1580 -- # device=0x0b60 00:06:17.368 22:35:02 -- common/autotest_common.sh@1581 -- # [[ 0x0b60 == \0\x\0\a\5\4 ]] 00:06:17.368 22:35:02 -- common/autotest_common.sh@1586 -- # printf '%s\n' 00:06:17.368 22:35:02 -- common/autotest_common.sh@1592 -- # [[ -z '' ]] 00:06:17.368 22:35:02 -- common/autotest_common.sh@1593 -- # return 0 00:06:17.368 22:35:02 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:06:17.368 22:35:02 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:06:17.368 22:35:02 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:06:17.368 22:35:02 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:06:17.368 22:35:02 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:06:17.936 Restarting all devices. 
00:06:22.129 lstat() error: No such file or directory 00:06:22.129 QAT Error: No GENERAL section found 00:06:22.129 Failed to configure qat_dev0 00:06:22.129 lstat() error: No such file or directory 00:06:22.129 QAT Error: No GENERAL section found 00:06:22.129 Failed to configure qat_dev1 00:06:22.129 lstat() error: No such file or directory 00:06:22.129 QAT Error: No GENERAL section found 00:06:22.129 Failed to configure qat_dev2 00:06:22.129 enable sriov 00:06:22.129 Checking status of all devices. 00:06:22.129 There is 3 QAT acceleration device(s) in the system: 00:06:22.129 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:3d:00.0, #accel: 5 #engines: 10 state: down 00:06:22.129 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:3f:00.0, #accel: 5 #engines: 10 state: down 00:06:22.129 qat_dev2 - type: c6xx, inst_id: 2, node_id: 1, bsf: 0000:da:00.0, #accel: 5 #engines: 10 state: down 00:06:23.066 0000:3d:00.0 set to 16 VFs 00:06:24.439 0000:3f:00.0 set to 16 VFs 00:06:25.810 0000:da:00.0 set to 16 VFs 00:06:29.096 Properly configured the qat device with driver uio_pci_generic. 
00:06:29.096 22:35:13 -- spdk/autotest.sh@162 -- # timing_enter lib 00:06:29.096 22:35:13 -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:29.096 22:35:13 -- common/autotest_common.sh@10 -- # set +x 00:06:29.096 22:35:13 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:06:29.096 22:35:13 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:06:29.096 22:35:13 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:29.096 22:35:13 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:29.096 22:35:13 -- common/autotest_common.sh@10 -- # set +x 00:06:29.096 ************************************ 00:06:29.096 START TEST env 00:06:29.096 ************************************ 00:06:29.096 22:35:13 env -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:06:29.096 * Looking for test storage... 00:06:29.096 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:06:29.096 22:35:13 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:06:29.096 22:35:13 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:29.096 22:35:13 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:29.096 22:35:13 env -- common/autotest_common.sh@10 -- # set +x 00:06:29.096 ************************************ 00:06:29.096 START TEST env_memory 00:06:29.096 ************************************ 00:06:29.096 22:35:13 env.env_memory -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:06:29.096 00:06:29.096 00:06:29.096 CUnit - A unit testing framework for C - Version 2.1-3 00:06:29.096 http://cunit.sourceforge.net/ 00:06:29.096 00:06:29.096 00:06:29.096 Suite: memory 00:06:29.096 Test: alloc and free memory map ...[2024-07-15 22:35:13.837257] 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:06:29.096 passed 00:06:29.096 Test: mem map translation ...[2024-07-15 22:35:13.866626] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:06:29.096 [2024-07-15 22:35:13.866654] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:06:29.096 [2024-07-15 22:35:13.866709] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:06:29.096 [2024-07-15 22:35:13.866724] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:06:29.096 passed 00:06:29.096 Test: mem map registration ...[2024-07-15 22:35:13.924527] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:06:29.096 [2024-07-15 22:35:13.924550] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:06:29.096 passed 00:06:29.096 Test: mem map adjacent registrations ...passed 00:06:29.096 00:06:29.096 Run Summary: Type Total Ran Passed Failed Inactive 00:06:29.096 suites 1 1 n/a 0 0 00:06:29.096 tests 4 4 4 0 0 00:06:29.096 asserts 152 152 152 0 n/a 00:06:29.096 00:06:29.096 Elapsed time = 0.200 seconds 00:06:29.096 00:06:29.096 real 0m0.215s 00:06:29.096 user 0m0.201s 00:06:29.096 sys 0m0.013s 00:06:29.096 22:35:14 env.env_memory -- common/autotest_common.sh@1124 -- # xtrace_disable 
00:06:29.096 22:35:14 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:06:29.096 ************************************ 00:06:29.096 END TEST env_memory 00:06:29.096 ************************************ 00:06:29.357 22:35:14 env -- common/autotest_common.sh@1142 -- # return 0 00:06:29.357 22:35:14 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:29.357 22:35:14 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:29.357 22:35:14 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:29.358 22:35:14 env -- common/autotest_common.sh@10 -- # set +x 00:06:29.358 ************************************ 00:06:29.358 START TEST env_vtophys 00:06:29.358 ************************************ 00:06:29.358 22:35:14 env.env_vtophys -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:29.358 EAL: lib.eal log level changed from notice to debug 00:06:29.358 EAL: Detected lcore 0 as core 0 on socket 0 00:06:29.358 EAL: Detected lcore 1 as core 1 on socket 0 00:06:29.358 EAL: Detected lcore 2 as core 2 on socket 0 00:06:29.358 EAL: Detected lcore 3 as core 3 on socket 0 00:06:29.358 EAL: Detected lcore 4 as core 4 on socket 0 00:06:29.358 EAL: Detected lcore 5 as core 8 on socket 0 00:06:29.358 EAL: Detected lcore 6 as core 9 on socket 0 00:06:29.358 EAL: Detected lcore 7 as core 10 on socket 0 00:06:29.358 EAL: Detected lcore 8 as core 11 on socket 0 00:06:29.358 EAL: Detected lcore 9 as core 16 on socket 0 00:06:29.358 EAL: Detected lcore 10 as core 17 on socket 0 00:06:29.358 EAL: Detected lcore 11 as core 18 on socket 0 00:06:29.358 EAL: Detected lcore 12 as core 19 on socket 0 00:06:29.358 EAL: Detected lcore 13 as core 20 on socket 0 00:06:29.358 EAL: Detected lcore 14 as core 24 on socket 0 00:06:29.358 EAL: Detected lcore 15 as core 25 on socket 0 00:06:29.358 EAL: Detected lcore 16 as core 26 on socket 0 
00:06:29.358 EAL: Detected lcore 17 as core 27 on socket 0 00:06:29.358 EAL: Detected lcore 18 as core 0 on socket 1 00:06:29.358 EAL: Detected lcore 19 as core 1 on socket 1 00:06:29.358 EAL: Detected lcore 20 as core 2 on socket 1 00:06:29.358 EAL: Detected lcore 21 as core 3 on socket 1 00:06:29.358 EAL: Detected lcore 22 as core 4 on socket 1 00:06:29.358 EAL: Detected lcore 23 as core 8 on socket 1 00:06:29.358 EAL: Detected lcore 24 as core 9 on socket 1 00:06:29.358 EAL: Detected lcore 25 as core 10 on socket 1 00:06:29.358 EAL: Detected lcore 26 as core 11 on socket 1 00:06:29.358 EAL: Detected lcore 27 as core 16 on socket 1 00:06:29.358 EAL: Detected lcore 28 as core 17 on socket 1 00:06:29.358 EAL: Detected lcore 29 as core 18 on socket 1 00:06:29.358 EAL: Detected lcore 30 as core 19 on socket 1 00:06:29.358 EAL: Detected lcore 31 as core 20 on socket 1 00:06:29.358 EAL: Detected lcore 32 as core 24 on socket 1 00:06:29.358 EAL: Detected lcore 33 as core 25 on socket 1 00:06:29.358 EAL: Detected lcore 34 as core 26 on socket 1 00:06:29.358 EAL: Detected lcore 35 as core 27 on socket 1 00:06:29.358 EAL: Detected lcore 36 as core 0 on socket 0 00:06:29.358 EAL: Detected lcore 37 as core 1 on socket 0 00:06:29.358 EAL: Detected lcore 38 as core 2 on socket 0 00:06:29.358 EAL: Detected lcore 39 as core 3 on socket 0 00:06:29.358 EAL: Detected lcore 40 as core 4 on socket 0 00:06:29.358 EAL: Detected lcore 41 as core 8 on socket 0 00:06:29.358 EAL: Detected lcore 42 as core 9 on socket 0 00:06:29.358 EAL: Detected lcore 43 as core 10 on socket 0 00:06:29.358 EAL: Detected lcore 44 as core 11 on socket 0 00:06:29.358 EAL: Detected lcore 45 as core 16 on socket 0 00:06:29.358 EAL: Detected lcore 46 as core 17 on socket 0 00:06:29.358 EAL: Detected lcore 47 as core 18 on socket 0 00:06:29.358 EAL: Detected lcore 48 as core 19 on socket 0 00:06:29.358 EAL: Detected lcore 49 as core 20 on socket 0 00:06:29.358 EAL: Detected lcore 50 as core 24 on socket 0 
00:06:29.358 EAL: Detected lcore 51 as core 25 on socket 0 00:06:29.358 EAL: Detected lcore 52 as core 26 on socket 0 00:06:29.358 EAL: Detected lcore 53 as core 27 on socket 0 00:06:29.358 EAL: Detected lcore 54 as core 0 on socket 1 00:06:29.358 EAL: Detected lcore 55 as core 1 on socket 1 00:06:29.358 EAL: Detected lcore 56 as core 2 on socket 1 00:06:29.358 EAL: Detected lcore 57 as core 3 on socket 1 00:06:29.358 EAL: Detected lcore 58 as core 4 on socket 1 00:06:29.358 EAL: Detected lcore 59 as core 8 on socket 1 00:06:29.358 EAL: Detected lcore 60 as core 9 on socket 1 00:06:29.358 EAL: Detected lcore 61 as core 10 on socket 1 00:06:29.358 EAL: Detected lcore 62 as core 11 on socket 1 00:06:29.358 EAL: Detected lcore 63 as core 16 on socket 1 00:06:29.358 EAL: Detected lcore 64 as core 17 on socket 1 00:06:29.358 EAL: Detected lcore 65 as core 18 on socket 1 00:06:29.358 EAL: Detected lcore 66 as core 19 on socket 1 00:06:29.358 EAL: Detected lcore 67 as core 20 on socket 1 00:06:29.358 EAL: Detected lcore 68 as core 24 on socket 1 00:06:29.358 EAL: Detected lcore 69 as core 25 on socket 1 00:06:29.358 EAL: Detected lcore 70 as core 26 on socket 1 00:06:29.358 EAL: Detected lcore 71 as core 27 on socket 1 00:06:29.358 EAL: Maximum logical cores by configuration: 128 00:06:29.358 EAL: Detected CPU lcores: 72 00:06:29.358 EAL: Detected NUMA nodes: 2 00:06:29.358 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:06:29.358 EAL: Detected shared linkage of DPDK 00:06:29.358 EAL: No shared files mode enabled, IPC will be disabled 00:06:29.358 EAL: No shared files mode enabled, IPC is disabled 00:06:29.358 EAL: PCI driver qat for device 0000:3d:01.0 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3d:01.1 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3d:01.2 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3d:01.3 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3d:01.4 wants IOVA as 
'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3d:01.5 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3d:01.6 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3d:01.7 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3d:02.0 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3d:02.1 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3d:02.2 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3d:02.3 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3d:02.4 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3d:02.5 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3d:02.6 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3d:02.7 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3f:01.0 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3f:01.1 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3f:01.2 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3f:01.3 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3f:01.4 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3f:01.5 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3f:01.6 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3f:01.7 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3f:02.0 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3f:02.1 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3f:02.2 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3f:02.3 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3f:02.4 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3f:02.5 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:3f:02.6 wants IOVA as 'PA' 00:06:29.358 EAL: 
PCI driver qat for device 0000:3f:02.7 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:da:01.0 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:da:01.1 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:da:01.2 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:da:01.3 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:da:01.4 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:da:01.5 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:da:01.6 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:da:01.7 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:da:02.0 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:da:02.1 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:da:02.2 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:da:02.3 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:da:02.4 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:da:02.5 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:da:02.6 wants IOVA as 'PA' 00:06:29.358 EAL: PCI driver qat for device 0000:da:02.7 wants IOVA as 'PA' 00:06:29.358 EAL: Bus pci wants IOVA as 'PA' 00:06:29.358 EAL: Bus auxiliary wants IOVA as 'DC' 00:06:29.358 EAL: Bus vdev wants IOVA as 'DC' 00:06:29.358 EAL: Selected IOVA mode 'PA' 00:06:29.358 EAL: Probing VFIO support... 00:06:29.358 EAL: IOMMU type 1 (Type 1) is supported 00:06:29.358 EAL: IOMMU type 7 (sPAPR) is not supported 00:06:29.358 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:06:29.358 EAL: VFIO support initialized 00:06:29.358 EAL: Ask a virtual area of 0x2e000 bytes 00:06:29.358 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:06:29.358 EAL: Setting up physically contiguous memory... 
00:06:29.358 EAL: Setting maximum number of open files to 524288 00:06:29.358 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:06:29.358 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:06:29.358 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:06:29.358 EAL: Ask a virtual area of 0x61000 bytes 00:06:29.358 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:06:29.358 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:29.358 EAL: Ask a virtual area of 0x400000000 bytes 00:06:29.358 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:06:29.358 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:06:29.358 EAL: Ask a virtual area of 0x61000 bytes 00:06:29.358 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:06:29.358 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:29.358 EAL: Ask a virtual area of 0x400000000 bytes 00:06:29.358 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:06:29.358 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:06:29.358 EAL: Ask a virtual area of 0x61000 bytes 00:06:29.358 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:06:29.358 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:29.358 EAL: Ask a virtual area of 0x400000000 bytes 00:06:29.358 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:06:29.358 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:06:29.359 EAL: Ask a virtual area of 0x61000 bytes 00:06:29.359 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:06:29.359 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:29.359 EAL: Ask a virtual area of 0x400000000 bytes 00:06:29.359 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:06:29.359 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:06:29.359 EAL: Creating 4 segment lists: n_segs:8192 
socket_id:1 hugepage_sz:2097152 00:06:29.359 EAL: Ask a virtual area of 0x61000 bytes 00:06:29.359 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:06:29.359 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:29.359 EAL: Ask a virtual area of 0x400000000 bytes 00:06:29.359 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:06:29.359 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:06:29.359 EAL: Ask a virtual area of 0x61000 bytes 00:06:29.359 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:06:29.359 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:29.359 EAL: Ask a virtual area of 0x400000000 bytes 00:06:29.359 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:06:29.359 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:06:29.359 EAL: Ask a virtual area of 0x61000 bytes 00:06:29.359 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:06:29.359 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:29.359 EAL: Ask a virtual area of 0x400000000 bytes 00:06:29.359 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:06:29.359 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:06:29.359 EAL: Ask a virtual area of 0x61000 bytes 00:06:29.359 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:06:29.359 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:29.359 EAL: Ask a virtual area of 0x400000000 bytes 00:06:29.359 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:06:29.359 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:06:29.359 EAL: Hugepages will be freed exactly as allocated. 
00:06:29.359 EAL: No shared files mode enabled, IPC is disabled 00:06:29.359 EAL: No shared files mode enabled, IPC is disabled 00:06:29.359 EAL: TSC frequency is ~2300000 KHz 00:06:29.359 EAL: Main lcore 0 is ready (tid=7fca9f9e1b00;cpuset=[0]) 00:06:29.359 EAL: Trying to obtain current memory policy. 00:06:29.359 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:29.359 EAL: Restoring previous memory policy: 0 00:06:29.359 EAL: request: mp_malloc_sync 00:06:29.359 EAL: No shared files mode enabled, IPC is disabled 00:06:29.359 EAL: Heap on socket 0 was expanded by 2MB 00:06:29.359 EAL: PCI device 0000:3d:01.0 on NUMA socket 0 00:06:29.359 EAL: probe driver: 8086:37c9 qat 00:06:29.359 EAL: PCI memory mapped at 0x202001000000 00:06:29.359 EAL: PCI memory mapped at 0x202001001000 00:06:29.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:06:29.359 EAL: PCI device 0000:3d:01.1 on NUMA socket 0 00:06:29.359 EAL: probe driver: 8086:37c9 qat 00:06:29.359 EAL: PCI memory mapped at 0x202001002000 00:06:29.359 EAL: PCI memory mapped at 0x202001003000 00:06:29.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:06:29.359 EAL: PCI device 0000:3d:01.2 on NUMA socket 0 00:06:29.359 EAL: probe driver: 8086:37c9 qat 00:06:29.359 EAL: PCI memory mapped at 0x202001004000 00:06:29.359 EAL: PCI memory mapped at 0x202001005000 00:06:29.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:06:29.359 EAL: PCI device 0000:3d:01.3 on NUMA socket 0 00:06:29.359 EAL: probe driver: 8086:37c9 qat 00:06:29.359 EAL: PCI memory mapped at 0x202001006000 00:06:29.359 EAL: PCI memory mapped at 0x202001007000 00:06:29.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:06:29.359 EAL: PCI device 0000:3d:01.4 on NUMA socket 0 00:06:29.359 EAL: probe driver: 8086:37c9 qat 00:06:29.359 EAL: PCI memory mapped at 0x202001008000 00:06:29.359 EAL: PCI memory mapped at 0x202001009000 00:06:29.359 EAL: 
Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:06:29.359 EAL: PCI device 0000:3d:01.5 on NUMA socket 0 00:06:29.359 EAL: probe driver: 8086:37c9 qat 00:06:29.359 EAL: PCI memory mapped at 0x20200100a000 00:06:29.359 EAL: PCI memory mapped at 0x20200100b000 00:06:29.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:06:29.359 EAL: PCI device 0000:3d:01.6 on NUMA socket 0 00:06:29.359 EAL: probe driver: 8086:37c9 qat 00:06:29.359 EAL: PCI memory mapped at 0x20200100c000 00:06:29.359 EAL: PCI memory mapped at 0x20200100d000 00:06:29.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:06:29.359 EAL: PCI device 0000:3d:01.7 on NUMA socket 0 00:06:29.359 EAL: probe driver: 8086:37c9 qat 00:06:29.359 EAL: PCI memory mapped at 0x20200100e000 00:06:29.359 EAL: PCI memory mapped at 0x20200100f000 00:06:29.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:06:29.359 EAL: PCI device 0000:3d:02.0 on NUMA socket 0 00:06:29.359 EAL: probe driver: 8086:37c9 qat 00:06:29.359 EAL: PCI memory mapped at 0x202001010000 00:06:29.359 EAL: PCI memory mapped at 0x202001011000 00:06:29.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:06:29.359 EAL: PCI device 0000:3d:02.1 on NUMA socket 0 00:06:29.359 EAL: probe driver: 8086:37c9 qat 00:06:29.359 EAL: PCI memory mapped at 0x202001012000 00:06:29.359 EAL: PCI memory mapped at 0x202001013000 00:06:29.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:06:29.359 EAL: PCI device 0000:3d:02.2 on NUMA socket 0 00:06:29.359 EAL: probe driver: 8086:37c9 qat 00:06:29.359 EAL: PCI memory mapped at 0x202001014000 00:06:29.359 EAL: PCI memory mapped at 0x202001015000 00:06:29.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:06:29.359 EAL: PCI device 0000:3d:02.3 on NUMA socket 0 00:06:29.359 EAL: probe driver: 8086:37c9 qat 00:06:29.359 EAL: PCI memory mapped at 
0x202001016000 00:06:29.359 EAL: PCI memory mapped at 0x202001017000 00:06:29.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:06:29.359 EAL: PCI device 0000:3d:02.4 on NUMA socket 0 00:06:29.359 EAL: probe driver: 8086:37c9 qat 00:06:29.359 EAL: PCI memory mapped at 0x202001018000 00:06:29.359 EAL: PCI memory mapped at 0x202001019000 00:06:29.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:06:29.359 EAL: PCI device 0000:3d:02.5 on NUMA socket 0 00:06:29.359 EAL: probe driver: 8086:37c9 qat 00:06:29.359 EAL: PCI memory mapped at 0x20200101a000 00:06:29.359 EAL: PCI memory mapped at 0x20200101b000 00:06:29.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:06:29.359 EAL: PCI device 0000:3d:02.6 on NUMA socket 0 00:06:29.359 EAL: probe driver: 8086:37c9 qat 00:06:29.359 EAL: PCI memory mapped at 0x20200101c000 00:06:29.359 EAL: PCI memory mapped at 0x20200101d000 00:06:29.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:06:29.359 EAL: PCI device 0000:3d:02.7 on NUMA socket 0 00:06:29.359 EAL: probe driver: 8086:37c9 qat 00:06:29.359 EAL: PCI memory mapped at 0x20200101e000 00:06:29.359 EAL: PCI memory mapped at 0x20200101f000 00:06:29.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:06:29.359 EAL: PCI device 0000:3f:01.0 on NUMA socket 0 00:06:29.359 EAL: probe driver: 8086:37c9 qat 00:06:29.359 EAL: PCI memory mapped at 0x202001020000 00:06:29.359 EAL: PCI memory mapped at 0x202001021000 00:06:29.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:06:29.359 EAL: PCI device 0000:3f:01.1 on NUMA socket 0 00:06:29.359 EAL: probe driver: 8086:37c9 qat 00:06:29.359 EAL: PCI memory mapped at 0x202001022000 00:06:29.359 EAL: PCI memory mapped at 0x202001023000 00:06:29.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:06:29.359 EAL: PCI device 0000:3f:01.2 on NUMA socket 0 
00:06:29.359 EAL: probe driver: 8086:37c9 qat 00:06:29.359 EAL: PCI memory mapped at 0x202001024000 00:06:29.359 EAL: PCI memory mapped at 0x202001025000 00:06:29.359 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:06:29.359 EAL: PCI device 0000:3f:01.3 on NUMA socket 0 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x202001026000 00:06:29.360 EAL: PCI memory mapped at 0x202001027000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:06:29.360 EAL: PCI device 0000:3f:01.4 on NUMA socket 0 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x202001028000 00:06:29.360 EAL: PCI memory mapped at 0x202001029000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:06:29.360 EAL: PCI device 0000:3f:01.5 on NUMA socket 0 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x20200102a000 00:06:29.360 EAL: PCI memory mapped at 0x20200102b000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:06:29.360 EAL: PCI device 0000:3f:01.6 on NUMA socket 0 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x20200102c000 00:06:29.360 EAL: PCI memory mapped at 0x20200102d000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:06:29.360 EAL: PCI device 0000:3f:01.7 on NUMA socket 0 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x20200102e000 00:06:29.360 EAL: PCI memory mapped at 0x20200102f000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:06:29.360 EAL: PCI device 0000:3f:02.0 on NUMA socket 0 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x202001030000 00:06:29.360 EAL: PCI memory mapped at 0x202001031000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 
0000:3f:02.0 (socket 0) 00:06:29.360 EAL: PCI device 0000:3f:02.1 on NUMA socket 0 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x202001032000 00:06:29.360 EAL: PCI memory mapped at 0x202001033000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:06:29.360 EAL: PCI device 0000:3f:02.2 on NUMA socket 0 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x202001034000 00:06:29.360 EAL: PCI memory mapped at 0x202001035000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:06:29.360 EAL: PCI device 0000:3f:02.3 on NUMA socket 0 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x202001036000 00:06:29.360 EAL: PCI memory mapped at 0x202001037000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:06:29.360 EAL: PCI device 0000:3f:02.4 on NUMA socket 0 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x202001038000 00:06:29.360 EAL: PCI memory mapped at 0x202001039000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:06:29.360 EAL: PCI device 0000:3f:02.5 on NUMA socket 0 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x20200103a000 00:06:29.360 EAL: PCI memory mapped at 0x20200103b000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:06:29.360 EAL: PCI device 0000:3f:02.6 on NUMA socket 0 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x20200103c000 00:06:29.360 EAL: PCI memory mapped at 0x20200103d000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:06:29.360 EAL: PCI device 0000:3f:02.7 on NUMA socket 0 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x20200103e000 00:06:29.360 EAL: PCI memory 
mapped at 0x20200103f000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:06:29.360 EAL: PCI device 0000:da:01.0 on NUMA socket 1 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x202001040000 00:06:29.360 EAL: PCI memory mapped at 0x202001041000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:06:29.360 EAL: Trying to obtain current memory policy. 00:06:29.360 EAL: Setting policy MPOL_PREFERRED for socket 1 00:06:29.360 EAL: Restoring previous memory policy: 4 00:06:29.360 EAL: request: mp_malloc_sync 00:06:29.360 EAL: No shared files mode enabled, IPC is disabled 00:06:29.360 EAL: Heap on socket 1 was expanded by 2MB 00:06:29.360 EAL: PCI device 0000:da:01.1 on NUMA socket 1 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x202001042000 00:06:29.360 EAL: PCI memory mapped at 0x202001043000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:06:29.360 EAL: PCI device 0000:da:01.2 on NUMA socket 1 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x202001044000 00:06:29.360 EAL: PCI memory mapped at 0x202001045000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:06:29.360 EAL: PCI device 0000:da:01.3 on NUMA socket 1 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x202001046000 00:06:29.360 EAL: PCI memory mapped at 0x202001047000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:06:29.360 EAL: PCI device 0000:da:01.4 on NUMA socket 1 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x202001048000 00:06:29.360 EAL: PCI memory mapped at 0x202001049000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:06:29.360 EAL: PCI device 0000:da:01.5 on NUMA socket 1 00:06:29.360 
EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x20200104a000 00:06:29.360 EAL: PCI memory mapped at 0x20200104b000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:06:29.360 EAL: PCI device 0000:da:01.6 on NUMA socket 1 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x20200104c000 00:06:29.360 EAL: PCI memory mapped at 0x20200104d000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:06:29.360 EAL: PCI device 0000:da:01.7 on NUMA socket 1 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x20200104e000 00:06:29.360 EAL: PCI memory mapped at 0x20200104f000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:06:29.360 EAL: PCI device 0000:da:02.0 on NUMA socket 1 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x202001050000 00:06:29.360 EAL: PCI memory mapped at 0x202001051000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:06:29.360 EAL: PCI device 0000:da:02.1 on NUMA socket 1 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x202001052000 00:06:29.360 EAL: PCI memory mapped at 0x202001053000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:06:29.360 EAL: PCI device 0000:da:02.2 on NUMA socket 1 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x202001054000 00:06:29.360 EAL: PCI memory mapped at 0x202001055000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:06:29.360 EAL: PCI device 0000:da:02.3 on NUMA socket 1 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x202001056000 00:06:29.360 EAL: PCI memory mapped at 0x202001057000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 
(socket 1) 00:06:29.360 EAL: PCI device 0000:da:02.4 on NUMA socket 1 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x202001058000 00:06:29.360 EAL: PCI memory mapped at 0x202001059000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:06:29.360 EAL: PCI device 0000:da:02.5 on NUMA socket 1 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x20200105a000 00:06:29.360 EAL: PCI memory mapped at 0x20200105b000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:06:29.360 EAL: PCI device 0000:da:02.6 on NUMA socket 1 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x20200105c000 00:06:29.360 EAL: PCI memory mapped at 0x20200105d000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:06:29.360 EAL: PCI device 0000:da:02.7 on NUMA socket 1 00:06:29.360 EAL: probe driver: 8086:37c9 qat 00:06:29.360 EAL: PCI memory mapped at 0x20200105e000 00:06:29.360 EAL: PCI memory mapped at 0x20200105f000 00:06:29.360 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:06:29.360 EAL: No shared files mode enabled, IPC is disabled 00:06:29.360 EAL: No shared files mode enabled, IPC is disabled 00:06:29.360 EAL: No PCI address specified using 'addr=' in: bus=pci 00:06:29.360 EAL: Mem event callback 'spdk:(nil)' registered 00:06:29.360 00:06:29.360 00:06:29.360 CUnit - A unit testing framework for C - Version 2.1-3 00:06:29.360 http://cunit.sourceforge.net/ 00:06:29.360 00:06:29.360 00:06:29.360 Suite: components_suite 00:06:29.360 Test: vtophys_malloc_test ...passed 00:06:29.360 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 
00:06:29.360 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:29.360 EAL: Restoring previous memory policy: 4 00:06:29.360 EAL: Calling mem event callback 'spdk:(nil)' 00:06:29.360 EAL: request: mp_malloc_sync 00:06:29.360 EAL: No shared files mode enabled, IPC is disabled 00:06:29.360 EAL: Heap on socket 0 was expanded by 4MB 00:06:29.360 EAL: Calling mem event callback 'spdk:(nil)' 00:06:29.360 EAL: request: mp_malloc_sync 00:06:29.360 EAL: No shared files mode enabled, IPC is disabled 00:06:29.360 EAL: Heap on socket 0 was shrunk by 4MB 00:06:29.360 EAL: Trying to obtain current memory policy. 00:06:29.360 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:29.360 EAL: Restoring previous memory policy: 4 00:06:29.360 EAL: Calling mem event callback 'spdk:(nil)' 00:06:29.360 EAL: request: mp_malloc_sync 00:06:29.360 EAL: No shared files mode enabled, IPC is disabled 00:06:29.360 EAL: Heap on socket 0 was expanded by 6MB 00:06:29.360 EAL: Calling mem event callback 'spdk:(nil)' 00:06:29.361 EAL: request: mp_malloc_sync 00:06:29.361 EAL: No shared files mode enabled, IPC is disabled 00:06:29.361 EAL: Heap on socket 0 was shrunk by 6MB 00:06:29.361 EAL: Trying to obtain current memory policy. 00:06:29.361 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:29.361 EAL: Restoring previous memory policy: 4 00:06:29.361 EAL: Calling mem event callback 'spdk:(nil)' 00:06:29.361 EAL: request: mp_malloc_sync 00:06:29.361 EAL: No shared files mode enabled, IPC is disabled 00:06:29.361 EAL: Heap on socket 0 was expanded by 10MB 00:06:29.361 EAL: Calling mem event callback 'spdk:(nil)' 00:06:29.361 EAL: request: mp_malloc_sync 00:06:29.361 EAL: No shared files mode enabled, IPC is disabled 00:06:29.361 EAL: Heap on socket 0 was shrunk by 10MB 00:06:29.361 EAL: Trying to obtain current memory policy. 
00:06:29.361 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:29.361 EAL: Restoring previous memory policy: 4 00:06:29.361 EAL: Calling mem event callback 'spdk:(nil)' 00:06:29.361 EAL: request: mp_malloc_sync 00:06:29.361 EAL: No shared files mode enabled, IPC is disabled 00:06:29.361 EAL: Heap on socket 0 was expanded by 18MB 00:06:29.361 EAL: Calling mem event callback 'spdk:(nil)' 00:06:29.361 EAL: request: mp_malloc_sync 00:06:29.361 EAL: No shared files mode enabled, IPC is disabled 00:06:29.361 EAL: Heap on socket 0 was shrunk by 18MB 00:06:29.361 EAL: Trying to obtain current memory policy. 00:06:29.361 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:29.620 EAL: Restoring previous memory policy: 4 00:06:29.620 EAL: Calling mem event callback 'spdk:(nil)' 00:06:29.620 EAL: request: mp_malloc_sync 00:06:29.620 EAL: No shared files mode enabled, IPC is disabled 00:06:29.620 EAL: Heap on socket 0 was expanded by 34MB 00:06:29.620 EAL: Calling mem event callback 'spdk:(nil)' 00:06:29.620 EAL: request: mp_malloc_sync 00:06:29.620 EAL: No shared files mode enabled, IPC is disabled 00:06:29.620 EAL: Heap on socket 0 was shrunk by 34MB 00:06:29.620 EAL: Trying to obtain current memory policy. 00:06:29.620 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:29.620 EAL: Restoring previous memory policy: 4 00:06:29.620 EAL: Calling mem event callback 'spdk:(nil)' 00:06:29.620 EAL: request: mp_malloc_sync 00:06:29.620 EAL: No shared files mode enabled, IPC is disabled 00:06:29.620 EAL: Heap on socket 0 was expanded by 66MB 00:06:29.620 EAL: Calling mem event callback 'spdk:(nil)' 00:06:29.620 EAL: request: mp_malloc_sync 00:06:29.620 EAL: No shared files mode enabled, IPC is disabled 00:06:29.620 EAL: Heap on socket 0 was shrunk by 66MB 00:06:29.620 EAL: Trying to obtain current memory policy. 
00:06:29.620 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:29.620 EAL: Restoring previous memory policy: 4 00:06:29.620 EAL: Calling mem event callback 'spdk:(nil)' 00:06:29.620 EAL: request: mp_malloc_sync 00:06:29.620 EAL: No shared files mode enabled, IPC is disabled 00:06:29.620 EAL: Heap on socket 0 was expanded by 130MB 00:06:29.620 EAL: Calling mem event callback 'spdk:(nil)' 00:06:29.620 EAL: request: mp_malloc_sync 00:06:29.620 EAL: No shared files mode enabled, IPC is disabled 00:06:29.620 EAL: Heap on socket 0 was shrunk by 130MB 00:06:29.620 EAL: Trying to obtain current memory policy. 00:06:29.620 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:29.620 EAL: Restoring previous memory policy: 4 00:06:29.620 EAL: Calling mem event callback 'spdk:(nil)' 00:06:29.620 EAL: request: mp_malloc_sync 00:06:29.620 EAL: No shared files mode enabled, IPC is disabled 00:06:29.620 EAL: Heap on socket 0 was expanded by 258MB 00:06:29.620 EAL: Calling mem event callback 'spdk:(nil)' 00:06:29.620 EAL: request: mp_malloc_sync 00:06:29.620 EAL: No shared files mode enabled, IPC is disabled 00:06:29.620 EAL: Heap on socket 0 was shrunk by 258MB 00:06:29.620 EAL: Trying to obtain current memory policy. 00:06:29.620 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:29.879 EAL: Restoring previous memory policy: 4 00:06:29.879 EAL: Calling mem event callback 'spdk:(nil)' 00:06:29.879 EAL: request: mp_malloc_sync 00:06:29.879 EAL: No shared files mode enabled, IPC is disabled 00:06:29.879 EAL: Heap on socket 0 was expanded by 514MB 00:06:29.879 EAL: Calling mem event callback 'spdk:(nil)' 00:06:30.138 EAL: request: mp_malloc_sync 00:06:30.138 EAL: No shared files mode enabled, IPC is disabled 00:06:30.138 EAL: Heap on socket 0 was shrunk by 514MB 00:06:30.138 EAL: Trying to obtain current memory policy. 
00:06:30.138 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:30.397 EAL: Restoring previous memory policy: 4
00:06:30.397 EAL: Calling mem event callback 'spdk:(nil)'
00:06:30.397 EAL: request: mp_malloc_sync
00:06:30.397 EAL: No shared files mode enabled, IPC is disabled
00:06:30.397 EAL: Heap on socket 0 was expanded by 1026MB
00:06:30.397 EAL: Calling mem event callback 'spdk:(nil)'
00:06:30.657 EAL: request: mp_malloc_sync
00:06:30.657 EAL: No shared files mode enabled, IPC is disabled
00:06:30.657 EAL: Heap on socket 0 was shrunk by 1026MB
00:06:30.657 passed
00:06:30.657 
00:06:30.657 Run Summary:    Type  Total    Ran Passed Failed Inactive
00:06:30.657               suites      1      1    n/a      0        0
00:06:30.657                tests      2      2      2      0        0
00:06:30.657              asserts   5603   5603   5603      0      n/a
00:06:30.657 
00:06:30.657 Elapsed time = 1.179 seconds
00:06:30.657 EAL: No shared files mode enabled, IPC is disabled
00:06:30.657 EAL: No shared files mode enabled, IPC is disabled
00:06:30.657 EAL: No shared files mode enabled, IPC is disabled
00:06:30.657 
00:06:30.657 real 0m1.378s
00:06:30.657 user 0m0.765s
00:06:30.657 sys 0m0.582s
00:06:30.657 22:35:15 env.env_vtophys -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:30.657 22:35:15 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x
00:06:30.657 ************************************
00:06:30.657 END TEST env_vtophys
00:06:30.657 ************************************
00:06:30.657 22:35:15 env -- common/autotest_common.sh@1142 -- # return 0
00:06:30.657 22:35:15 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:06:30.657 22:35:15 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:06:30.657 22:35:15 env -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:30.657 22:35:15 env -- common/autotest_common.sh@10 -- # set +x
00:06:30.657 ************************************
00:06:30.657 START TEST env_pci
00:06:30.657 ************************************
00:06:30.657 22:35:15 env.env_pci -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:06:30.918 
00:06:30.918 
00:06:30.918 CUnit - A unit testing framework for C - Version 2.1-3
00:06:30.918 http://cunit.sourceforge.net/
00:06:30.918 
00:06:30.918 
00:06:30.918 Suite: pci
00:06:30.918 Test: pci_hook ...[2024-07-15 22:35:15.574368] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 2655622 has claimed it
00:06:30.918 EAL: Cannot find device (10000:00:01.0)
00:06:30.918 EAL: Failed to attach device on primary process
00:06:30.918 passed
00:06:30.918 
00:06:30.918 Run Summary:    Type  Total    Ran Passed Failed Inactive
00:06:30.918               suites      1      1    n/a      0        0
00:06:30.918                tests      1      1      1      0        0
00:06:30.918              asserts     25     25     25      0      n/a
00:06:30.918 
00:06:30.918 Elapsed time = 0.044 seconds
00:06:30.918 
00:06:30.918 real 0m0.073s
00:06:30.918 user 0m0.025s
00:06:30.918 sys 0m0.048s
00:06:30.918 22:35:15 env.env_pci -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:30.918 22:35:15 env.env_pci -- common/autotest_common.sh@10 -- # set +x
00:06:30.918 ************************************
00:06:30.918 END TEST env_pci
00:06:30.918 ************************************
00:06:30.918 22:35:15 env -- common/autotest_common.sh@1142 -- # return 0
00:06:30.918 22:35:15 env -- env/env.sh@14 -- # argv='-c 0x1 '
00:06:30.918 22:35:15 env -- env/env.sh@15 -- # uname
00:06:30.918 22:35:15 env -- env/env.sh@15 -- # '[' Linux = Linux ']'
00:06:30.918 22:35:15 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000
00:06:30.918 22:35:15 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:06:30.918 22:35:15 env -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:06:30.918 22:35:15 env --
common/autotest_common.sh@1105 -- # xtrace_disable 00:06:30.918 22:35:15 env -- common/autotest_common.sh@10 -- # set +x 00:06:30.918 ************************************ 00:06:30.918 START TEST env_dpdk_post_init 00:06:30.918 ************************************ 00:06:30.918 22:35:15 env.env_dpdk_post_init -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:30.918 EAL: Detected CPU lcores: 72 00:06:30.918 EAL: Detected NUMA nodes: 2 00:06:30.918 EAL: Detected shared linkage of DPDK 00:06:30.918 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:30.918 EAL: Selected IOVA mode 'PA' 00:06:30.918 EAL: VFIO support initialized 00:06:30.918 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:06:30.918 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:06:30.918 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.918 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:06:30.918 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:06:30.918 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.918 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:06:30.918 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:06:30.918 CRYPTODEV: 
Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.918 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:06:30.918 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:06:30.918 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.918 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:06:30.918 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:06:30.918 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.918 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:06:30.918 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:06:30.918 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.918 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:06:30.918 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:06:30.918 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.918 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 
00:06:30.918 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:06:30.918 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.918 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:06:30.918 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:06:30.918 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.918 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:06:30.918 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:06:30.918 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.918 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:06:30.918 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:06:30.918 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.918 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:06:30.918 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:06:30.918 CRYPTODEV: Initialisation parameters - name: 
0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.918 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:06:30.918 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:06:30.918 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.918 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:06:30.918 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.919 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.919 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.919 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:06:30.919 CRYPTODEV: Initialisation 
parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.919 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.919 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.919 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.919 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 
0 00:06:30.919 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.919 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.919 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.919 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.919 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, 
max queue pairs: 0 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.919 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.919 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.919 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.919 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.919 EAL: Probe PCI driver: qat (8086:37c9) 
device: 0000:3f:02.6 (socket 0) 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.919 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:30.919 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.919 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.919 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:06:30.919 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.919 CRYPTODEV: Creating 
cryptodev 0000:da:01.2_qat_sym 00:06:30.919 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.920 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.920 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.920 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.920 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.920 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:06:30.920 
CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.920 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.920 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.920 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.920 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:06:30.920 
CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.920 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.920 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.920 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.920 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:30.920 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:06:30.920 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:30.920 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:31.180 EAL: Using IOMMU type 1 (Type 1) 00:06:31.180 EAL: Ignore 
mapping IO port bar(1) 00:06:31.180 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0) 00:06:31.180 EAL: Ignore mapping IO port bar(1) 00:06:31.180 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0) 00:06:31.180 EAL: Ignore mapping IO port bar(1) 00:06:31.180 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0) 00:06:31.180 EAL: Ignore mapping IO port bar(1) 00:06:31.180 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0) 00:06:31.180 EAL: Ignore mapping IO port bar(1) 00:06:31.180 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0) 00:06:31.180 EAL: Ignore mapping IO port bar(1) 00:06:31.180 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0) 00:06:31.180 EAL: Ignore mapping IO port bar(1) 00:06:31.180 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0) 00:06:31.180 EAL: Ignore mapping IO port bar(1) 00:06:31.180 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0) 00:06:31.439 EAL: Probe PCI driver: spdk_nvme (8086:0b60) device: 0000:5e:00.0 (socket 0) 00:06:31.439 EAL: Ignore mapping IO port bar(1) 00:06:31.439 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1) 00:06:31.439 EAL: Ignore mapping IO port bar(1) 00:06:31.439 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1) 00:06:31.439 EAL: Ignore mapping IO port bar(1) 00:06:31.439 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1) 00:06:31.439 EAL: Ignore mapping IO port bar(1) 00:06:31.439 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1) 00:06:31.439 EAL: Ignore mapping IO port bar(1) 00:06:31.439 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1) 00:06:31.439 EAL: Ignore mapping IO port bar(1) 00:06:31.439 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1) 
00:06:31.698 EAL: Ignore mapping IO port bar(1) 00:06:31.698 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.6 (socket 1) 00:06:31.698 EAL: Ignore mapping IO port bar(1) 00:06:31.698 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1) 00:06:31.698 EAL: Ignore mapping IO port bar(1) 00:06:31.698 EAL: Ignore mapping IO port bar(5) 00:06:31.698 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:85:05.5 (socket 1) 00:06:31.698 EAL: Ignore mapping IO port bar(1) 00:06:31.698 EAL: Ignore mapping IO port bar(5) 00:06:31.698 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:d7:05.5 (socket 1) 00:06:34.232 EAL: Releasing PCI mapped resource for 0000:5e:00.0 00:06:34.232 EAL: Calling pci_unmap_resource for 0000:5e:00.0 at 0x202001080000 00:06:34.492 Starting DPDK initialization... 00:06:34.492 Starting SPDK post initialization... 00:06:34.492 SPDK NVMe probe 00:06:34.492 Attaching to 0000:5e:00.0 00:06:34.492 Attached to 0000:5e:00.0 00:06:34.492 Cleaning up... 
00:06:34.492 00:06:34.492 real 0m3.512s 00:06:34.492 user 0m2.467s 00:06:34.492 sys 0m0.601s 00:06:34.492 22:35:19 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:34.492 22:35:19 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:06:34.492 ************************************ 00:06:34.492 END TEST env_dpdk_post_init 00:06:34.492 ************************************ 00:06:34.492 22:35:19 env -- common/autotest_common.sh@1142 -- # return 0 00:06:34.492 22:35:19 env -- env/env.sh@26 -- # uname 00:06:34.492 22:35:19 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:06:34.492 22:35:19 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:34.492 22:35:19 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:34.492 22:35:19 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:34.492 22:35:19 env -- common/autotest_common.sh@10 -- # set +x 00:06:34.492 ************************************ 00:06:34.492 START TEST env_mem_callbacks 00:06:34.492 ************************************ 00:06:34.492 22:35:19 env.env_mem_callbacks -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:34.492 EAL: Detected CPU lcores: 72 00:06:34.492 EAL: Detected NUMA nodes: 2 00:06:34.492 EAL: Detected shared linkage of DPDK 00:06:34.492 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:34.492 EAL: Selected IOVA mode 'PA' 00:06:34.492 EAL: VFIO support initialized 00:06:34.492 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:06:34.492 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:06:34.492 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.492 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:06:34.492 CRYPTODEV: Initialisation parameters - name: 
0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.492 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:06:34.492 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:06:34.492 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.492 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:06:34.492 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.492 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:06:34.492 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:06:34.492 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.492 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:06:34.492 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.492 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:06:34.492 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:06:34.492 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.492 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:06:34.492 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.492 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:06:34.492 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:06:34.492 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.492 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:06:34.492 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.492 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:06:34.492 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:06:34.492 CRYPTODEV: Initialisation 
parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.492 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:06:34.492 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.492 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:06:34.492 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:06:34.492 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.492 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:06:34.492 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.492 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:06:34.492 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:06:34.492 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.492 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:06:34.493 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.493 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:06:34.493 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:06:34.493 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.493 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:06:34.493 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.493 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:06:34.493 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:06:34.493 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.493 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:06:34.493 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 
0 00:06:34.493 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:06:34.493 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:06:34.493 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.493 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:06:34.493 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.493 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:06:34.493 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:06:34.493 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.493 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:06:34.493 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.493 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:06:34.493 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:06:34.493 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.493 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:06:34.493 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.493 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:06:34.493 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:06:34.493 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.493 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:06:34.493 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.493 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:06:34.493 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:06:34.493 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, 
max queue pairs: 0 00:06:34.493 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:06:34.493 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.754 EAL: Probe PCI driver: qat (8086:37c9) 
device: 0000:3f:01.3 (socket 0) 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.754 CRYPTODEV: Creating 
cryptodev 0000:3f:01.7_qat_sym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:06:34.754 
CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:34.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:06:34.754 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.754 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:06:34.754 
CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 
00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 
0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:06:34.755 CRYPTODEV: Initialisation 
parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:34.755 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:06:34.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:34.755 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:34.755 00:06:34.755 00:06:34.755 CUnit - A unit testing framework for C - Version 2.1-3 00:06:34.755 http://cunit.sourceforge.net/ 00:06:34.755 00:06:34.755 00:06:34.755 Suite: memory 00:06:34.755 Test: test ... 00:06:34.755 register 0x200000200000 2097152 00:06:34.755 register 0x201000a00000 2097152 00:06:34.755 malloc 3145728 00:06:34.755 register 0x200000400000 4194304 00:06:34.755 buf 0x200000500000 len 3145728 PASSED 00:06:34.755 malloc 64 00:06:34.755 buf 0x2000004fff40 len 64 PASSED 00:06:34.755 malloc 4194304 00:06:34.755 register 0x200000800000 6291456 00:06:34.755 buf 0x200000a00000 len 4194304 PASSED 00:06:34.755 free 0x200000500000 3145728 00:06:34.755 free 0x2000004fff40 64 00:06:34.755 unregister 0x200000400000 4194304 PASSED 00:06:34.755 free 0x200000a00000 4194304 00:06:34.755 unregister 0x200000800000 6291456 PASSED 00:06:34.755 malloc 8388608 00:06:34.755 register 0x200000400000 10485760 00:06:34.755 buf 0x200000600000 len 8388608 PASSED 00:06:34.755 free 0x200000600000 8388608 00:06:34.755 unregister 0x200000400000 10485760 PASSED 00:06:34.755 passed 00:06:34.755 00:06:34.755 Run Summary: Type Total Ran Passed Failed Inactive 00:06:34.755 suites 1 1 n/a 0 0 00:06:34.755 tests 1 1 1 0 0 
00:06:34.755 asserts 16 16 16 0 n/a 00:06:34.755 00:06:34.755 Elapsed time = 0.006 seconds 00:06:34.755 00:06:34.755 real 0m0.100s 00:06:34.755 user 0m0.031s 00:06:34.755 sys 0m0.068s 00:06:34.755 22:35:19 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:34.755 22:35:19 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:06:34.755 ************************************ 00:06:34.755 END TEST env_mem_callbacks 00:06:34.755 ************************************ 00:06:34.755 22:35:19 env -- common/autotest_common.sh@1142 -- # return 0 00:06:34.755 00:06:34.755 real 0m5.820s 00:06:34.755 user 0m3.672s 00:06:34.755 sys 0m1.708s 00:06:34.755 22:35:19 env -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:34.755 22:35:19 env -- common/autotest_common.sh@10 -- # set +x 00:06:34.755 ************************************ 00:06:34.755 END TEST env 00:06:34.755 ************************************ 00:06:34.755 22:35:19 -- common/autotest_common.sh@1142 -- # return 0 00:06:34.755 22:35:19 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:06:34.755 22:35:19 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:34.755 22:35:19 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:34.755 22:35:19 -- common/autotest_common.sh@10 -- # set +x 00:06:34.755 ************************************ 00:06:34.755 START TEST rpc 00:06:34.755 ************************************ 00:06:34.755 22:35:19 rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:06:34.755 * Looking for test storage... 
00:06:34.755 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:34.755 22:35:19 rpc -- rpc/rpc.sh@65 -- # spdk_pid=2656311 00:06:34.755 22:35:19 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:34.755 22:35:19 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:06:34.755 22:35:19 rpc -- rpc/rpc.sh@67 -- # waitforlisten 2656311 00:06:34.755 22:35:19 rpc -- common/autotest_common.sh@829 -- # '[' -z 2656311 ']' 00:06:34.755 22:35:19 rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:34.755 22:35:19 rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:34.755 22:35:19 rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:34.755 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:34.755 22:35:19 rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:34.755 22:35:19 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:35.014 [2024-07-15 22:35:19.729731] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:06:35.014 [2024-07-15 22:35:19.729810] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2656311 ] 00:06:35.014 [2024-07-15 22:35:19.862393] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:35.273 [2024-07-15 22:35:19.974111] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:06:35.273 [2024-07-15 22:35:19.974164] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 2656311' to capture a snapshot of events at runtime. 
00:06:35.273 [2024-07-15 22:35:19.974178] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:06:35.273 [2024-07-15 22:35:19.974191] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:06:35.273 [2024-07-15 22:35:19.974202] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid2656311 for offline analysis/debug. 00:06:35.273 [2024-07-15 22:35:19.974235] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.884 22:35:20 rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:35.884 22:35:20 rpc -- common/autotest_common.sh@862 -- # return 0 00:06:35.884 22:35:20 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:35.884 22:35:20 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:35.884 22:35:20 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:06:35.884 22:35:20 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:06:35.884 22:35:20 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:35.884 22:35:20 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:35.884 22:35:20 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:35.884 ************************************ 00:06:35.884 START TEST rpc_integrity 00:06:35.884 ************************************ 00:06:35.884 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 
00:06:35.884 22:35:20 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:35.884 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:35.884 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:35.884 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:35.884 22:35:20 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:35.884 22:35:20 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:36.158 22:35:20 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:36.158 22:35:20 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:36.158 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:36.158 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.158 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:36.158 22:35:20 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:06:36.158 22:35:20 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:36.158 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:36.158 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.158 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:36.158 22:35:20 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:36.158 { 00:06:36.158 "name": "Malloc0", 00:06:36.158 "aliases": [ 00:06:36.158 "9dba679b-0823-4459-94a2-b9a84e7779f9" 00:06:36.158 ], 00:06:36.158 "product_name": "Malloc disk", 00:06:36.158 "block_size": 512, 00:06:36.158 "num_blocks": 16384, 00:06:36.158 "uuid": "9dba679b-0823-4459-94a2-b9a84e7779f9", 00:06:36.158 "assigned_rate_limits": { 00:06:36.158 "rw_ios_per_sec": 0, 00:06:36.158 "rw_mbytes_per_sec": 0, 00:06:36.158 "r_mbytes_per_sec": 0, 00:06:36.158 "w_mbytes_per_sec": 0 00:06:36.158 }, 00:06:36.158 "claimed": false, 00:06:36.158 
"zoned": false, 00:06:36.158 "supported_io_types": { 00:06:36.158 "read": true, 00:06:36.158 "write": true, 00:06:36.158 "unmap": true, 00:06:36.158 "flush": true, 00:06:36.158 "reset": true, 00:06:36.158 "nvme_admin": false, 00:06:36.158 "nvme_io": false, 00:06:36.158 "nvme_io_md": false, 00:06:36.158 "write_zeroes": true, 00:06:36.158 "zcopy": true, 00:06:36.158 "get_zone_info": false, 00:06:36.158 "zone_management": false, 00:06:36.158 "zone_append": false, 00:06:36.158 "compare": false, 00:06:36.158 "compare_and_write": false, 00:06:36.158 "abort": true, 00:06:36.158 "seek_hole": false, 00:06:36.158 "seek_data": false, 00:06:36.158 "copy": true, 00:06:36.158 "nvme_iov_md": false 00:06:36.158 }, 00:06:36.158 "memory_domains": [ 00:06:36.158 { 00:06:36.158 "dma_device_id": "system", 00:06:36.158 "dma_device_type": 1 00:06:36.158 }, 00:06:36.158 { 00:06:36.158 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:36.158 "dma_device_type": 2 00:06:36.158 } 00:06:36.158 ], 00:06:36.158 "driver_specific": {} 00:06:36.158 } 00:06:36.158 ]' 00:06:36.158 22:35:20 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:36.158 22:35:20 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:36.158 22:35:20 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:06:36.158 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:36.158 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.158 [2024-07-15 22:35:20.887592] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:06:36.158 [2024-07-15 22:35:20.887633] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:36.158 [2024-07-15 22:35:20.887653] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24a4eb0 00:06:36.158 [2024-07-15 22:35:20.887666] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:36.158 [2024-07-15 
22:35:20.889200] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:36.158 [2024-07-15 22:35:20.889229] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:36.158 Passthru0 00:06:36.158 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:36.158 22:35:20 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:36.158 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:36.158 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.158 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:36.158 22:35:20 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:36.158 { 00:06:36.158 "name": "Malloc0", 00:06:36.158 "aliases": [ 00:06:36.158 "9dba679b-0823-4459-94a2-b9a84e7779f9" 00:06:36.158 ], 00:06:36.158 "product_name": "Malloc disk", 00:06:36.158 "block_size": 512, 00:06:36.158 "num_blocks": 16384, 00:06:36.158 "uuid": "9dba679b-0823-4459-94a2-b9a84e7779f9", 00:06:36.158 "assigned_rate_limits": { 00:06:36.158 "rw_ios_per_sec": 0, 00:06:36.158 "rw_mbytes_per_sec": 0, 00:06:36.158 "r_mbytes_per_sec": 0, 00:06:36.158 "w_mbytes_per_sec": 0 00:06:36.158 }, 00:06:36.158 "claimed": true, 00:06:36.158 "claim_type": "exclusive_write", 00:06:36.158 "zoned": false, 00:06:36.158 "supported_io_types": { 00:06:36.158 "read": true, 00:06:36.158 "write": true, 00:06:36.158 "unmap": true, 00:06:36.158 "flush": true, 00:06:36.158 "reset": true, 00:06:36.158 "nvme_admin": false, 00:06:36.158 "nvme_io": false, 00:06:36.158 "nvme_io_md": false, 00:06:36.158 "write_zeroes": true, 00:06:36.158 "zcopy": true, 00:06:36.158 "get_zone_info": false, 00:06:36.158 "zone_management": false, 00:06:36.158 "zone_append": false, 00:06:36.158 "compare": false, 00:06:36.158 "compare_and_write": false, 00:06:36.158 "abort": true, 00:06:36.158 "seek_hole": false, 00:06:36.158 "seek_data": false, 
00:06:36.158 "copy": true, 00:06:36.158 "nvme_iov_md": false 00:06:36.158 }, 00:06:36.158 "memory_domains": [ 00:06:36.158 { 00:06:36.158 "dma_device_id": "system", 00:06:36.158 "dma_device_type": 1 00:06:36.158 }, 00:06:36.158 { 00:06:36.158 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:36.158 "dma_device_type": 2 00:06:36.158 } 00:06:36.158 ], 00:06:36.158 "driver_specific": {} 00:06:36.158 }, 00:06:36.158 { 00:06:36.158 "name": "Passthru0", 00:06:36.158 "aliases": [ 00:06:36.158 "ae62d4db-52ae-5bc8-b84d-49547d76baab" 00:06:36.158 ], 00:06:36.158 "product_name": "passthru", 00:06:36.158 "block_size": 512, 00:06:36.159 "num_blocks": 16384, 00:06:36.159 "uuid": "ae62d4db-52ae-5bc8-b84d-49547d76baab", 00:06:36.159 "assigned_rate_limits": { 00:06:36.159 "rw_ios_per_sec": 0, 00:06:36.159 "rw_mbytes_per_sec": 0, 00:06:36.159 "r_mbytes_per_sec": 0, 00:06:36.159 "w_mbytes_per_sec": 0 00:06:36.159 }, 00:06:36.159 "claimed": false, 00:06:36.159 "zoned": false, 00:06:36.159 "supported_io_types": { 00:06:36.159 "read": true, 00:06:36.159 "write": true, 00:06:36.159 "unmap": true, 00:06:36.159 "flush": true, 00:06:36.159 "reset": true, 00:06:36.159 "nvme_admin": false, 00:06:36.159 "nvme_io": false, 00:06:36.159 "nvme_io_md": false, 00:06:36.159 "write_zeroes": true, 00:06:36.159 "zcopy": true, 00:06:36.159 "get_zone_info": false, 00:06:36.159 "zone_management": false, 00:06:36.159 "zone_append": false, 00:06:36.159 "compare": false, 00:06:36.159 "compare_and_write": false, 00:06:36.159 "abort": true, 00:06:36.159 "seek_hole": false, 00:06:36.159 "seek_data": false, 00:06:36.159 "copy": true, 00:06:36.159 "nvme_iov_md": false 00:06:36.159 }, 00:06:36.159 "memory_domains": [ 00:06:36.159 { 00:06:36.159 "dma_device_id": "system", 00:06:36.159 "dma_device_type": 1 00:06:36.159 }, 00:06:36.159 { 00:06:36.159 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:36.159 "dma_device_type": 2 00:06:36.159 } 00:06:36.159 ], 00:06:36.159 "driver_specific": { 00:06:36.159 "passthru": { 
00:06:36.159 "name": "Passthru0", 00:06:36.159 "base_bdev_name": "Malloc0" 00:06:36.159 } 00:06:36.159 } 00:06:36.159 } 00:06:36.159 ]' 00:06:36.159 22:35:20 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:36.159 22:35:20 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:36.159 22:35:20 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:36.159 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:36.159 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.159 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:36.159 22:35:20 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:06:36.159 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:36.159 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.159 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:36.159 22:35:20 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:36.159 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:36.159 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.159 22:35:20 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:36.159 22:35:20 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:36.159 22:35:20 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:36.159 22:35:21 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:36.159 00:06:36.159 real 0m0.291s 00:06:36.159 user 0m0.180s 00:06:36.159 sys 0m0.054s 00:06:36.159 22:35:21 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:36.159 22:35:21 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.159 ************************************ 00:06:36.159 END TEST rpc_integrity 00:06:36.159 
************************************ 00:06:36.418 22:35:21 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:36.418 22:35:21 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:06:36.418 22:35:21 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:36.418 22:35:21 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:36.418 22:35:21 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:36.418 ************************************ 00:06:36.418 START TEST rpc_plugins 00:06:36.418 ************************************ 00:06:36.418 22:35:21 rpc.rpc_plugins -- common/autotest_common.sh@1123 -- # rpc_plugins 00:06:36.418 22:35:21 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:06:36.418 22:35:21 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:36.418 22:35:21 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:36.418 22:35:21 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:36.418 22:35:21 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:06:36.418 22:35:21 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:06:36.418 22:35:21 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:36.418 22:35:21 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:36.418 22:35:21 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:36.418 22:35:21 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:06:36.418 { 00:06:36.418 "name": "Malloc1", 00:06:36.418 "aliases": [ 00:06:36.418 "09c2c36e-a4c4-4619-b586-cd875329e79b" 00:06:36.418 ], 00:06:36.418 "product_name": "Malloc disk", 00:06:36.418 "block_size": 4096, 00:06:36.418 "num_blocks": 256, 00:06:36.418 "uuid": "09c2c36e-a4c4-4619-b586-cd875329e79b", 00:06:36.418 "assigned_rate_limits": { 00:06:36.418 "rw_ios_per_sec": 0, 00:06:36.419 "rw_mbytes_per_sec": 0, 00:06:36.419 "r_mbytes_per_sec": 0, 00:06:36.419 "w_mbytes_per_sec": 0 
00:06:36.419 }, 00:06:36.419 "claimed": false, 00:06:36.419 "zoned": false, 00:06:36.419 "supported_io_types": { 00:06:36.419 "read": true, 00:06:36.419 "write": true, 00:06:36.419 "unmap": true, 00:06:36.419 "flush": true, 00:06:36.419 "reset": true, 00:06:36.419 "nvme_admin": false, 00:06:36.419 "nvme_io": false, 00:06:36.419 "nvme_io_md": false, 00:06:36.419 "write_zeroes": true, 00:06:36.419 "zcopy": true, 00:06:36.419 "get_zone_info": false, 00:06:36.419 "zone_management": false, 00:06:36.419 "zone_append": false, 00:06:36.419 "compare": false, 00:06:36.419 "compare_and_write": false, 00:06:36.419 "abort": true, 00:06:36.419 "seek_hole": false, 00:06:36.419 "seek_data": false, 00:06:36.419 "copy": true, 00:06:36.419 "nvme_iov_md": false 00:06:36.419 }, 00:06:36.419 "memory_domains": [ 00:06:36.419 { 00:06:36.419 "dma_device_id": "system", 00:06:36.419 "dma_device_type": 1 00:06:36.419 }, 00:06:36.419 { 00:06:36.419 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:36.419 "dma_device_type": 2 00:06:36.419 } 00:06:36.419 ], 00:06:36.419 "driver_specific": {} 00:06:36.419 } 00:06:36.419 ]' 00:06:36.419 22:35:21 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:06:36.419 22:35:21 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:06:36.419 22:35:21 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:06:36.419 22:35:21 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:36.419 22:35:21 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:36.419 22:35:21 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:36.419 22:35:21 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:06:36.419 22:35:21 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:36.419 22:35:21 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:36.419 22:35:21 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:36.419 22:35:21 
rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:06:36.419 22:35:21 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:06:36.419 22:35:21 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:06:36.419 00:06:36.419 real 0m0.154s 00:06:36.419 user 0m0.091s 00:06:36.419 sys 0m0.027s 00:06:36.419 22:35:21 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:36.419 22:35:21 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:36.419 ************************************ 00:06:36.419 END TEST rpc_plugins 00:06:36.419 ************************************ 00:06:36.419 22:35:21 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:36.419 22:35:21 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:06:36.419 22:35:21 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:36.419 22:35:21 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:36.419 22:35:21 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:36.678 ************************************ 00:06:36.678 START TEST rpc_trace_cmd_test 00:06:36.678 ************************************ 00:06:36.678 22:35:21 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1123 -- # rpc_trace_cmd_test 00:06:36.678 22:35:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:06:36.678 22:35:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:06:36.678 22:35:21 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:36.678 22:35:21 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:36.678 22:35:21 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:36.678 22:35:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:06:36.678 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid2656311", 00:06:36.678 "tpoint_group_mask": "0x8", 00:06:36.678 "iscsi_conn": { 00:06:36.678 "mask": "0x2", 00:06:36.678 "tpoint_mask": "0x0" 00:06:36.678 }, 00:06:36.678 
"scsi": { 00:06:36.678 "mask": "0x4", 00:06:36.678 "tpoint_mask": "0x0" 00:06:36.678 }, 00:06:36.678 "bdev": { 00:06:36.678 "mask": "0x8", 00:06:36.678 "tpoint_mask": "0xffffffffffffffff" 00:06:36.678 }, 00:06:36.678 "nvmf_rdma": { 00:06:36.678 "mask": "0x10", 00:06:36.678 "tpoint_mask": "0x0" 00:06:36.678 }, 00:06:36.678 "nvmf_tcp": { 00:06:36.678 "mask": "0x20", 00:06:36.678 "tpoint_mask": "0x0" 00:06:36.678 }, 00:06:36.678 "ftl": { 00:06:36.678 "mask": "0x40", 00:06:36.678 "tpoint_mask": "0x0" 00:06:36.678 }, 00:06:36.678 "blobfs": { 00:06:36.678 "mask": "0x80", 00:06:36.678 "tpoint_mask": "0x0" 00:06:36.678 }, 00:06:36.678 "dsa": { 00:06:36.678 "mask": "0x200", 00:06:36.678 "tpoint_mask": "0x0" 00:06:36.678 }, 00:06:36.678 "thread": { 00:06:36.678 "mask": "0x400", 00:06:36.678 "tpoint_mask": "0x0" 00:06:36.678 }, 00:06:36.678 "nvme_pcie": { 00:06:36.678 "mask": "0x800", 00:06:36.678 "tpoint_mask": "0x0" 00:06:36.678 }, 00:06:36.678 "iaa": { 00:06:36.678 "mask": "0x1000", 00:06:36.678 "tpoint_mask": "0x0" 00:06:36.678 }, 00:06:36.678 "nvme_tcp": { 00:06:36.678 "mask": "0x2000", 00:06:36.678 "tpoint_mask": "0x0" 00:06:36.678 }, 00:06:36.678 "bdev_nvme": { 00:06:36.678 "mask": "0x4000", 00:06:36.678 "tpoint_mask": "0x0" 00:06:36.678 }, 00:06:36.678 "sock": { 00:06:36.678 "mask": "0x8000", 00:06:36.678 "tpoint_mask": "0x0" 00:06:36.678 } 00:06:36.678 }' 00:06:36.678 22:35:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:06:36.678 22:35:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:06:36.678 22:35:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:06:36.678 22:35:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:06:36.678 22:35:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:06:36.678 22:35:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:06:36.678 22:35:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:06:36.678 
22:35:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:06:36.678 22:35:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:06:36.937 22:35:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:06:36.937 00:06:36.937 real 0m0.242s 00:06:36.937 user 0m0.196s 00:06:36.937 sys 0m0.036s 00:06:36.937 22:35:21 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:36.937 22:35:21 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:36.937 ************************************ 00:06:36.937 END TEST rpc_trace_cmd_test 00:06:36.937 ************************************ 00:06:36.937 22:35:21 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:36.937 22:35:21 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:06:36.937 22:35:21 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:06:36.937 22:35:21 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:06:36.937 22:35:21 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:36.937 22:35:21 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:36.937 22:35:21 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:36.937 ************************************ 00:06:36.937 START TEST rpc_daemon_integrity 00:06:36.937 ************************************ 00:06:36.937 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:06:36.937 22:35:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:36.937 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:36.937 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.937 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:36.937 22:35:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:36.937 22:35:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 
00:06:36.937 22:35:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:36.937 22:35:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:36.937 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:36.937 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.937 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:36.937 22:35:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:06:36.938 22:35:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:36.938 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:36.938 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.938 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:36.938 22:35:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:36.938 { 00:06:36.938 "name": "Malloc2", 00:06:36.938 "aliases": [ 00:06:36.938 "d76e7a2d-03fa-4f9e-af03-c78b8624c0b8" 00:06:36.938 ], 00:06:36.938 "product_name": "Malloc disk", 00:06:36.938 "block_size": 512, 00:06:36.938 "num_blocks": 16384, 00:06:36.938 "uuid": "d76e7a2d-03fa-4f9e-af03-c78b8624c0b8", 00:06:36.938 "assigned_rate_limits": { 00:06:36.938 "rw_ios_per_sec": 0, 00:06:36.938 "rw_mbytes_per_sec": 0, 00:06:36.938 "r_mbytes_per_sec": 0, 00:06:36.938 "w_mbytes_per_sec": 0 00:06:36.938 }, 00:06:36.938 "claimed": false, 00:06:36.938 "zoned": false, 00:06:36.938 "supported_io_types": { 00:06:36.938 "read": true, 00:06:36.938 "write": true, 00:06:36.938 "unmap": true, 00:06:36.938 "flush": true, 00:06:36.938 "reset": true, 00:06:36.938 "nvme_admin": false, 00:06:36.938 "nvme_io": false, 00:06:36.938 "nvme_io_md": false, 00:06:36.938 "write_zeroes": true, 00:06:36.938 "zcopy": true, 00:06:36.938 "get_zone_info": false, 00:06:36.938 "zone_management": 
false, 00:06:36.938 "zone_append": false, 00:06:36.938 "compare": false, 00:06:36.938 "compare_and_write": false, 00:06:36.938 "abort": true, 00:06:36.938 "seek_hole": false, 00:06:36.938 "seek_data": false, 00:06:36.938 "copy": true, 00:06:36.938 "nvme_iov_md": false 00:06:36.938 }, 00:06:36.938 "memory_domains": [ 00:06:36.938 { 00:06:36.938 "dma_device_id": "system", 00:06:36.938 "dma_device_type": 1 00:06:36.938 }, 00:06:36.938 { 00:06:36.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:36.938 "dma_device_type": 2 00:06:36.938 } 00:06:36.938 ], 00:06:36.938 "driver_specific": {} 00:06:36.938 } 00:06:36.938 ]' 00:06:36.938 22:35:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:36.938 22:35:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:36.938 22:35:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:06:36.938 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:36.938 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:36.938 [2024-07-15 22:35:21.826280] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:06:36.938 [2024-07-15 22:35:21.826317] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:36.938 [2024-07-15 22:35:21.826340] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24a5b20 00:06:36.938 [2024-07-15 22:35:21.826353] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:36.938 [2024-07-15 22:35:21.827739] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:36.938 [2024-07-15 22:35:21.827766] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:36.938 Passthru0 00:06:36.938 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:36.938 22:35:21 rpc.rpc_daemon_integrity -- 
rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:36.938 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:36.938 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:37.197 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:37.197 22:35:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:37.197 { 00:06:37.197 "name": "Malloc2", 00:06:37.197 "aliases": [ 00:06:37.197 "d76e7a2d-03fa-4f9e-af03-c78b8624c0b8" 00:06:37.197 ], 00:06:37.197 "product_name": "Malloc disk", 00:06:37.197 "block_size": 512, 00:06:37.197 "num_blocks": 16384, 00:06:37.197 "uuid": "d76e7a2d-03fa-4f9e-af03-c78b8624c0b8", 00:06:37.197 "assigned_rate_limits": { 00:06:37.197 "rw_ios_per_sec": 0, 00:06:37.197 "rw_mbytes_per_sec": 0, 00:06:37.197 "r_mbytes_per_sec": 0, 00:06:37.197 "w_mbytes_per_sec": 0 00:06:37.197 }, 00:06:37.197 "claimed": true, 00:06:37.197 "claim_type": "exclusive_write", 00:06:37.197 "zoned": false, 00:06:37.197 "supported_io_types": { 00:06:37.197 "read": true, 00:06:37.197 "write": true, 00:06:37.197 "unmap": true, 00:06:37.197 "flush": true, 00:06:37.197 "reset": true, 00:06:37.197 "nvme_admin": false, 00:06:37.197 "nvme_io": false, 00:06:37.197 "nvme_io_md": false, 00:06:37.197 "write_zeroes": true, 00:06:37.197 "zcopy": true, 00:06:37.197 "get_zone_info": false, 00:06:37.197 "zone_management": false, 00:06:37.197 "zone_append": false, 00:06:37.197 "compare": false, 00:06:37.197 "compare_and_write": false, 00:06:37.197 "abort": true, 00:06:37.197 "seek_hole": false, 00:06:37.197 "seek_data": false, 00:06:37.197 "copy": true, 00:06:37.197 "nvme_iov_md": false 00:06:37.197 }, 00:06:37.197 "memory_domains": [ 00:06:37.197 { 00:06:37.197 "dma_device_id": "system", 00:06:37.197 "dma_device_type": 1 00:06:37.197 }, 00:06:37.197 { 00:06:37.197 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:37.197 "dma_device_type": 2 00:06:37.197 } 00:06:37.197 ], 
00:06:37.197 "driver_specific": {} 00:06:37.197 }, 00:06:37.197 { 00:06:37.197 "name": "Passthru0", 00:06:37.197 "aliases": [ 00:06:37.197 "3c8ac451-ed3b-5419-94f2-7fdd4d7650cd" 00:06:37.197 ], 00:06:37.197 "product_name": "passthru", 00:06:37.197 "block_size": 512, 00:06:37.197 "num_blocks": 16384, 00:06:37.197 "uuid": "3c8ac451-ed3b-5419-94f2-7fdd4d7650cd", 00:06:37.197 "assigned_rate_limits": { 00:06:37.197 "rw_ios_per_sec": 0, 00:06:37.197 "rw_mbytes_per_sec": 0, 00:06:37.197 "r_mbytes_per_sec": 0, 00:06:37.197 "w_mbytes_per_sec": 0 00:06:37.197 }, 00:06:37.197 "claimed": false, 00:06:37.197 "zoned": false, 00:06:37.197 "supported_io_types": { 00:06:37.197 "read": true, 00:06:37.197 "write": true, 00:06:37.197 "unmap": true, 00:06:37.197 "flush": true, 00:06:37.197 "reset": true, 00:06:37.197 "nvme_admin": false, 00:06:37.197 "nvme_io": false, 00:06:37.197 "nvme_io_md": false, 00:06:37.197 "write_zeroes": true, 00:06:37.197 "zcopy": true, 00:06:37.197 "get_zone_info": false, 00:06:37.197 "zone_management": false, 00:06:37.197 "zone_append": false, 00:06:37.197 "compare": false, 00:06:37.197 "compare_and_write": false, 00:06:37.197 "abort": true, 00:06:37.197 "seek_hole": false, 00:06:37.197 "seek_data": false, 00:06:37.197 "copy": true, 00:06:37.197 "nvme_iov_md": false 00:06:37.197 }, 00:06:37.197 "memory_domains": [ 00:06:37.197 { 00:06:37.197 "dma_device_id": "system", 00:06:37.197 "dma_device_type": 1 00:06:37.197 }, 00:06:37.197 { 00:06:37.197 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:37.197 "dma_device_type": 2 00:06:37.197 } 00:06:37.197 ], 00:06:37.197 "driver_specific": { 00:06:37.197 "passthru": { 00:06:37.197 "name": "Passthru0", 00:06:37.197 "base_bdev_name": "Malloc2" 00:06:37.197 } 00:06:37.197 } 00:06:37.197 } 00:06:37.197 ]' 00:06:37.197 22:35:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:37.198 22:35:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:37.198 22:35:21 rpc.rpc_daemon_integrity -- 
rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:37.198 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:37.198 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:37.198 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:37.198 22:35:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:06:37.198 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:37.198 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:37.198 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:37.198 22:35:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:37.198 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:37.198 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:37.198 22:35:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:37.198 22:35:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:37.198 22:35:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:37.198 22:35:22 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:37.198 00:06:37.198 real 0m0.331s 00:06:37.198 user 0m0.215s 00:06:37.198 sys 0m0.057s 00:06:37.198 22:35:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:37.198 22:35:22 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:37.198 ************************************ 00:06:37.198 END TEST rpc_daemon_integrity 00:06:37.198 ************************************ 00:06:37.198 22:35:22 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:37.198 22:35:22 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:06:37.198 22:35:22 rpc -- rpc/rpc.sh@84 -- # 
killprocess 2656311 00:06:37.198 22:35:22 rpc -- common/autotest_common.sh@948 -- # '[' -z 2656311 ']' 00:06:37.198 22:35:22 rpc -- common/autotest_common.sh@952 -- # kill -0 2656311 00:06:37.198 22:35:22 rpc -- common/autotest_common.sh@953 -- # uname 00:06:37.198 22:35:22 rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:37.198 22:35:22 rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2656311 00:06:37.198 22:35:22 rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:37.198 22:35:22 rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:37.198 22:35:22 rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2656311' 00:06:37.198 killing process with pid 2656311 00:06:37.198 22:35:22 rpc -- common/autotest_common.sh@967 -- # kill 2656311 00:06:37.198 22:35:22 rpc -- common/autotest_common.sh@972 -- # wait 2656311 00:06:37.767 00:06:37.767 real 0m2.945s 00:06:37.767 user 0m3.742s 00:06:37.767 sys 0m0.966s 00:06:37.767 22:35:22 rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:37.767 22:35:22 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:37.767 ************************************ 00:06:37.767 END TEST rpc 00:06:37.767 ************************************ 00:06:37.767 22:35:22 -- common/autotest_common.sh@1142 -- # return 0 00:06:37.767 22:35:22 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:37.767 22:35:22 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:37.767 22:35:22 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:37.767 22:35:22 -- common/autotest_common.sh@10 -- # set +x 00:06:37.767 ************************************ 00:06:37.767 START TEST skip_rpc 00:06:37.767 ************************************ 00:06:37.767 22:35:22 skip_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:37.767 * 
Looking for test storage... 00:06:38.025 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:38.025 22:35:22 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:38.025 22:35:22 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:38.025 22:35:22 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:06:38.025 22:35:22 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:38.025 22:35:22 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:38.025 22:35:22 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:38.025 ************************************ 00:06:38.025 START TEST skip_rpc 00:06:38.025 ************************************ 00:06:38.025 22:35:22 skip_rpc.skip_rpc -- common/autotest_common.sh@1123 -- # test_skip_rpc 00:06:38.025 22:35:22 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=2656846 00:06:38.025 22:35:22 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:38.025 22:35:22 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:06:38.025 22:35:22 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:06:38.025 [2024-07-15 22:35:22.797893] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
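The target above was launched with `--no-rpc-server`, so the RPC probe the test makes next must fail; the harness asserts this with its `NOT` wrapper. A minimal sketch of that negative-assertion pattern, with a stub `rpc_cmd` standing in for a call against the missing RPC socket (both function bodies here are simplified assumptions, not the real autotest_common.sh or client code):

```shell
# NOT succeeds only when the wrapped command fails (simplified sketch of
# the autotest_common.sh helper of the same name).
NOT() {
    if "$@"; then
        return 1
    fi
    return 0
}

# Stub: with --no-rpc-server there is no socket to talk to, so any
# rpc_cmd invocation fails (hypothetical stand-in, not the real client).
rpc_cmd() {
    echo "rpc error: connection refused" >&2
    return 1
}

NOT rpc_cmd spdk_get_version 2>/dev/null && echo "rpc correctly unavailable"
```

The same helper doubles as a guard: if the RPC unexpectedly succeeds, `NOT` returns nonzero and the test aborts.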
00:06:38.025 [2024-07-15 22:35:22.797971] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2656846 ] 00:06:38.025 [2024-07-15 22:35:22.929535] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:38.284 [2024-07-15 22:35:23.039476] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- common/autotest_common.sh@648 -- # local es=0 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd spdk_get_version 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # rpc_cmd spdk_get_version 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # es=1 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- 
rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 2656846 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- common/autotest_common.sh@948 -- # '[' -z 2656846 ']' 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # kill -0 2656846 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # uname 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2656846 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2656846' 00:06:43.553 killing process with pid 2656846 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # kill 2656846 00:06:43.553 22:35:27 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # wait 2656846 00:06:43.553 00:06:43.553 real 0m5.425s 00:06:43.553 user 0m5.066s 00:06:43.553 sys 0m0.382s 00:06:43.553 22:35:28 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:43.553 22:35:28 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:43.553 ************************************ 00:06:43.553 END TEST skip_rpc 00:06:43.553 ************************************ 00:06:43.553 22:35:28 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:43.554 22:35:28 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:06:43.554 22:35:28 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:43.554 22:35:28 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:43.554 22:35:28 skip_rpc -- common/autotest_common.sh@10 
-- # set +x 00:06:43.554 ************************************ 00:06:43.554 START TEST skip_rpc_with_json 00:06:43.554 ************************************ 00:06:43.554 22:35:28 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_json 00:06:43.554 22:35:28 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:06:43.554 22:35:28 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=2657582 00:06:43.554 22:35:28 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:43.554 22:35:28 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 2657582 00:06:43.554 22:35:28 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@829 -- # '[' -z 2657582 ']' 00:06:43.554 22:35:28 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:43.554 22:35:28 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:43.554 22:35:28 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:43.554 22:35:28 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:43.554 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:43.554 22:35:28 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:43.554 22:35:28 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:43.554 [2024-07-15 22:35:28.307120] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
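`waitforlisten` above blocks until the freshly started target opens /var/tmp/spdk.sock. A sketch of that poll-until-ready loop, using a temporary file created by a background job as a stand-in for the Unix socket (the retry count and 0.1s interval are assumptions, not the harness defaults):

```shell
# Poll until a path exists, up to a bounded number of retries.
waitfor_path() {
    path=$1
    retries=${2:-50}
    while [ "$retries" -gt 0 ]; do
        [ -e "$path" ] && return 0
        retries=$((retries - 1))
        sleep 0.1
    done
    return 1
}

sock=$(mktemp -u)            # path that does not exist yet
( sleep 0.3; : > "$sock" ) & # "target" creates it shortly after launch
listener_up=0
waitfor_path "$sock" && listener_up=1
wait
rm -f "$sock"
echo "listener_up=$listener_up"
```

Bounding the retries matters: if the target crashes during startup, the loop returns nonzero instead of hanging the whole pipeline.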
00:06:43.554 [2024-07-15 22:35:28.307188] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2657582 ] 00:06:43.554 [2024-07-15 22:35:28.438125] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.812 [2024-07-15 22:35:28.544932] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.378 22:35:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:44.378 22:35:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@862 -- # return 0 00:06:44.378 22:35:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:06:44.378 22:35:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:44.378 22:35:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:44.378 [2024-07-15 22:35:29.241476] nvmf_rpc.c:2569:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:06:44.378 request: 00:06:44.378 { 00:06:44.378 "trtype": "tcp", 00:06:44.378 "method": "nvmf_get_transports", 00:06:44.378 "req_id": 1 00:06:44.378 } 00:06:44.378 Got JSON-RPC error response 00:06:44.378 response: 00:06:44.378 { 00:06:44.378 "code": -19, 00:06:44.378 "message": "No such device" 00:06:44.378 } 00:06:44.378 22:35:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:06:44.378 22:35:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:06:44.378 22:35:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:44.378 22:35:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:44.378 [2024-07-15 22:35:29.253618] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:44.378 22:35:29 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:44.378 22:35:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:06:44.378 22:35:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:44.378 22:35:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:44.637 22:35:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:44.637 22:35:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:44.637 { 00:06:44.637 "subsystems": [ 00:06:44.637 { 00:06:44.637 "subsystem": "keyring", 00:06:44.637 "config": [] 00:06:44.637 }, 00:06:44.637 { 00:06:44.637 "subsystem": "iobuf", 00:06:44.637 "config": [ 00:06:44.637 { 00:06:44.637 "method": "iobuf_set_options", 00:06:44.637 "params": { 00:06:44.637 "small_pool_count": 8192, 00:06:44.637 "large_pool_count": 1024, 00:06:44.637 "small_bufsize": 8192, 00:06:44.637 "large_bufsize": 135168 00:06:44.637 } 00:06:44.637 } 00:06:44.637 ] 00:06:44.637 }, 00:06:44.637 { 00:06:44.637 "subsystem": "sock", 00:06:44.637 "config": [ 00:06:44.637 { 00:06:44.637 "method": "sock_set_default_impl", 00:06:44.637 "params": { 00:06:44.637 "impl_name": "posix" 00:06:44.637 } 00:06:44.637 }, 00:06:44.637 { 00:06:44.637 "method": "sock_impl_set_options", 00:06:44.637 "params": { 00:06:44.637 "impl_name": "ssl", 00:06:44.637 "recv_buf_size": 4096, 00:06:44.637 "send_buf_size": 4096, 00:06:44.637 "enable_recv_pipe": true, 00:06:44.637 "enable_quickack": false, 00:06:44.637 "enable_placement_id": 0, 00:06:44.637 "enable_zerocopy_send_server": true, 00:06:44.637 "enable_zerocopy_send_client": false, 00:06:44.637 "zerocopy_threshold": 0, 00:06:44.637 "tls_version": 0, 00:06:44.637 "enable_ktls": false 00:06:44.637 } 00:06:44.637 }, 00:06:44.637 { 00:06:44.637 "method": "sock_impl_set_options", 00:06:44.637 "params": { 
00:06:44.637 "impl_name": "posix", 00:06:44.637 "recv_buf_size": 2097152, 00:06:44.637 "send_buf_size": 2097152, 00:06:44.637 "enable_recv_pipe": true, 00:06:44.637 "enable_quickack": false, 00:06:44.637 "enable_placement_id": 0, 00:06:44.637 "enable_zerocopy_send_server": true, 00:06:44.637 "enable_zerocopy_send_client": false, 00:06:44.637 "zerocopy_threshold": 0, 00:06:44.637 "tls_version": 0, 00:06:44.637 "enable_ktls": false 00:06:44.637 } 00:06:44.637 } 00:06:44.637 ] 00:06:44.637 }, 00:06:44.637 { 00:06:44.637 "subsystem": "vmd", 00:06:44.637 "config": [] 00:06:44.637 }, 00:06:44.637 { 00:06:44.637 "subsystem": "accel", 00:06:44.637 "config": [ 00:06:44.637 { 00:06:44.637 "method": "accel_set_options", 00:06:44.637 "params": { 00:06:44.637 "small_cache_size": 128, 00:06:44.637 "large_cache_size": 16, 00:06:44.637 "task_count": 2048, 00:06:44.637 "sequence_count": 2048, 00:06:44.637 "buf_count": 2048 00:06:44.637 } 00:06:44.637 } 00:06:44.637 ] 00:06:44.637 }, 00:06:44.637 { 00:06:44.637 "subsystem": "bdev", 00:06:44.637 "config": [ 00:06:44.637 { 00:06:44.637 "method": "bdev_set_options", 00:06:44.637 "params": { 00:06:44.637 "bdev_io_pool_size": 65535, 00:06:44.637 "bdev_io_cache_size": 256, 00:06:44.637 "bdev_auto_examine": true, 00:06:44.637 "iobuf_small_cache_size": 128, 00:06:44.637 "iobuf_large_cache_size": 16 00:06:44.637 } 00:06:44.637 }, 00:06:44.637 { 00:06:44.637 "method": "bdev_raid_set_options", 00:06:44.637 "params": { 00:06:44.637 "process_window_size_kb": 1024 00:06:44.637 } 00:06:44.637 }, 00:06:44.637 { 00:06:44.637 "method": "bdev_iscsi_set_options", 00:06:44.637 "params": { 00:06:44.637 "timeout_sec": 30 00:06:44.637 } 00:06:44.637 }, 00:06:44.637 { 00:06:44.637 "method": "bdev_nvme_set_options", 00:06:44.637 "params": { 00:06:44.637 "action_on_timeout": "none", 00:06:44.637 "timeout_us": 0, 00:06:44.637 "timeout_admin_us": 0, 00:06:44.637 "keep_alive_timeout_ms": 10000, 00:06:44.637 "arbitration_burst": 0, 00:06:44.637 
"low_priority_weight": 0, 00:06:44.637 "medium_priority_weight": 0, 00:06:44.637 "high_priority_weight": 0, 00:06:44.637 "nvme_adminq_poll_period_us": 10000, 00:06:44.637 "nvme_ioq_poll_period_us": 0, 00:06:44.637 "io_queue_requests": 0, 00:06:44.637 "delay_cmd_submit": true, 00:06:44.637 "transport_retry_count": 4, 00:06:44.637 "bdev_retry_count": 3, 00:06:44.637 "transport_ack_timeout": 0, 00:06:44.637 "ctrlr_loss_timeout_sec": 0, 00:06:44.637 "reconnect_delay_sec": 0, 00:06:44.637 "fast_io_fail_timeout_sec": 0, 00:06:44.637 "disable_auto_failback": false, 00:06:44.637 "generate_uuids": false, 00:06:44.637 "transport_tos": 0, 00:06:44.637 "nvme_error_stat": false, 00:06:44.637 "rdma_srq_size": 0, 00:06:44.637 "io_path_stat": false, 00:06:44.637 "allow_accel_sequence": false, 00:06:44.637 "rdma_max_cq_size": 0, 00:06:44.637 "rdma_cm_event_timeout_ms": 0, 00:06:44.637 "dhchap_digests": [ 00:06:44.637 "sha256", 00:06:44.637 "sha384", 00:06:44.637 "sha512" 00:06:44.637 ], 00:06:44.637 "dhchap_dhgroups": [ 00:06:44.637 "null", 00:06:44.637 "ffdhe2048", 00:06:44.637 "ffdhe3072", 00:06:44.637 "ffdhe4096", 00:06:44.637 "ffdhe6144", 00:06:44.637 "ffdhe8192" 00:06:44.637 ] 00:06:44.637 } 00:06:44.637 }, 00:06:44.637 { 00:06:44.637 "method": "bdev_nvme_set_hotplug", 00:06:44.637 "params": { 00:06:44.637 "period_us": 100000, 00:06:44.637 "enable": false 00:06:44.637 } 00:06:44.637 }, 00:06:44.637 { 00:06:44.637 "method": "bdev_wait_for_examine" 00:06:44.637 } 00:06:44.637 ] 00:06:44.637 }, 00:06:44.637 { 00:06:44.637 "subsystem": "scsi", 00:06:44.637 "config": null 00:06:44.637 }, 00:06:44.637 { 00:06:44.637 "subsystem": "scheduler", 00:06:44.637 "config": [ 00:06:44.637 { 00:06:44.637 "method": "framework_set_scheduler", 00:06:44.637 "params": { 00:06:44.637 "name": "static" 00:06:44.637 } 00:06:44.637 } 00:06:44.637 ] 00:06:44.637 }, 00:06:44.637 { 00:06:44.637 "subsystem": "vhost_scsi", 00:06:44.637 "config": [] 00:06:44.637 }, 00:06:44.637 { 00:06:44.637 "subsystem": 
"vhost_blk", 00:06:44.637 "config": [] 00:06:44.637 }, 00:06:44.637 { 00:06:44.637 "subsystem": "ublk", 00:06:44.637 "config": [] 00:06:44.637 }, 00:06:44.637 { 00:06:44.637 "subsystem": "nbd", 00:06:44.637 "config": [] 00:06:44.637 }, 00:06:44.637 { 00:06:44.637 "subsystem": "nvmf", 00:06:44.637 "config": [ 00:06:44.637 { 00:06:44.637 "method": "nvmf_set_config", 00:06:44.637 "params": { 00:06:44.637 "discovery_filter": "match_any", 00:06:44.637 "admin_cmd_passthru": { 00:06:44.637 "identify_ctrlr": false 00:06:44.637 } 00:06:44.637 } 00:06:44.637 }, 00:06:44.637 { 00:06:44.637 "method": "nvmf_set_max_subsystems", 00:06:44.637 "params": { 00:06:44.637 "max_subsystems": 1024 00:06:44.637 } 00:06:44.637 }, 00:06:44.637 { 00:06:44.637 "method": "nvmf_set_crdt", 00:06:44.637 "params": { 00:06:44.637 "crdt1": 0, 00:06:44.637 "crdt2": 0, 00:06:44.637 "crdt3": 0 00:06:44.637 } 00:06:44.637 }, 00:06:44.637 { 00:06:44.637 "method": "nvmf_create_transport", 00:06:44.637 "params": { 00:06:44.637 "trtype": "TCP", 00:06:44.637 "max_queue_depth": 128, 00:06:44.637 "max_io_qpairs_per_ctrlr": 127, 00:06:44.637 "in_capsule_data_size": 4096, 00:06:44.637 "max_io_size": 131072, 00:06:44.637 "io_unit_size": 131072, 00:06:44.637 "max_aq_depth": 128, 00:06:44.637 "num_shared_buffers": 511, 00:06:44.637 "buf_cache_size": 4294967295, 00:06:44.637 "dif_insert_or_strip": false, 00:06:44.637 "zcopy": false, 00:06:44.637 "c2h_success": true, 00:06:44.637 "sock_priority": 0, 00:06:44.637 "abort_timeout_sec": 1, 00:06:44.637 "ack_timeout": 0, 00:06:44.637 "data_wr_pool_size": 0 00:06:44.637 } 00:06:44.637 } 00:06:44.638 ] 00:06:44.638 }, 00:06:44.638 { 00:06:44.638 "subsystem": "iscsi", 00:06:44.638 "config": [ 00:06:44.638 { 00:06:44.638 "method": "iscsi_set_options", 00:06:44.638 "params": { 00:06:44.638 "node_base": "iqn.2016-06.io.spdk", 00:06:44.638 "max_sessions": 128, 00:06:44.638 "max_connections_per_session": 2, 00:06:44.638 "max_queue_depth": 64, 00:06:44.638 "default_time2wait": 2, 
00:06:44.638 "default_time2retain": 20, 00:06:44.638 "first_burst_length": 8192, 00:06:44.638 "immediate_data": true, 00:06:44.638 "allow_duplicated_isid": false, 00:06:44.638 "error_recovery_level": 0, 00:06:44.638 "nop_timeout": 60, 00:06:44.638 "nop_in_interval": 30, 00:06:44.638 "disable_chap": false, 00:06:44.638 "require_chap": false, 00:06:44.638 "mutual_chap": false, 00:06:44.638 "chap_group": 0, 00:06:44.638 "max_large_datain_per_connection": 64, 00:06:44.638 "max_r2t_per_connection": 4, 00:06:44.638 "pdu_pool_size": 36864, 00:06:44.638 "immediate_data_pool_size": 16384, 00:06:44.638 "data_out_pool_size": 2048 00:06:44.638 } 00:06:44.638 } 00:06:44.638 ] 00:06:44.638 } 00:06:44.638 ] 00:06:44.638 } 00:06:44.638 22:35:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:06:44.638 22:35:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 2657582 00:06:44.638 22:35:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 2657582 ']' 00:06:44.638 22:35:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 2657582 00:06:44.638 22:35:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:06:44.638 22:35:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:44.638 22:35:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2657582 00:06:44.638 22:35:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:44.638 22:35:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:44.638 22:35:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2657582' 00:06:44.638 killing process with pid 2657582 00:06:44.638 22:35:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 2657582 00:06:44.638 22:35:29 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 2657582 00:06:45.204 22:35:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=2657769 00:06:45.204 22:35:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:45.204 22:35:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:06:50.469 22:35:34 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 2657769 00:06:50.469 22:35:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 2657769 ']' 00:06:50.469 22:35:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 2657769 00:06:50.469 22:35:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:06:50.469 22:35:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:50.469 22:35:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2657769 00:06:50.470 22:35:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:50.470 22:35:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:50.470 22:35:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2657769' 00:06:50.470 killing process with pid 2657769 00:06:50.470 22:35:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 2657769 00:06:50.470 22:35:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 2657769 00:06:50.470 22:35:35 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:50.470 22:35:35 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:50.470 00:06:50.470 real 0m7.099s 00:06:50.470 user 0m6.736s 00:06:50.470 sys 0m0.948s 00:06:50.470 22:35:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:50.470 22:35:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:50.470 ************************************ 00:06:50.470 END TEST skip_rpc_with_json 00:06:50.470 ************************************ 00:06:50.470 22:35:35 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:50.470 22:35:35 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:06:50.470 22:35:35 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:50.470 22:35:35 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:50.470 22:35:35 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:50.729 ************************************ 00:06:50.729 START TEST skip_rpc_with_delay 00:06:50.729 ************************************ 00:06:50.729 22:35:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_delay 00:06:50.729 22:35:35 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:50.729 22:35:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@648 -- # local es=0 00:06:50.729 22:35:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:50.729 22:35:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:50.729 22:35:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:50.729 22:35:35 
skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:50.729 22:35:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:50.729 22:35:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:50.730 22:35:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:50.730 22:35:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:50.730 22:35:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:50.730 22:35:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:50.730 [2024-07-15 22:35:35.489345] app.c: 832:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
00:06:50.730 [2024-07-15 22:35:35.489448] app.c: 711:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:06:50.730 22:35:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # es=1 00:06:50.730 22:35:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:50.730 22:35:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:50.730 22:35:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:50.730 00:06:50.730 real 0m0.095s 00:06:50.730 user 0m0.056s 00:06:50.730 sys 0m0.038s 00:06:50.730 22:35:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:50.730 22:35:35 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:06:50.730 ************************************ 00:06:50.730 END TEST skip_rpc_with_delay 00:06:50.730 ************************************ 00:06:50.730 22:35:35 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:50.730 22:35:35 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:06:50.730 22:35:35 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:06:50.730 22:35:35 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:06:50.730 22:35:35 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:50.730 22:35:35 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:50.730 22:35:35 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:50.730 ************************************ 00:06:50.730 START TEST exit_on_failed_rpc_init 00:06:50.730 ************************************ 00:06:50.730 22:35:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1123 -- # test_exit_on_failed_rpc_init 00:06:50.730 22:35:35 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=2658557 00:06:50.730 22:35:35 skip_rpc.exit_on_failed_rpc_init -- 
rpc/skip_rpc.sh@63 -- # waitforlisten 2658557 00:06:50.730 22:35:35 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:50.730 22:35:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@829 -- # '[' -z 2658557 ']' 00:06:50.730 22:35:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:50.730 22:35:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:50.730 22:35:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:50.730 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:50.730 22:35:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:50.730 22:35:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:50.989 [2024-07-15 22:35:35.661190] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
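test_skip_rpc_with_delay, which finished above, asserted that spdk_tgt refuses `--wait-for-rpc` when `--no-rpc-server` is also given ("Cannot use '--wait-for-rpc' if no RPC server is going to be started."). A sketch of that mutually-exclusive-flag check; `validate_flags` is a hypothetical helper, not SPDK's actual option parser:

```shell
# Reject the combination of --no-rpc-server and --wait-for-rpc,
# mirroring the error seen in the log (hypothetical parser).
validate_flags() {
    no_rpc=0 wait_rpc=0
    for arg in "$@"; do
        case "$arg" in
            --no-rpc-server) no_rpc=1 ;;
            --wait-for-rpc)  wait_rpc=1 ;;
        esac
    done
    if [ "$no_rpc" -eq 1 ] && [ "$wait_rpc" -eq 1 ]; then
        echo "Cannot use '--wait-for-rpc' if no RPC server is going to be started." >&2
        return 1
    fi
}

validate_flags --no-rpc-server -m 0x1 --wait-for-rpc 2>/dev/null || echo "rejected as expected"
```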
00:06:50.989 [2024-07-15 22:35:35.661262] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2658557 ] 00:06:50.989 [2024-07-15 22:35:35.791399] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:50.989 [2024-07-15 22:35:35.898261] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.367 22:35:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:52.367 22:35:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@862 -- # return 0 00:06:52.367 22:35:36 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:52.367 22:35:36 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:52.367 22:35:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@648 -- # local es=0 00:06:52.367 22:35:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:52.367 22:35:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:52.367 22:35:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:52.367 22:35:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:52.367 22:35:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:52.367 22:35:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:52.367 22:35:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:52.367 22:35:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:52.367 22:35:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:52.367 22:35:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:52.367 [2024-07-15 22:35:36.930021] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:06:52.367 [2024-07-15 22:35:36.930091] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2658708 ] 00:06:52.367 [2024-07-15 22:35:37.063875] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:52.367 [2024-07-15 22:35:37.176961] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:06:52.367 [2024-07-15 22:35:37.177061] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:06:52.367 [2024-07-15 22:35:37.177082] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:06:52.367 [2024-07-15 22:35:37.177098] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:52.627 22:35:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # es=234 00:06:52.627 22:35:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:52.627 22:35:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # es=106 00:06:52.627 22:35:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # case "$es" in 00:06:52.627 22:35:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@668 -- # es=1 00:06:52.627 22:35:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:52.627 22:35:37 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:06:52.627 22:35:37 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 2658557 00:06:52.627 22:35:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@948 -- # '[' -z 2658557 ']' 00:06:52.627 22:35:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # kill -0 2658557 00:06:52.627 22:35:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # uname 00:06:52.627 22:35:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:52.627 22:35:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2658557 00:06:52.627 22:35:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:52.627 22:35:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:52.627 22:35:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2658557' 
00:06:52.627 killing process with pid 2658557 00:06:52.627 22:35:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # kill 2658557 00:06:52.627 22:35:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # wait 2658557 00:06:52.886 00:06:52.886 real 0m2.135s 00:06:52.886 user 0m2.693s 00:06:52.886 sys 0m0.644s 00:06:52.886 22:35:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:52.886 22:35:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:52.886 ************************************ 00:06:52.886 END TEST exit_on_failed_rpc_init 00:06:52.886 ************************************ 00:06:52.886 22:35:37 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:52.886 22:35:37 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:52.886 00:06:52.886 real 0m15.198s 00:06:52.886 user 0m14.694s 00:06:52.886 sys 0m2.349s 00:06:52.886 22:35:37 skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:52.886 22:35:37 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:52.886 ************************************ 00:06:52.886 END TEST skip_rpc 00:06:52.886 ************************************ 00:06:53.146 22:35:37 -- common/autotest_common.sh@1142 -- # return 0 00:06:53.146 22:35:37 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:53.146 22:35:37 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:53.146 22:35:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:53.146 22:35:37 -- common/autotest_common.sh@10 -- # set +x 00:06:53.146 ************************************ 00:06:53.146 START TEST rpc_client 00:06:53.146 ************************************ 00:06:53.146 22:35:37 rpc_client -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:53.146 * Looking for test storage... 00:06:53.146 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:06:53.146 22:35:37 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:53.146 OK 00:06:53.146 22:35:38 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:53.146 00:06:53.146 real 0m0.147s 00:06:53.146 user 0m0.071s 00:06:53.146 sys 0m0.086s 00:06:53.146 22:35:38 rpc_client -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:53.146 22:35:38 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:06:53.146 ************************************ 00:06:53.146 END TEST rpc_client 00:06:53.146 ************************************ 00:06:53.146 22:35:38 -- common/autotest_common.sh@1142 -- # return 0 00:06:53.146 22:35:38 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:53.146 22:35:38 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:53.146 22:35:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:53.146 22:35:38 -- common/autotest_common.sh@10 -- # set +x 00:06:53.406 ************************************ 00:06:53.406 START TEST json_config 00:06:53.406 ************************************ 00:06:53.406 22:35:38 json_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:06:53.406 22:35:38 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@7 -- # uname -s 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:53.406 22:35:38 
json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:06:53.406 22:35:38 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:53.406 22:35:38 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:53.406 22:35:38 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:53.406 22:35:38 json_config -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:53.406 22:35:38 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:53.406 22:35:38 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:53.406 22:35:38 json_config -- paths/export.sh@5 -- # export PATH 00:06:53.406 22:35:38 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@47 -- # : 0 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:53.406 
22:35:38 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:53.406 22:35:38 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:53.406 22:35:38 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:06:53.406 22:35:38 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:06:53.406 22:35:38 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:06:53.406 22:35:38 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:06:53.406 22:35:38 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:53.406 22:35:38 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:06:53.406 22:35:38 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:06:53.406 22:35:38 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:06:53.406 22:35:38 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:06:53.406 22:35:38 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:06:53.406 22:35:38 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:06:53.406 22:35:38 json_config -- json_config/json_config.sh@34 -- # 
configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:06:53.406 22:35:38 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:06:53.406 22:35:38 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:06:53.406 22:35:38 json_config -- json_config/json_config.sh@355 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:53.406 22:35:38 json_config -- json_config/json_config.sh@356 -- # echo 'INFO: JSON configuration test init' 00:06:53.406 INFO: JSON configuration test init 00:06:53.406 22:35:38 json_config -- json_config/json_config.sh@357 -- # json_config_test_init 00:06:53.406 22:35:38 json_config -- json_config/json_config.sh@262 -- # timing_enter json_config_test_init 00:06:53.406 22:35:38 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:53.406 22:35:38 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:53.406 22:35:38 json_config -- json_config/json_config.sh@263 -- # timing_enter json_config_setup_target 00:06:53.406 22:35:38 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:53.406 22:35:38 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:53.406 22:35:38 json_config -- json_config/json_config.sh@265 -- # json_config_test_start_app target --wait-for-rpc 00:06:53.406 22:35:38 json_config -- json_config/common.sh@9 -- # local app=target 00:06:53.406 22:35:38 json_config -- json_config/common.sh@10 -- # shift 00:06:53.406 22:35:38 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:53.406 22:35:38 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:53.406 22:35:38 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:06:53.406 22:35:38 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:53.406 22:35:38 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 
00:06:53.406 22:35:38 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=2658996 00:06:53.406 22:35:38 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:53.406 Waiting for target to run... 00:06:53.406 22:35:38 json_config -- json_config/common.sh@25 -- # waitforlisten 2658996 /var/tmp/spdk_tgt.sock 00:06:53.406 22:35:38 json_config -- common/autotest_common.sh@829 -- # '[' -z 2658996 ']' 00:06:53.406 22:35:38 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:53.406 22:35:38 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:53.406 22:35:38 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:06:53.406 22:35:38 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:53.406 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:53.406 22:35:38 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:53.406 22:35:38 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:53.406 [2024-07-15 22:35:38.292575] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:06:53.406 [2024-07-15 22:35:38.292649] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2658996 ] 00:06:53.975 [2024-07-15 22:35:38.665369] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.975 [2024-07-15 22:35:38.756114] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.544 22:35:39 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:54.544 22:35:39 json_config -- common/autotest_common.sh@862 -- # return 0 00:06:54.544 22:35:39 json_config -- json_config/common.sh@26 -- # echo '' 00:06:54.544 00:06:54.544 22:35:39 json_config -- json_config/json_config.sh@269 -- # create_accel_config 00:06:54.544 22:35:39 json_config -- json_config/json_config.sh@93 -- # timing_enter create_accel_config 00:06:54.544 22:35:39 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:54.544 22:35:39 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:54.544 22:35:39 json_config -- json_config/json_config.sh@95 -- # [[ 1 -eq 1 ]] 00:06:54.544 22:35:39 json_config -- json_config/json_config.sh@96 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:06:54.544 22:35:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:06:54.803 22:35:39 json_config -- json_config/json_config.sh@97 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:54.803 22:35:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:06:54.803 [2024-07-15 22:35:39.694998] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module 
dpdk_cryptodev 00:06:54.803 22:35:39 json_config -- json_config/json_config.sh@98 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:54.803 22:35:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:06:55.062 [2024-07-15 22:35:39.939616] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:06:55.062 22:35:39 json_config -- json_config/json_config.sh@101 -- # timing_exit create_accel_config 00:06:55.062 22:35:39 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:55.062 22:35:39 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:55.321 22:35:39 json_config -- json_config/json_config.sh@273 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:06:55.321 22:35:39 json_config -- json_config/json_config.sh@274 -- # tgt_rpc load_config 00:06:55.321 22:35:40 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:06:55.580 [2024-07-15 22:35:40.269135] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:06:58.115 22:35:42 json_config -- json_config/json_config.sh@276 -- # tgt_check_notification_types 00:06:58.116 22:35:42 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:06:58.116 22:35:42 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:58.116 22:35:42 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:58.116 22:35:42 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:06:58.116 22:35:42 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:06:58.116 22:35:42 json_config -- json_config/json_config.sh@46 -- # local enabled_types 
00:06:58.116 22:35:42 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:06:58.116 22:35:42 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:06:58.116 22:35:42 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:06:58.374 22:35:43 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:06:58.374 22:35:43 json_config -- json_config/json_config.sh@48 -- # local get_types 00:06:58.374 22:35:43 json_config -- json_config/json_config.sh@49 -- # [[ bdev_register bdev_unregister != \b\d\e\v\_\r\e\g\i\s\t\e\r\ \b\d\e\v\_\u\n\r\e\g\i\s\t\e\r ]] 00:06:58.374 22:35:43 json_config -- json_config/json_config.sh@54 -- # timing_exit tgt_check_notification_types 00:06:58.374 22:35:43 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:58.374 22:35:43 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:58.374 22:35:43 json_config -- json_config/json_config.sh@55 -- # return 0 00:06:58.374 22:35:43 json_config -- json_config/json_config.sh@278 -- # [[ 1 -eq 1 ]] 00:06:58.374 22:35:43 json_config -- json_config/json_config.sh@279 -- # create_bdev_subsystem_config 00:06:58.374 22:35:43 json_config -- json_config/json_config.sh@105 -- # timing_enter create_bdev_subsystem_config 00:06:58.374 22:35:43 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:58.374 22:35:43 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:58.375 22:35:43 json_config -- json_config/json_config.sh@107 -- # expected_notifications=() 00:06:58.375 22:35:43 json_config -- json_config/json_config.sh@107 -- # local expected_notifications 00:06:58.375 22:35:43 json_config -- json_config/json_config.sh@111 -- # expected_notifications+=($(get_notifications)) 00:06:58.375 22:35:43 json_config -- json_config/json_config.sh@111 -- # get_notifications 00:06:58.375 22:35:43 
json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:06:58.375 22:35:43 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:58.375 22:35:43 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:58.375 22:35:43 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:06:58.375 22:35:43 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:06:58.375 22:35:43 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:06:58.633 22:35:43 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:06:58.633 22:35:43 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:06:58.633 22:35:43 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:06:58.633 22:35:43 json_config -- json_config/json_config.sh@113 -- # [[ 1 -eq 1 ]] 00:06:58.633 22:35:43 json_config -- json_config/json_config.sh@114 -- # local lvol_store_base_bdev=Nvme0n1 00:06:58.633 22:35:43 json_config -- json_config/json_config.sh@116 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:06:58.633 22:35:43 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:06:58.891 Nvme0n1p0 Nvme0n1p1 00:06:58.891 22:35:43 json_config -- json_config/json_config.sh@117 -- # tgt_rpc bdev_split_create Malloc0 3 00:06:58.891 22:35:43 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:06:59.150 [2024-07-15 22:35:43.926841] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:06:59.150 [2024-07-15 22:35:43.926896] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find 
bdev with name: Malloc0 00:06:59.150 00:06:59.150 22:35:43 json_config -- json_config/json_config.sh@118 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:06:59.150 22:35:43 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:06:59.421 Malloc3 00:06:59.421 22:35:44 json_config -- json_config/json_config.sh@119 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:59.421 22:35:44 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:06:59.731 [2024-07-15 22:35:44.420241] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:59.731 [2024-07-15 22:35:44.420290] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:59.731 [2024-07-15 22:35:44.420317] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2727a00 00:06:59.731 [2024-07-15 22:35:44.420331] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:59.731 [2024-07-15 22:35:44.421965] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:59.731 [2024-07-15 22:35:44.421993] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:06:59.731 PTBdevFromMalloc3 00:06:59.731 22:35:44 json_config -- json_config/json_config.sh@121 -- # tgt_rpc bdev_null_create Null0 32 512 00:06:59.731 22:35:44 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:06:59.990 Null0 00:06:59.990 22:35:44 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:06:59.990 22:35:44 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:07:00.248 Malloc0 00:07:00.248 22:35:44 json_config -- json_config/json_config.sh@124 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:07:00.249 22:35:44 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:07:00.507 Malloc1 00:07:00.507 22:35:45 json_config -- json_config/json_config.sh@137 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:07:00.507 22:35:45 json_config -- json_config/json_config.sh@140 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:07:00.765 102400+0 records in 00:07:00.765 102400+0 records out 00:07:00.765 104857600 bytes (105 MB, 100 MiB) copied, 0.310652 s, 338 MB/s 00:07:00.765 22:35:45 json_config -- json_config/json_config.sh@141 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:07:00.765 22:35:45 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:07:01.023 aio_disk 00:07:01.023 22:35:45 json_config -- json_config/json_config.sh@142 -- # expected_notifications+=(bdev_register:aio_disk) 00:07:01.023 22:35:45 json_config -- json_config/json_config.sh@147 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:07:01.023 22:35:45 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:07:06.292 b009ea94-64d1-4fbf-bcb2-e4162c1c8e0b 
00:07:06.292 22:35:50 json_config -- json_config/json_config.sh@154 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:07:06.292 22:35:50 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:07:06.292 22:35:50 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:07:06.292 22:35:50 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:07:06.292 22:35:50 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:07:06.551 22:35:51 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:07:06.551 22:35:51 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:07:06.809 22:35:51 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:07:06.809 22:35:51 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:07:07.068 22:35:51 json_config -- json_config/json_config.sh@157 -- # [[ 1 -eq 1 ]] 00:07:07.068 22:35:51 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:07:07.068 22:35:51 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:07:07.327 MallocForCryptoBdev 00:07:07.327 22:35:52 json_config -- json_config/json_config.sh@159 -- # lspci -d:37c8 00:07:07.327 22:35:52 json_config -- json_config/json_config.sh@159 -- # wc -l 00:07:07.327 22:35:52 json_config -- json_config/json_config.sh@159 -- # [[ 3 -eq 0 ]] 00:07:07.327 22:35:52 json_config -- json_config/json_config.sh@162 -- # local crypto_driver=crypto_qat 00:07:07.327 22:35:52 json_config -- json_config/json_config.sh@165 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:07:07.327 22:35:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:07:07.585 [2024-07-15 22:35:52.262826] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:07:07.586 CryptoMallocBdev 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@169 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@172 -- # [[ 0 -eq 1 ]] 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@178 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:3ce40b14-4caa-4ab1-9a68-b06dadd55d20 bdev_register:6aec6337-ed6e-4c14-9c50-c081ee77072d bdev_register:550ccbcc-e2e1-44fa-8719-74046d040fee bdev_register:86455596-4c47-4308-b256-77cf40212fcc 
bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@67 -- # local events_to_check 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@68 -- # local recorded_events 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@71 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@71 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:3ce40b14-4caa-4ab1-9a68-b06dadd55d20 bdev_register:6aec6337-ed6e-4c14-9c50-c081ee77072d bdev_register:550ccbcc-e2e1-44fa-8719-74046d040fee bdev_register:86455596-4c47-4308-b256-77cf40212fcc bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@71 -- # sort 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@72 -- # recorded_events=($(get_notifications | sort)) 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@72 -- # get_notifications 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@72 -- # sort 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:07:07.586 22:35:52 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p1 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p0 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc3 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:PTBdevFromMalloc3 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Null0 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 
00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p2 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:07.586 22:35:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p1 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p0 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc1 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aio_disk 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:3ce40b14-4caa-4ab1-9a68-b06dadd55d20 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:6aec6337-ed6e-4c14-9c50-c081ee77072d 00:07:07.844 22:35:52 json_config -- 
json_config/json_config.sh@61 -- # IFS=: 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:550ccbcc-e2e1-44fa-8719-74046d040fee 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:86455596-4c47-4308-b256-77cf40212fcc 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:MallocForCryptoBdev 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:CryptoMallocBdev 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@74 -- # [[ bdev_register:3ce40b14-4caa-4ab1-9a68-b06dadd55d20 bdev_register:550ccbcc-e2e1-44fa-8719-74046d040fee bdev_register:6aec6337-ed6e-4c14-9c50-c081ee77072d bdev_register:86455596-4c47-4308-b256-77cf40212fcc bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != 
\b\d\e\v\_\r\e\g\i\s\t\e\r\:\3\c\e\4\0\b\1\4\-\4\c\a\a\-\4\a\b\1\-\9\a\6\8\-\b\0\6\d\a\d\d\5\5\d\2\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\5\5\0\c\c\b\c\c\-\e\2\e\1\-\4\4\f\a\-\8\7\1\9\-\7\4\0\4\6\d\0\4\0\f\e\e\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\6\a\e\c\6\3\3\7\-\e\d\6\e\-\4\c\1\4\-\9\c\5\0\-\c\0\8\1\e\e\7\7\0\7\2\d\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\8\6\4\5\5\5\9\6\-\4\c\4\7\-\4\3\0\8\-\b\2\5\6\-\7\7\c\f\4\0\2\1\2\f\c\c\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:07:07.844 22:35:52 json_config -- json_config/json_config.sh@86 -- # cat 00:07:07.845 22:35:52 json_config -- json_config/json_config.sh@86 -- # printf ' %s\n' bdev_register:3ce40b14-4caa-4ab1-9a68-b06dadd55d20 bdev_register:550ccbcc-e2e1-44fa-8719-74046d040fee bdev_register:6aec6337-ed6e-4c14-9c50-c081ee77072d bdev_register:86455596-4c47-4308-b256-77cf40212fcc bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:07:07.845 Expected events matched: 00:07:07.845 bdev_register:3ce40b14-4caa-4ab1-9a68-b06dadd55d20 00:07:07.845 bdev_register:550ccbcc-e2e1-44fa-8719-74046d040fee 00:07:07.845 
bdev_register:6aec6337-ed6e-4c14-9c50-c081ee77072d 00:07:07.845 bdev_register:86455596-4c47-4308-b256-77cf40212fcc 00:07:07.845 bdev_register:aio_disk 00:07:07.845 bdev_register:CryptoMallocBdev 00:07:07.845 bdev_register:Malloc0 00:07:07.845 bdev_register:Malloc0p0 00:07:07.845 bdev_register:Malloc0p1 00:07:07.845 bdev_register:Malloc0p2 00:07:07.845 bdev_register:Malloc1 00:07:07.845 bdev_register:Malloc3 00:07:07.845 bdev_register:MallocForCryptoBdev 00:07:07.845 bdev_register:Null0 00:07:07.845 bdev_register:Nvme0n1 00:07:07.845 bdev_register:Nvme0n1p0 00:07:07.845 bdev_register:Nvme0n1p1 00:07:07.845 bdev_register:PTBdevFromMalloc3 00:07:07.845 22:35:52 json_config -- json_config/json_config.sh@180 -- # timing_exit create_bdev_subsystem_config 00:07:07.845 22:35:52 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:07.845 22:35:52 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:07.845 22:35:52 json_config -- json_config/json_config.sh@282 -- # [[ 0 -eq 1 ]] 00:07:07.845 22:35:52 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:07:07.845 22:35:52 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:07:07.845 22:35:52 json_config -- json_config/json_config.sh@293 -- # timing_exit json_config_setup_target 00:07:07.845 22:35:52 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:07.845 22:35:52 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:07.845 22:35:52 json_config -- json_config/json_config.sh@295 -- # [[ 0 -eq 1 ]] 00:07:07.845 22:35:52 json_config -- json_config/json_config.sh@300 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:07:07.845 22:35:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:07:08.102 MallocBdevForConfigChangeCheck 00:07:08.103 22:35:52 json_config -- 
json_config/json_config.sh@302 -- # timing_exit json_config_test_init 00:07:08.103 22:35:52 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:08.103 22:35:52 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:08.103 22:35:52 json_config -- json_config/json_config.sh@359 -- # tgt_rpc save_config 00:07:08.103 22:35:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:08.361 22:35:53 json_config -- json_config/json_config.sh@361 -- # echo 'INFO: shutting down applications...' 00:07:08.361 INFO: shutting down applications... 00:07:08.361 22:35:53 json_config -- json_config/json_config.sh@362 -- # [[ 0 -eq 1 ]] 00:07:08.361 22:35:53 json_config -- json_config/json_config.sh@368 -- # json_config_clear target 00:07:08.361 22:35:53 json_config -- json_config/json_config.sh@332 -- # [[ -n 22 ]] 00:07:08.361 22:35:53 json_config -- json_config/json_config.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:07:08.621 [2024-07-15 22:35:53.394372] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:07:11.907 Calling clear_iscsi_subsystem 00:07:11.907 Calling clear_nvmf_subsystem 00:07:11.907 Calling clear_nbd_subsystem 00:07:11.907 Calling clear_ublk_subsystem 00:07:11.907 Calling clear_vhost_blk_subsystem 00:07:11.907 Calling clear_vhost_scsi_subsystem 00:07:11.907 Calling clear_bdev_subsystem 00:07:11.907 22:35:56 json_config -- json_config/json_config.sh@337 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:07:11.907 22:35:56 json_config -- json_config/json_config.sh@343 -- # count=100 00:07:11.907 22:35:56 json_config -- json_config/json_config.sh@344 -- # '[' 100 -gt 0 ']' 00:07:11.907 22:35:56 json_config -- json_config/json_config.sh@345 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:11.907 22:35:56 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:07:11.907 22:35:56 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:07:11.907 22:35:56 json_config -- json_config/json_config.sh@345 -- # break 00:07:11.907 22:35:56 json_config -- json_config/json_config.sh@350 -- # '[' 100 -eq 0 ']' 00:07:11.907 22:35:56 json_config -- json_config/json_config.sh@369 -- # json_config_test_shutdown_app target 00:07:11.907 22:35:56 json_config -- json_config/common.sh@31 -- # local app=target 00:07:11.907 22:35:56 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:07:11.907 22:35:56 json_config -- json_config/common.sh@35 -- # [[ -n 2658996 ]] 00:07:11.907 22:35:56 json_config -- json_config/common.sh@38 -- # kill -SIGINT 2658996 00:07:11.907 22:35:56 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:07:11.907 22:35:56 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:11.907 22:35:56 json_config -- json_config/common.sh@41 -- # kill -0 2658996 00:07:11.907 22:35:56 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:07:12.474 22:35:57 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:07:12.474 22:35:57 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:12.474 22:35:57 json_config -- json_config/common.sh@41 -- # kill -0 2658996 00:07:12.474 22:35:57 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:07:12.474 22:35:57 json_config -- json_config/common.sh@43 -- # break 00:07:12.474 22:35:57 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:07:12.474 22:35:57 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:07:12.474 SPDK target 
shutdown done 00:07:12.474 22:35:57 json_config -- json_config/json_config.sh@371 -- # echo 'INFO: relaunching applications...' 00:07:12.474 INFO: relaunching applications... 00:07:12.474 22:35:57 json_config -- json_config/json_config.sh@372 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:12.474 22:35:57 json_config -- json_config/common.sh@9 -- # local app=target 00:07:12.474 22:35:57 json_config -- json_config/common.sh@10 -- # shift 00:07:12.474 22:35:57 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:07:12.474 22:35:57 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:07:12.474 22:35:57 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:07:12.474 22:35:57 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:12.474 22:35:57 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:12.474 22:35:57 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=2661649 00:07:12.474 22:35:57 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:07:12.474 Waiting for target to run... 00:07:12.474 22:35:57 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:12.474 22:35:57 json_config -- json_config/common.sh@25 -- # waitforlisten 2661649 /var/tmp/spdk_tgt.sock 00:07:12.474 22:35:57 json_config -- common/autotest_common.sh@829 -- # '[' -z 2661649 ']' 00:07:12.474 22:35:57 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:12.474 22:35:57 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:12.474 22:35:57 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 
00:07:12.474 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:07:12.474 22:35:57 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:12.474 22:35:57 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:12.474 [2024-07-15 22:35:57.321455] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:07:12.474 [2024-07-15 22:35:57.321537] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2661649 ] 00:07:13.407 [2024-07-15 22:35:57.969817] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.407 [2024-07-15 22:35:58.079319] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.408 [2024-07-15 22:35:58.133516] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:07:13.408 [2024-07-15 22:35:58.141553] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:07:13.408 [2024-07-15 22:35:58.149571] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:07:13.408 [2024-07-15 22:35:58.230784] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:07:15.942 [2024-07-15 22:36:00.434857] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:15.942 [2024-07-15 22:36:00.434936] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:15.942 [2024-07-15 22:36:00.434953] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:15.942 [2024-07-15 22:36:00.442874] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: 
Nvme0n1 00:07:15.942 [2024-07-15 22:36:00.442901] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:07:15.942 [2024-07-15 22:36:00.450889] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:07:15.942 [2024-07-15 22:36:00.450912] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:07:15.942 [2024-07-15 22:36:00.458947] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:07:15.942 [2024-07-15 22:36:00.458976] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:07:15.942 [2024-07-15 22:36:00.458989] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:15.942 [2024-07-15 22:36:00.832903] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:15.942 [2024-07-15 22:36:00.832958] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:15.942 [2024-07-15 22:36:00.832976] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x254cb90 00:07:15.942 [2024-07-15 22:36:00.832989] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:15.942 [2024-07-15 22:36:00.833275] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:15.942 [2024-07-15 22:36:00.833294] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:07:16.201 22:36:00 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:16.201 22:36:00 json_config -- common/autotest_common.sh@862 -- # return 0 00:07:16.201 22:36:00 json_config -- json_config/common.sh@26 -- # echo '' 00:07:16.201 00:07:16.202 22:36:00 json_config -- json_config/json_config.sh@373 -- # [[ 0 -eq 1 ]] 00:07:16.202 22:36:00 json_config -- json_config/json_config.sh@377 -- # echo 'INFO: Checking if target 
configuration is the same...' 00:07:16.202 INFO: Checking if target configuration is the same... 00:07:16.202 22:36:00 json_config -- json_config/json_config.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:16.202 22:36:00 json_config -- json_config/json_config.sh@378 -- # tgt_rpc save_config 00:07:16.202 22:36:00 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:16.202 + '[' 2 -ne 2 ']' 00:07:16.202 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:07:16.202 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:07:16.202 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:16.202 +++ basename /dev/fd/62 00:07:16.202 ++ mktemp /tmp/62.XXX 00:07:16.202 + tmp_file_1=/tmp/62.MgL 00:07:16.202 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:16.202 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:07:16.202 + tmp_file_2=/tmp/spdk_tgt_config.json.faI 00:07:16.202 + ret=0 00:07:16.202 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:16.461 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:16.719 + diff -u /tmp/62.MgL /tmp/spdk_tgt_config.json.faI 00:07:16.719 + echo 'INFO: JSON config files are the same' 00:07:16.719 INFO: JSON config files are the same 00:07:16.719 + rm /tmp/62.MgL /tmp/spdk_tgt_config.json.faI 00:07:16.719 + exit 0 00:07:16.719 22:36:01 json_config -- json_config/json_config.sh@379 -- # [[ 0 -eq 1 ]] 00:07:16.719 22:36:01 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: changing configuration and checking if this can be detected...' 
00:07:16.719 INFO: changing configuration and checking if this can be detected... 00:07:16.719 22:36:01 json_config -- json_config/json_config.sh@386 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:07:16.719 22:36:01 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:07:16.977 22:36:01 json_config -- json_config/json_config.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:16.977 22:36:01 json_config -- json_config/json_config.sh@387 -- # tgt_rpc save_config 00:07:16.977 22:36:01 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:16.977 + '[' 2 -ne 2 ']' 00:07:16.977 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:07:16.977 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
00:07:16.977 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:16.977 +++ basename /dev/fd/62 00:07:16.977 ++ mktemp /tmp/62.XXX 00:07:16.977 + tmp_file_1=/tmp/62.JBO 00:07:16.977 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:16.977 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:07:16.977 + tmp_file_2=/tmp/spdk_tgt_config.json.wgy 00:07:16.977 + ret=0 00:07:16.977 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:17.234 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:17.491 + diff -u /tmp/62.JBO /tmp/spdk_tgt_config.json.wgy 00:07:17.491 + ret=1 00:07:17.491 + echo '=== Start of file: /tmp/62.JBO ===' 00:07:17.491 + cat /tmp/62.JBO 00:07:17.491 + echo '=== End of file: /tmp/62.JBO ===' 00:07:17.491 + echo '' 00:07:17.491 + echo '=== Start of file: /tmp/spdk_tgt_config.json.wgy ===' 00:07:17.491 + cat /tmp/spdk_tgt_config.json.wgy 00:07:17.491 + echo '=== End of file: /tmp/spdk_tgt_config.json.wgy ===' 00:07:17.491 + echo '' 00:07:17.491 + rm /tmp/62.JBO /tmp/spdk_tgt_config.json.wgy 00:07:17.491 + exit 1 00:07:17.491 22:36:02 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: configuration change detected.' 00:07:17.491 INFO: configuration change detected. 
00:07:17.491 22:36:02 json_config -- json_config/json_config.sh@394 -- # json_config_test_fini 00:07:17.491 22:36:02 json_config -- json_config/json_config.sh@306 -- # timing_enter json_config_test_fini 00:07:17.491 22:36:02 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:17.491 22:36:02 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:17.491 22:36:02 json_config -- json_config/json_config.sh@307 -- # local ret=0 00:07:17.491 22:36:02 json_config -- json_config/json_config.sh@309 -- # [[ -n '' ]] 00:07:17.491 22:36:02 json_config -- json_config/json_config.sh@317 -- # [[ -n 2661649 ]] 00:07:17.491 22:36:02 json_config -- json_config/json_config.sh@320 -- # cleanup_bdev_subsystem_config 00:07:17.491 22:36:02 json_config -- json_config/json_config.sh@184 -- # timing_enter cleanup_bdev_subsystem_config 00:07:17.491 22:36:02 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:17.491 22:36:02 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:17.491 22:36:02 json_config -- json_config/json_config.sh@186 -- # [[ 1 -eq 1 ]] 00:07:17.491 22:36:02 json_config -- json_config/json_config.sh@187 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:07:17.491 22:36:02 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:07:17.749 22:36:02 json_config -- json_config/json_config.sh@188 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:07:17.749 22:36:02 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:07:18.007 22:36:02 json_config -- json_config/json_config.sh@189 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:07:18.007 22:36:02 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete 
lvs_test/snapshot0 00:07:18.576 22:36:03 json_config -- json_config/json_config.sh@190 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:07:18.576 22:36:03 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:07:18.576 22:36:03 json_config -- json_config/json_config.sh@193 -- # uname -s 00:07:18.576 22:36:03 json_config -- json_config/json_config.sh@193 -- # [[ Linux = Linux ]] 00:07:18.576 22:36:03 json_config -- json_config/json_config.sh@194 -- # rm -f /sample_aio 00:07:18.576 22:36:03 json_config -- json_config/json_config.sh@197 -- # [[ 0 -eq 1 ]] 00:07:18.576 22:36:03 json_config -- json_config/json_config.sh@201 -- # timing_exit cleanup_bdev_subsystem_config 00:07:18.576 22:36:03 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:18.576 22:36:03 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:18.835 22:36:03 json_config -- json_config/json_config.sh@323 -- # killprocess 2661649 00:07:18.835 22:36:03 json_config -- common/autotest_common.sh@948 -- # '[' -z 2661649 ']' 00:07:18.835 22:36:03 json_config -- common/autotest_common.sh@952 -- # kill -0 2661649 00:07:18.835 22:36:03 json_config -- common/autotest_common.sh@953 -- # uname 00:07:18.835 22:36:03 json_config -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:18.835 22:36:03 json_config -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2661649 00:07:18.835 22:36:03 json_config -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:18.835 22:36:03 json_config -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:18.835 22:36:03 json_config -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2661649' 00:07:18.835 killing process with pid 2661649 00:07:18.835 22:36:03 json_config -- common/autotest_common.sh@967 -- # kill 2661649 00:07:18.835 22:36:03 json_config -- 
common/autotest_common.sh@972 -- # wait 2661649 00:07:22.120 22:36:06 json_config -- json_config/json_config.sh@326 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:22.120 22:36:06 json_config -- json_config/json_config.sh@327 -- # timing_exit json_config_test_fini 00:07:22.120 22:36:06 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:22.120 22:36:06 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:22.120 22:36:06 json_config -- json_config/json_config.sh@328 -- # return 0 00:07:22.120 22:36:06 json_config -- json_config/json_config.sh@396 -- # echo 'INFO: Success' 00:07:22.120 INFO: Success 00:07:22.120 00:07:22.120 real 0m28.736s 00:07:22.120 user 0m35.370s 00:07:22.120 sys 0m4.069s 00:07:22.120 22:36:06 json_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:22.120 22:36:06 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:22.120 ************************************ 00:07:22.120 END TEST json_config 00:07:22.120 ************************************ 00:07:22.120 22:36:06 -- common/autotest_common.sh@1142 -- # return 0 00:07:22.120 22:36:06 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:22.120 22:36:06 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:22.120 22:36:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:22.120 22:36:06 -- common/autotest_common.sh@10 -- # set +x 00:07:22.120 ************************************ 00:07:22.120 START TEST json_config_extra_key 00:07:22.120 ************************************ 00:07:22.120 22:36:06 json_config_extra_key -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:22.120 22:36:06 json_config_extra_key -- 
json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:07:22.120 22:36:06 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:07:22.120 22:36:06 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:22.120 22:36:06 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:22.120 22:36:06 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:22.120 22:36:06 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:22.120 22:36:06 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:22.120 22:36:06 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:22.120 22:36:06 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:22.120 22:36:06 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:22.120 22:36:06 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:22.120 22:36:07 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:22.120 22:36:07 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:07:22.120 22:36:07 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:07:22.120 22:36:07 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:22.120 22:36:07 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:22.120 22:36:07 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:22.120 22:36:07 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:22.120 22:36:07 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:07:22.120 22:36:07 
json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:22.120 22:36:07 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:22.120 22:36:07 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:22.120 22:36:07 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:22.120 22:36:07 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:22.120 22:36:07 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:22.120 22:36:07 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:07:22.120 22:36:07 json_config_extra_key -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:22.120 22:36:07 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:07:22.120 22:36:07 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:07:22.120 22:36:07 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:07:22.120 22:36:07 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:07:22.120 22:36:07 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:22.120 22:36:07 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:22.120 22:36:07 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:07:22.120 22:36:07 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:07:22.120 22:36:07 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:07:22.120 22:36:07 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:07:22.120 22:36:07 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:07:22.120 22:36:07 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:07:22.120 22:36:07 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:07:22.120 22:36:07 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:07:22.120 22:36:07 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 
1024') 00:07:22.120 22:36:07 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:07:22.120 22:36:07 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:07:22.120 22:36:07 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:07:22.120 22:36:07 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:07:22.120 22:36:07 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:07:22.120 INFO: launching applications... 00:07:22.120 22:36:07 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:07:22.120 22:36:07 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:07:22.120 22:36:07 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:07:22.120 22:36:07 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:07:22.120 22:36:07 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:07:22.120 22:36:07 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:07:22.120 22:36:07 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:22.120 22:36:07 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:22.120 22:36:07 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=2663612 00:07:22.120 22:36:07 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 
00:07:22.120 22:36:07 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:07:22.120 Waiting for target to run... 00:07:22.120 22:36:07 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 2663612 /var/tmp/spdk_tgt.sock 00:07:22.120 22:36:07 json_config_extra_key -- common/autotest_common.sh@829 -- # '[' -z 2663612 ']' 00:07:22.121 22:36:07 json_config_extra_key -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:22.121 22:36:07 json_config_extra_key -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:22.121 22:36:07 json_config_extra_key -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:07:22.121 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:07:22.121 22:36:07 json_config_extra_key -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:22.121 22:36:07 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:22.380 [2024-07-15 22:36:07.096108] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:07:22.380 [2024-07-15 22:36:07.096188] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2663612 ] 00:07:22.989 [2024-07-15 22:36:07.706209] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:22.989 [2024-07-15 22:36:07.817676] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.281 22:36:08 json_config_extra_key -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:23.281 22:36:08 json_config_extra_key -- common/autotest_common.sh@862 -- # return 0 00:07:23.281 22:36:08 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:07:23.281 00:07:23.281 22:36:08 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:07:23.281 INFO: shutting down applications... 00:07:23.281 22:36:08 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:07:23.281 22:36:08 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:07:23.281 22:36:08 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:07:23.281 22:36:08 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 2663612 ]] 00:07:23.281 22:36:08 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 2663612 00:07:23.281 22:36:08 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:07:23.281 22:36:08 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:23.281 22:36:08 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 2663612 00:07:23.281 22:36:08 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:07:23.850 22:36:08 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:07:23.850 22:36:08 json_config_extra_key -- 
json_config/common.sh@40 -- # (( i < 30 )) 00:07:23.850 22:36:08 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 2663612 00:07:23.850 22:36:08 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:07:23.850 22:36:08 json_config_extra_key -- json_config/common.sh@43 -- # break 00:07:23.850 22:36:08 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:07:23.850 22:36:08 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:07:23.850 SPDK target shutdown done 00:07:23.850 22:36:08 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:07:23.850 Success 00:07:23.850 00:07:23.850 real 0m1.672s 00:07:23.850 user 0m1.162s 00:07:23.850 sys 0m0.754s 00:07:23.850 22:36:08 json_config_extra_key -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:23.850 22:36:08 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:23.850 ************************************ 00:07:23.850 END TEST json_config_extra_key 00:07:23.850 ************************************ 00:07:23.850 22:36:08 -- common/autotest_common.sh@1142 -- # return 0 00:07:23.850 22:36:08 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:23.850 22:36:08 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:23.850 22:36:08 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:23.850 22:36:08 -- common/autotest_common.sh@10 -- # set +x 00:07:23.850 ************************************ 00:07:23.850 START TEST alias_rpc 00:07:23.850 ************************************ 00:07:23.850 22:36:08 alias_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:24.110 * Looking for test storage... 
00:07:24.110 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:07:24.110 22:36:08 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:24.110 22:36:08 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=2663846 00:07:24.110 22:36:08 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 2663846 00:07:24.110 22:36:08 alias_rpc -- common/autotest_common.sh@829 -- # '[' -z 2663846 ']' 00:07:24.110 22:36:08 alias_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:24.110 22:36:08 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:24.110 22:36:08 alias_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:24.110 22:36:08 alias_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:24.110 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:24.110 22:36:08 alias_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:24.111 22:36:08 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:24.111 [2024-07-15 22:36:08.913854] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:07:24.111 [2024-07-15 22:36:08.914008] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2663846 ] 00:07:24.370 [2024-07-15 22:36:09.109978] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:24.370 [2024-07-15 22:36:09.215518] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.630 22:36:09 alias_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:24.630 22:36:09 alias_rpc -- common/autotest_common.sh@862 -- # return 0 00:07:24.630 22:36:09 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:07:25.197 22:36:10 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 2663846 00:07:25.197 22:36:10 alias_rpc -- common/autotest_common.sh@948 -- # '[' -z 2663846 ']' 00:07:25.197 22:36:10 alias_rpc -- common/autotest_common.sh@952 -- # kill -0 2663846 00:07:25.197 22:36:10 alias_rpc -- common/autotest_common.sh@953 -- # uname 00:07:25.197 22:36:10 alias_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:25.198 22:36:10 alias_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2663846 00:07:25.198 22:36:10 alias_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:25.198 22:36:10 alias_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:25.198 22:36:10 alias_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2663846' 00:07:25.198 killing process with pid 2663846 00:07:25.198 22:36:10 alias_rpc -- common/autotest_common.sh@967 -- # kill 2663846 00:07:25.198 22:36:10 alias_rpc -- common/autotest_common.sh@972 -- # wait 2663846 00:07:25.767 00:07:25.767 real 0m1.780s 00:07:25.767 user 0m2.169s 00:07:25.767 sys 0m0.643s 00:07:25.767 22:36:10 alias_rpc -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:07:25.767 22:36:10 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:25.767 ************************************ 00:07:25.767 END TEST alias_rpc 00:07:25.767 ************************************ 00:07:25.767 22:36:10 -- common/autotest_common.sh@1142 -- # return 0 00:07:25.767 22:36:10 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:07:25.767 22:36:10 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:25.767 22:36:10 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:25.767 22:36:10 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:25.767 22:36:10 -- common/autotest_common.sh@10 -- # set +x 00:07:25.767 ************************************ 00:07:25.767 START TEST spdkcli_tcp 00:07:25.767 ************************************ 00:07:25.767 22:36:10 spdkcli_tcp -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:25.767 * Looking for test storage... 
00:07:25.767 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:07:25.767 22:36:10 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:07:25.767 22:36:10 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:07:25.767 22:36:10 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:07:25.767 22:36:10 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:07:25.767 22:36:10 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:07:25.767 22:36:10 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:07:25.767 22:36:10 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:07:25.767 22:36:10 spdkcli_tcp -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:25.767 22:36:10 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:25.767 22:36:10 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=2664148 00:07:25.767 22:36:10 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 2664148 00:07:25.767 22:36:10 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:07:25.767 22:36:10 spdkcli_tcp -- common/autotest_common.sh@829 -- # '[' -z 2664148 ']' 00:07:25.767 22:36:10 spdkcli_tcp -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:25.767 22:36:10 spdkcli_tcp -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:25.767 22:36:10 spdkcli_tcp -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:25.767 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:25.767 22:36:10 spdkcli_tcp -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:25.767 22:36:10 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:26.026 [2024-07-15 22:36:10.725778] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:07:26.026 [2024-07-15 22:36:10.725857] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2664148 ] 00:07:26.026 [2024-07-15 22:36:10.853032] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:26.285 [2024-07-15 22:36:10.964282] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:26.285 [2024-07-15 22:36:10.964287] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.852 22:36:11 spdkcli_tcp -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:26.852 22:36:11 spdkcli_tcp -- common/autotest_common.sh@862 -- # return 0 00:07:26.852 22:36:11 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=2664260 00:07:26.852 22:36:11 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:07:26.853 22:36:11 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:07:27.112 [ 00:07:27.112 "bdev_malloc_delete", 00:07:27.112 "bdev_malloc_create", 00:07:27.112 "bdev_null_resize", 00:07:27.112 "bdev_null_delete", 00:07:27.112 "bdev_null_create", 00:07:27.112 "bdev_nvme_cuse_unregister", 00:07:27.112 "bdev_nvme_cuse_register", 00:07:27.112 "bdev_opal_new_user", 00:07:27.112 "bdev_opal_set_lock_state", 00:07:27.112 "bdev_opal_delete", 00:07:27.112 "bdev_opal_get_info", 00:07:27.112 "bdev_opal_create", 00:07:27.112 "bdev_nvme_opal_revert", 00:07:27.112 "bdev_nvme_opal_init", 00:07:27.112 "bdev_nvme_send_cmd", 00:07:27.112 
"bdev_nvme_get_path_iostat", 00:07:27.112 "bdev_nvme_get_mdns_discovery_info", 00:07:27.112 "bdev_nvme_stop_mdns_discovery", 00:07:27.112 "bdev_nvme_start_mdns_discovery", 00:07:27.112 "bdev_nvme_set_multipath_policy", 00:07:27.112 "bdev_nvme_set_preferred_path", 00:07:27.112 "bdev_nvme_get_io_paths", 00:07:27.112 "bdev_nvme_remove_error_injection", 00:07:27.112 "bdev_nvme_add_error_injection", 00:07:27.112 "bdev_nvme_get_discovery_info", 00:07:27.112 "bdev_nvme_stop_discovery", 00:07:27.112 "bdev_nvme_start_discovery", 00:07:27.112 "bdev_nvme_get_controller_health_info", 00:07:27.112 "bdev_nvme_disable_controller", 00:07:27.112 "bdev_nvme_enable_controller", 00:07:27.112 "bdev_nvme_reset_controller", 00:07:27.112 "bdev_nvme_get_transport_statistics", 00:07:27.112 "bdev_nvme_apply_firmware", 00:07:27.112 "bdev_nvme_detach_controller", 00:07:27.112 "bdev_nvme_get_controllers", 00:07:27.112 "bdev_nvme_attach_controller", 00:07:27.112 "bdev_nvme_set_hotplug", 00:07:27.112 "bdev_nvme_set_options", 00:07:27.112 "bdev_passthru_delete", 00:07:27.112 "bdev_passthru_create", 00:07:27.112 "bdev_lvol_set_parent_bdev", 00:07:27.112 "bdev_lvol_set_parent", 00:07:27.112 "bdev_lvol_check_shallow_copy", 00:07:27.112 "bdev_lvol_start_shallow_copy", 00:07:27.112 "bdev_lvol_grow_lvstore", 00:07:27.112 "bdev_lvol_get_lvols", 00:07:27.112 "bdev_lvol_get_lvstores", 00:07:27.112 "bdev_lvol_delete", 00:07:27.112 "bdev_lvol_set_read_only", 00:07:27.112 "bdev_lvol_resize", 00:07:27.112 "bdev_lvol_decouple_parent", 00:07:27.112 "bdev_lvol_inflate", 00:07:27.112 "bdev_lvol_rename", 00:07:27.112 "bdev_lvol_clone_bdev", 00:07:27.112 "bdev_lvol_clone", 00:07:27.112 "bdev_lvol_snapshot", 00:07:27.112 "bdev_lvol_create", 00:07:27.112 "bdev_lvol_delete_lvstore", 00:07:27.112 "bdev_lvol_rename_lvstore", 00:07:27.112 "bdev_lvol_create_lvstore", 00:07:27.112 "bdev_raid_set_options", 00:07:27.112 "bdev_raid_remove_base_bdev", 00:07:27.112 "bdev_raid_add_base_bdev", 00:07:27.112 "bdev_raid_delete", 
00:07:27.112 "bdev_raid_create", 00:07:27.112 "bdev_raid_get_bdevs", 00:07:27.112 "bdev_error_inject_error", 00:07:27.112 "bdev_error_delete", 00:07:27.112 "bdev_error_create", 00:07:27.112 "bdev_split_delete", 00:07:27.112 "bdev_split_create", 00:07:27.112 "bdev_delay_delete", 00:07:27.112 "bdev_delay_create", 00:07:27.112 "bdev_delay_update_latency", 00:07:27.112 "bdev_zone_block_delete", 00:07:27.112 "bdev_zone_block_create", 00:07:27.112 "blobfs_create", 00:07:27.112 "blobfs_detect", 00:07:27.112 "blobfs_set_cache_size", 00:07:27.112 "bdev_crypto_delete", 00:07:27.112 "bdev_crypto_create", 00:07:27.112 "bdev_compress_delete", 00:07:27.112 "bdev_compress_create", 00:07:27.112 "bdev_compress_get_orphans", 00:07:27.112 "bdev_aio_delete", 00:07:27.112 "bdev_aio_rescan", 00:07:27.112 "bdev_aio_create", 00:07:27.112 "bdev_ftl_set_property", 00:07:27.112 "bdev_ftl_get_properties", 00:07:27.112 "bdev_ftl_get_stats", 00:07:27.112 "bdev_ftl_unmap", 00:07:27.112 "bdev_ftl_unload", 00:07:27.112 "bdev_ftl_delete", 00:07:27.112 "bdev_ftl_load", 00:07:27.112 "bdev_ftl_create", 00:07:27.112 "bdev_virtio_attach_controller", 00:07:27.112 "bdev_virtio_scsi_get_devices", 00:07:27.112 "bdev_virtio_detach_controller", 00:07:27.112 "bdev_virtio_blk_set_hotplug", 00:07:27.112 "bdev_iscsi_delete", 00:07:27.112 "bdev_iscsi_create", 00:07:27.112 "bdev_iscsi_set_options", 00:07:27.112 "accel_error_inject_error", 00:07:27.112 "ioat_scan_accel_module", 00:07:27.112 "dsa_scan_accel_module", 00:07:27.112 "iaa_scan_accel_module", 00:07:27.112 "dpdk_cryptodev_get_driver", 00:07:27.112 "dpdk_cryptodev_set_driver", 00:07:27.113 "dpdk_cryptodev_scan_accel_module", 00:07:27.113 "compressdev_scan_accel_module", 00:07:27.113 "keyring_file_remove_key", 00:07:27.113 "keyring_file_add_key", 00:07:27.113 "keyring_linux_set_options", 00:07:27.113 "iscsi_get_histogram", 00:07:27.113 "iscsi_enable_histogram", 00:07:27.113 "iscsi_set_options", 00:07:27.113 "iscsi_get_auth_groups", 00:07:27.113 
"iscsi_auth_group_remove_secret", 00:07:27.113 "iscsi_auth_group_add_secret", 00:07:27.113 "iscsi_delete_auth_group", 00:07:27.113 "iscsi_create_auth_group", 00:07:27.113 "iscsi_set_discovery_auth", 00:07:27.113 "iscsi_get_options", 00:07:27.113 "iscsi_target_node_request_logout", 00:07:27.113 "iscsi_target_node_set_redirect", 00:07:27.113 "iscsi_target_node_set_auth", 00:07:27.113 "iscsi_target_node_add_lun", 00:07:27.113 "iscsi_get_stats", 00:07:27.113 "iscsi_get_connections", 00:07:27.113 "iscsi_portal_group_set_auth", 00:07:27.113 "iscsi_start_portal_group", 00:07:27.113 "iscsi_delete_portal_group", 00:07:27.113 "iscsi_create_portal_group", 00:07:27.113 "iscsi_get_portal_groups", 00:07:27.113 "iscsi_delete_target_node", 00:07:27.113 "iscsi_target_node_remove_pg_ig_maps", 00:07:27.113 "iscsi_target_node_add_pg_ig_maps", 00:07:27.113 "iscsi_create_target_node", 00:07:27.113 "iscsi_get_target_nodes", 00:07:27.113 "iscsi_delete_initiator_group", 00:07:27.113 "iscsi_initiator_group_remove_initiators", 00:07:27.113 "iscsi_initiator_group_add_initiators", 00:07:27.113 "iscsi_create_initiator_group", 00:07:27.113 "iscsi_get_initiator_groups", 00:07:27.113 "nvmf_set_crdt", 00:07:27.113 "nvmf_set_config", 00:07:27.113 "nvmf_set_max_subsystems", 00:07:27.113 "nvmf_stop_mdns_prr", 00:07:27.113 "nvmf_publish_mdns_prr", 00:07:27.113 "nvmf_subsystem_get_listeners", 00:07:27.113 "nvmf_subsystem_get_qpairs", 00:07:27.113 "nvmf_subsystem_get_controllers", 00:07:27.113 "nvmf_get_stats", 00:07:27.113 "nvmf_get_transports", 00:07:27.113 "nvmf_create_transport", 00:07:27.113 "nvmf_get_targets", 00:07:27.113 "nvmf_delete_target", 00:07:27.113 "nvmf_create_target", 00:07:27.113 "nvmf_subsystem_allow_any_host", 00:07:27.113 "nvmf_subsystem_remove_host", 00:07:27.113 "nvmf_subsystem_add_host", 00:07:27.113 "nvmf_ns_remove_host", 00:07:27.113 "nvmf_ns_add_host", 00:07:27.113 "nvmf_subsystem_remove_ns", 00:07:27.113 "nvmf_subsystem_add_ns", 00:07:27.113 
"nvmf_subsystem_listener_set_ana_state", 00:07:27.113 "nvmf_discovery_get_referrals", 00:07:27.113 "nvmf_discovery_remove_referral", 00:07:27.113 "nvmf_discovery_add_referral", 00:07:27.113 "nvmf_subsystem_remove_listener", 00:07:27.113 "nvmf_subsystem_add_listener", 00:07:27.113 "nvmf_delete_subsystem", 00:07:27.113 "nvmf_create_subsystem", 00:07:27.113 "nvmf_get_subsystems", 00:07:27.113 "env_dpdk_get_mem_stats", 00:07:27.113 "nbd_get_disks", 00:07:27.113 "nbd_stop_disk", 00:07:27.113 "nbd_start_disk", 00:07:27.113 "ublk_recover_disk", 00:07:27.113 "ublk_get_disks", 00:07:27.113 "ublk_stop_disk", 00:07:27.113 "ublk_start_disk", 00:07:27.113 "ublk_destroy_target", 00:07:27.113 "ublk_create_target", 00:07:27.113 "virtio_blk_create_transport", 00:07:27.113 "virtio_blk_get_transports", 00:07:27.113 "vhost_controller_set_coalescing", 00:07:27.113 "vhost_get_controllers", 00:07:27.113 "vhost_delete_controller", 00:07:27.113 "vhost_create_blk_controller", 00:07:27.113 "vhost_scsi_controller_remove_target", 00:07:27.113 "vhost_scsi_controller_add_target", 00:07:27.113 "vhost_start_scsi_controller", 00:07:27.113 "vhost_create_scsi_controller", 00:07:27.113 "thread_set_cpumask", 00:07:27.113 "framework_get_governor", 00:07:27.113 "framework_get_scheduler", 00:07:27.113 "framework_set_scheduler", 00:07:27.113 "framework_get_reactors", 00:07:27.113 "thread_get_io_channels", 00:07:27.113 "thread_get_pollers", 00:07:27.113 "thread_get_stats", 00:07:27.113 "framework_monitor_context_switch", 00:07:27.113 "spdk_kill_instance", 00:07:27.113 "log_enable_timestamps", 00:07:27.113 "log_get_flags", 00:07:27.113 "log_clear_flag", 00:07:27.113 "log_set_flag", 00:07:27.113 "log_get_level", 00:07:27.113 "log_set_level", 00:07:27.113 "log_get_print_level", 00:07:27.113 "log_set_print_level", 00:07:27.113 "framework_enable_cpumask_locks", 00:07:27.113 "framework_disable_cpumask_locks", 00:07:27.113 "framework_wait_init", 00:07:27.113 "framework_start_init", 00:07:27.113 "scsi_get_devices", 
00:07:27.113  "bdev_get_histogram",
00:07:27.113  "bdev_enable_histogram",
00:07:27.113  "bdev_set_qos_limit",
00:07:27.113  "bdev_set_qd_sampling_period",
00:07:27.113  "bdev_get_bdevs",
00:07:27.113  "bdev_reset_iostat",
00:07:27.113  "bdev_get_iostat",
00:07:27.113  "bdev_examine",
00:07:27.113  "bdev_wait_for_examine",
00:07:27.113  "bdev_set_options",
00:07:27.113  "notify_get_notifications",
00:07:27.113  "notify_get_types",
00:07:27.113  "accel_get_stats",
00:07:27.113  "accel_set_options",
00:07:27.113  "accel_set_driver",
00:07:27.113  "accel_crypto_key_destroy",
00:07:27.113  "accel_crypto_keys_get",
00:07:27.113  "accel_crypto_key_create",
00:07:27.113  "accel_assign_opc",
00:07:27.113  "accel_get_module_info",
00:07:27.113  "accel_get_opc_assignments",
00:07:27.113  "vmd_rescan",
00:07:27.113  "vmd_remove_device",
00:07:27.113  "vmd_enable",
00:07:27.113  "sock_get_default_impl",
00:07:27.113  "sock_set_default_impl",
00:07:27.113  "sock_impl_set_options",
00:07:27.113  "sock_impl_get_options",
00:07:27.113  "iobuf_get_stats",
00:07:27.113  "iobuf_set_options",
00:07:27.113  "framework_get_pci_devices",
00:07:27.113  "framework_get_config",
00:07:27.113  "framework_get_subsystems",
00:07:27.113  "trace_get_info",
00:07:27.113  "trace_get_tpoint_group_mask",
00:07:27.113  "trace_disable_tpoint_group",
00:07:27.113  "trace_enable_tpoint_group",
00:07:27.113  "trace_clear_tpoint_mask",
00:07:27.113  "trace_set_tpoint_mask",
00:07:27.113  "keyring_get_keys",
00:07:27.113  "spdk_get_version",
00:07:27.113  "rpc_get_methods"
00:07:27.113 ]
00:07:27.113 22:36:11 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp
00:07:27.113 22:36:11 spdkcli_tcp -- common/autotest_common.sh@728 -- # xtrace_disable
00:07:27.113 22:36:11 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x
00:07:27.113 22:36:11 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT
00:07:27.113 22:36:11 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 2664148
00:07:27.113 22:36:11 spdkcli_tcp -- common/autotest_common.sh@948 -- # '[' -z 2664148 ']'
00:07:27.113 22:36:11 spdkcli_tcp -- common/autotest_common.sh@952 -- # kill -0 2664148
00:07:27.113 22:36:11 spdkcli_tcp -- common/autotest_common.sh@953 -- # uname
00:07:27.113 22:36:11 spdkcli_tcp -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:07:27.113 22:36:11 spdkcli_tcp -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2664148
00:07:27.113 22:36:12 spdkcli_tcp -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:07:27.113 22:36:12 spdkcli_tcp -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:07:27.113 22:36:12 spdkcli_tcp -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2664148'
00:07:27.113 killing process with pid 2664148
00:07:27.113 22:36:12 spdkcli_tcp -- common/autotest_common.sh@967 -- # kill 2664148
00:07:27.113 22:36:12 spdkcli_tcp -- common/autotest_common.sh@972 -- # wait 2664148
00:07:27.683
00:07:27.683 real	0m1.872s
00:07:27.683 user	0m3.397s
00:07:27.683 sys	0m0.631s
00:07:27.683 22:36:12 spdkcli_tcp -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:27.683 22:36:12 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x
00:07:27.683 ************************************
00:07:27.683 END TEST spdkcli_tcp
00:07:27.683 ************************************
00:07:27.683 22:36:12  -- common/autotest_common.sh@1142 -- # return 0
00:07:27.683 22:36:12  -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh
00:07:27.683 22:36:12  -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:07:27.683 22:36:12  -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:27.683 22:36:12  -- common/autotest_common.sh@10 -- # set +x
00:07:27.683 ************************************
00:07:27.683 START TEST dpdk_mem_utility
00:07:27.683 ************************************
00:07:27.683 22:36:12 dpdk_mem_utility -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh
00:07:27.683 * Looking for test storage...
00:07:27.942 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility
00:07:27.942 22:36:12 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py
00:07:27.942 22:36:12 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=2664492
00:07:27.942 22:36:12 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 2664492
00:07:27.943 22:36:12 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt
00:07:27.943 22:36:12 dpdk_mem_utility -- common/autotest_common.sh@829 -- # '[' -z 2664492 ']'
00:07:27.943 22:36:12 dpdk_mem_utility -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:07:27.943 22:36:12 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local max_retries=100
00:07:27.943 22:36:12 dpdk_mem_utility -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:07:27.943 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:07:27.943 22:36:12 dpdk_mem_utility -- common/autotest_common.sh@838 -- # xtrace_disable
00:07:27.943 22:36:12 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x
00:07:27.943 [2024-07-15 22:36:12.662563] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
00:07:27.943 [2024-07-15 22:36:12.662625] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2664492 ]
00:07:27.943 [2024-07-15 22:36:12.777854] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:28.202 [2024-07-15 22:36:12.887989] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:07:28.461 22:36:13 dpdk_mem_utility -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:07:28.461 22:36:13 dpdk_mem_utility -- common/autotest_common.sh@862 -- # return 0
00:07:28.461 22:36:13 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT
00:07:28.461 22:36:13 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats
00:07:28.461 22:36:13 dpdk_mem_utility -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:28.461 22:36:13 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x
00:07:28.461 {
00:07:28.461  "filename": "/tmp/spdk_mem_dump.txt"
00:07:28.461 }
00:07:28.461 22:36:13 dpdk_mem_utility -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:28.461 22:36:13 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py
00:07:28.461 DPDK memory size 816.000000 MiB in 2 heap(s)
00:07:28.461 2 heaps totaling size 816.000000 MiB
00:07:28.461 size:  814.000000 MiB heap id: 0
00:07:28.461 size:    2.000000 MiB heap id: 1
00:07:28.461 end heaps----------
00:07:28.461 8 mempools totaling size 598.116089 MiB
00:07:28.461 size:  212.674988 MiB name: PDU_immediate_data_Pool
00:07:28.461 size:  158.602051 MiB name: PDU_data_out_Pool
00:07:28.461 size:   84.521057 MiB name: bdev_io_2664492
00:07:28.461 size:   51.011292 MiB name: evtpool_2664492
00:07:28.462 size:   50.003479 MiB name: msgpool_2664492
00:07:28.462 size:   21.763794 MiB name: PDU_Pool
00:07:28.462 size:   19.513306 MiB name: SCSI_TASK_Pool
00:07:28.462 size:    0.026123 MiB name: Session_Pool
00:07:28.462 end mempools-------
00:07:28.462 201 memzones totaling size 4.176453 MiB
00:07:28.462 size:    1.000366 MiB name: RG_ring_0_2664492
00:07:28.462 size:    1.000366 MiB name: RG_ring_1_2664492
00:07:28.462 size:    1.000366 MiB name: RG_ring_4_2664492
00:07:28.462 size:    1.000366 MiB name: RG_ring_5_2664492
00:07:28.462 size:    0.125366 MiB name: RG_ring_2_2664492
00:07:28.462 size:    0.015991 MiB name: RG_ring_3_2664492
00:07:28.462 size:    0.001160 MiB name: QAT_SYM_CAPA_GEN_1
00:07:28.462 size:    0.000305 MiB name: 0000:3d:01.0_qat
00:07:28.462 size:    0.000305 MiB name: 0000:3d:01.1_qat
00:07:28.462 size:    0.000305 MiB name: 0000:3d:01.2_qat
00:07:28.462 size:    0.000305 MiB name: 0000:3d:01.3_qat
00:07:28.462 size:    0.000305 MiB name: 0000:3d:01.4_qat
00:07:28.462 size:    0.000305 MiB name: 0000:3d:01.5_qat
00:07:28.462 size:    0.000305 MiB name: 0000:3d:01.6_qat
00:07:28.462 size:    0.000305 MiB name: 0000:3d:01.7_qat
00:07:28.462 size:    0.000305 MiB name: 0000:3d:02.0_qat
00:07:28.462 size:    0.000305 MiB name: 0000:3d:02.1_qat
00:07:28.462 size:    0.000305 MiB name: 0000:3d:02.2_qat
00:07:28.462 size:    0.000305 MiB name: 0000:3d:02.3_qat
00:07:28.462 size:    0.000305 MiB name: 0000:3d:02.4_qat
00:07:28.462 size:    0.000305 MiB name: 0000:3d:02.5_qat
00:07:28.462 size:    0.000305 MiB name: 0000:3d:02.6_qat
00:07:28.462 size:    0.000305 MiB name: 0000:3d:02.7_qat
00:07:28.462 size:    0.000305 MiB name: 0000:3f:01.0_qat
00:07:28.462 size:    0.000305 MiB name: 0000:3f:01.1_qat
00:07:28.462 size:    0.000305 MiB name: 0000:3f:01.2_qat
00:07:28.462 size:    0.000305 MiB name: 0000:3f:01.3_qat
00:07:28.462 size:    0.000305 MiB name: 0000:3f:01.4_qat
00:07:28.462 size:    0.000305 MiB name: 0000:3f:01.5_qat
00:07:28.462 size:    0.000305 MiB name: 0000:3f:01.6_qat
00:07:28.462 size:    0.000305 MiB name: 0000:3f:01.7_qat
00:07:28.462 size: 0.000305 MiB name: 0000:3f:02.0_qat 00:07:28.462 size: 0.000305 MiB name: 0000:3f:02.1_qat 00:07:28.462 size: 0.000305 MiB name: 0000:3f:02.2_qat 00:07:28.462 size: 0.000305 MiB name: 0000:3f:02.3_qat 00:07:28.462 size: 0.000305 MiB name: 0000:3f:02.4_qat 00:07:28.462 size: 0.000305 MiB name: 0000:3f:02.5_qat 00:07:28.462 size: 0.000305 MiB name: 0000:3f:02.6_qat 00:07:28.462 size: 0.000305 MiB name: 0000:3f:02.7_qat 00:07:28.462 size: 0.000305 MiB name: 0000:da:01.0_qat 00:07:28.462 size: 0.000305 MiB name: 0000:da:01.1_qat 00:07:28.462 size: 0.000305 MiB name: 0000:da:01.2_qat 00:07:28.462 size: 0.000305 MiB name: 0000:da:01.3_qat 00:07:28.462 size: 0.000305 MiB name: 0000:da:01.4_qat 00:07:28.462 size: 0.000305 MiB name: 0000:da:01.5_qat 00:07:28.462 size: 0.000305 MiB name: 0000:da:01.6_qat 00:07:28.462 size: 0.000305 MiB name: 0000:da:01.7_qat 00:07:28.462 size: 0.000305 MiB name: 0000:da:02.0_qat 00:07:28.462 size: 0.000305 MiB name: 0000:da:02.1_qat 00:07:28.462 size: 0.000305 MiB name: 0000:da:02.2_qat 00:07:28.462 size: 0.000305 MiB name: 0000:da:02.3_qat 00:07:28.462 size: 0.000305 MiB name: 0000:da:02.4_qat 00:07:28.462 size: 0.000305 MiB name: 0000:da:02.5_qat 00:07:28.462 size: 0.000305 MiB name: 0000:da:02.6_qat 00:07:28.462 size: 0.000305 MiB name: 0000:da:02.7_qat 00:07:28.462 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_0 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_1 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_0 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_2 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_3 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_1 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_4 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_5 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_2 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_6 00:07:28.462 size: 
0.000122 MiB name: rte_cryptodev_data_7 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_3 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_8 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_9 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_4 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_10 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_11 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_5 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_12 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_13 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_6 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_14 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_15 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_7 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_16 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_17 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_8 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_18 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_19 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_9 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_20 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_21 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_10 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_22 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_23 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_11 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_24 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_25 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_12 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_26 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_27 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_13 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_28 00:07:28.462 size: 
0.000122 MiB name: rte_cryptodev_data_29 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_14 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_30 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_31 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_15 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_32 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_33 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_16 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_34 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_35 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_17 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_36 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_37 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_18 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_38 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_39 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_19 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_40 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_41 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_20 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_42 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_43 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_21 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_44 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_45 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_22 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_46 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_47 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_23 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_48 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_49 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_24 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_50 00:07:28.462 
size: 0.000122 MiB name: rte_cryptodev_data_51 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_25 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_52 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_53 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_26 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_54 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_55 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_27 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_56 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_57 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_28 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_58 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_59 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_29 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_60 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_61 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_30 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_62 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_63 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_31 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_64 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_65 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_32 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_66 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_67 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_33 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_68 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_69 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_34 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_70 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_71 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_35 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_72 
00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_73 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_36 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_74 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_75 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_37 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_76 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_77 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_38 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_78 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_79 00:07:28.462 size: 0.000122 MiB name: rte_compressdev_data_39 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_80 00:07:28.462 size: 0.000122 MiB name: rte_cryptodev_data_81 00:07:28.463 size: 0.000122 MiB name: rte_compressdev_data_40 00:07:28.463 size: 0.000122 MiB name: rte_cryptodev_data_82 00:07:28.463 size: 0.000122 MiB name: rte_cryptodev_data_83 00:07:28.463 size: 0.000122 MiB name: rte_compressdev_data_41 00:07:28.463 size: 0.000122 MiB name: rte_cryptodev_data_84 00:07:28.463 size: 0.000122 MiB name: rte_cryptodev_data_85 00:07:28.463 size: 0.000122 MiB name: rte_compressdev_data_42 00:07:28.463 size: 0.000122 MiB name: rte_cryptodev_data_86 00:07:28.463 size: 0.000122 MiB name: rte_cryptodev_data_87 00:07:28.463 size: 0.000122 MiB name: rte_compressdev_data_43 00:07:28.463 size: 0.000122 MiB name: rte_cryptodev_data_88 00:07:28.463 size: 0.000122 MiB name: rte_cryptodev_data_89 00:07:28.463 size: 0.000122 MiB name: rte_compressdev_data_44 00:07:28.463 size: 0.000122 MiB name: rte_cryptodev_data_90 00:07:28.463 size: 0.000122 MiB name: rte_cryptodev_data_91 00:07:28.463 size: 0.000122 MiB name: rte_compressdev_data_45 00:07:28.463 size: 0.000122 MiB name: rte_cryptodev_data_92 00:07:28.463 size: 0.000122 MiB name: rte_cryptodev_data_93 00:07:28.463 size: 0.000122 MiB name: rte_compressdev_data_46 00:07:28.463 size: 0.000122 MiB name: 
rte_cryptodev_data_94 00:07:28.463 size: 0.000122 MiB name: rte_cryptodev_data_95 00:07:28.463 size: 0.000122 MiB name: rte_compressdev_data_47 00:07:28.463 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:07:28.463 end memzones------- 00:07:28.724 22:36:13 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:07:28.724 heap id: 0 total size: 814.000000 MiB number of busy elements: 525 number of free elements: 14 00:07:28.724 list of free elements. size: 11.813721 MiB 00:07:28.724 element at address: 0x200000400000 with size: 1.999512 MiB 00:07:28.724 element at address: 0x200018e00000 with size: 0.999878 MiB 00:07:28.724 element at address: 0x200019000000 with size: 0.999878 MiB 00:07:28.724 element at address: 0x200003e00000 with size: 0.996460 MiB 00:07:28.724 element at address: 0x200031c00000 with size: 0.994446 MiB 00:07:28.724 element at address: 0x200013800000 with size: 0.978882 MiB 00:07:28.724 element at address: 0x200007000000 with size: 0.959839 MiB 00:07:28.724 element at address: 0x200019200000 with size: 0.937256 MiB 00:07:28.724 element at address: 0x20001aa00000 with size: 0.582520 MiB 00:07:28.724 element at address: 0x200003a00000 with size: 0.498535 MiB 00:07:28.725 element at address: 0x20000b200000 with size: 0.491272 MiB 00:07:28.725 element at address: 0x200000800000 with size: 0.486877 MiB 00:07:28.725 element at address: 0x200019400000 with size: 0.485840 MiB 00:07:28.725 element at address: 0x200027e00000 with size: 0.402527 MiB 00:07:28.725 list of standard malloc elements. 
size: 199.877991 MiB 00:07:28.725 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:07:28.725 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:07:28.725 element at address: 0x200018efff80 with size: 1.000122 MiB 00:07:28.725 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:07:28.725 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:07:28.725 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:07:28.725 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:07:28.725 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:07:28.725 element at address: 0x200000330b40 with size: 0.004395 MiB 00:07:28.725 element at address: 0x2000003340c0 with size: 0.004395 MiB 00:07:28.725 element at address: 0x200000337640 with size: 0.004395 MiB 00:07:28.725 element at address: 0x20000033abc0 with size: 0.004395 MiB 00:07:28.725 element at address: 0x20000033e140 with size: 0.004395 MiB 00:07:28.725 element at address: 0x2000003416c0 with size: 0.004395 MiB 00:07:28.725 element at address: 0x200000344c40 with size: 0.004395 MiB 00:07:28.725 element at address: 0x2000003481c0 with size: 0.004395 MiB 00:07:28.725 element at address: 0x20000034b740 with size: 0.004395 MiB 00:07:28.725 element at address: 0x20000034ecc0 with size: 0.004395 MiB 00:07:28.725 element at address: 0x200000352240 with size: 0.004395 MiB 00:07:28.725 element at address: 0x2000003557c0 with size: 0.004395 MiB 00:07:28.725 element at address: 0x200000358d40 with size: 0.004395 MiB 00:07:28.725 element at address: 0x20000035c2c0 with size: 0.004395 MiB 00:07:28.725 element at address: 0x20000035f840 with size: 0.004395 MiB 00:07:28.725 element at address: 0x200000362dc0 with size: 0.004395 MiB 00:07:28.725 element at address: 0x200000366880 with size: 0.004395 MiB 00:07:28.725 element at address: 0x20000036a340 with size: 0.004395 MiB 00:07:28.725 element at address: 0x20000036de00 with size: 0.004395 MiB 00:07:28.725 element at 
address: 0x2000003718c0 with size: 0.004395 MiB 00:07:28.725 element at address: 0x200000375380 with size: 0.004395 MiB 00:07:28.725 element at address: 0x200000378e40 with size: 0.004395 MiB 00:07:28.725 element at address: 0x20000037c900 with size: 0.004395 MiB 00:07:28.725 element at address: 0x2000003803c0 with size: 0.004395 MiB 00:07:28.725 element at address: 0x200000383e80 with size: 0.004395 MiB 00:07:28.725 element at address: 0x200000387940 with size: 0.004395 MiB 00:07:28.725 element at address: 0x20000038b400 with size: 0.004395 MiB 00:07:28.725 element at address: 0x20000038eec0 with size: 0.004395 MiB 00:07:28.725 element at address: 0x200000392980 with size: 0.004395 MiB 00:07:28.725 element at address: 0x200000396440 with size: 0.004395 MiB 00:07:28.725 element at address: 0x200000399f00 with size: 0.004395 MiB 00:07:28.725 element at address: 0x20000039d9c0 with size: 0.004395 MiB 00:07:28.725 element at address: 0x2000003a1480 with size: 0.004395 MiB 00:07:28.725 element at address: 0x2000003a4f40 with size: 0.004395 MiB 00:07:28.725 element at address: 0x2000003a8a00 with size: 0.004395 MiB 00:07:28.725 element at address: 0x2000003ac4c0 with size: 0.004395 MiB 00:07:28.725 element at address: 0x2000003aff80 with size: 0.004395 MiB 00:07:28.725 element at address: 0x2000003b3a40 with size: 0.004395 MiB 00:07:28.725 element at address: 0x2000003b7500 with size: 0.004395 MiB 00:07:28.725 element at address: 0x2000003bafc0 with size: 0.004395 MiB 00:07:28.725 element at address: 0x2000003bea80 with size: 0.004395 MiB 00:07:28.725 element at address: 0x2000003c2540 with size: 0.004395 MiB 00:07:28.725 element at address: 0x2000003c6000 with size: 0.004395 MiB 00:07:28.725 element at address: 0x2000003c9ac0 with size: 0.004395 MiB 00:07:28.725 element at address: 0x2000003cd580 with size: 0.004395 MiB 00:07:28.725 element at address: 0x2000003d1040 with size: 0.004395 MiB 00:07:28.725 element at address: 0x2000003d4b00 with size: 0.004395 MiB 
00:07:28.725 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:07:28.725 element at address: 0x20000032ea40 with size: 0.004028 MiB 00:07:28.725 element at address: 0x20000032fac0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000331fc0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000333040 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000335540 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003365c0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000338ac0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000339b40 with size: 0.004028 MiB 00:07:28.725 element at address: 0x20000033c040 with size: 0.004028 MiB 00:07:28.725 element at address: 0x20000033d0c0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x20000033f5c0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000340640 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000342b40 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000343bc0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003460c0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000347140 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000349640 with size: 0.004028 MiB 00:07:28.725 element at address: 0x20000034a6c0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x20000034cbc0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x20000034dc40 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000350140 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003511c0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003536c0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000354740 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000356c40 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000357cc0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x20000035a1c0 with 
size: 0.004028 MiB 00:07:28.725 element at address: 0x20000035b240 with size: 0.004028 MiB 00:07:28.725 element at address: 0x20000035d740 with size: 0.004028 MiB 00:07:28.725 element at address: 0x20000035e7c0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000360cc0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000361d40 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000364780 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000365800 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000368240 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003692c0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x20000036bd00 with size: 0.004028 MiB 00:07:28.725 element at address: 0x20000036cd80 with size: 0.004028 MiB 00:07:28.725 element at address: 0x20000036f7c0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000370840 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000373280 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000374300 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000376d40 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000377dc0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x20000037a800 with size: 0.004028 MiB 00:07:28.725 element at address: 0x20000037b880 with size: 0.004028 MiB 00:07:28.725 element at address: 0x20000037e2c0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x20000037f340 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000381d80 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000382e00 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000385840 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003868c0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000389300 with size: 0.004028 MiB 00:07:28.725 element at address: 0x20000038a380 with size: 0.004028 MiB 00:07:28.725 element at address: 
0x20000038cdc0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x20000038de40 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000390880 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000391900 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000394340 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003953c0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000397e00 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000398e80 with size: 0.004028 MiB 00:07:28.725 element at address: 0x20000039b8c0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x20000039c940 with size: 0.004028 MiB 00:07:28.725 element at address: 0x20000039f380 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003a0400 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003a2e40 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003a3ec0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003a6900 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003a7980 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003aa3c0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003ab440 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003ade80 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003aef00 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003b1940 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003b29c0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003b5400 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003b6480 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003b8ec0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003b9f40 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003bc980 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003bda00 with size: 0.004028 MiB 00:07:28.725 
element at address: 0x2000003c0440 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003c14c0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003c3f00 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003c4f80 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003c79c0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003c8a40 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003cb480 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003cc500 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003cef40 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003cffc0 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003d2a00 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003d3a80 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:07:28.725 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:07:28.725 element at address: 0x200000204bc0 with size: 0.000305 MiB 00:07:28.725 element at address: 0x200000200000 with size: 0.000183 MiB 00:07:28.725 element at address: 0x2000002000c0 with size: 0.000183 MiB 00:07:28.725 element at address: 0x200000200180 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000200240 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000200300 with size: 0.000183 MiB 00:07:28.726 element at address: 0x2000002003c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000200480 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000200540 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000200600 with size: 0.000183 MiB 00:07:28.726 element at address: 0x2000002006c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000200780 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000200840 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000200900 with size: 0.000183 
MiB 00:07:28.726 element at address: 0x2000002009c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000200a80 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000200b40 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000200c00 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000200cc0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000200d80 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000200e40 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000200f00 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000200fc0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000201080 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000201140 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000201200 with size: 0.000183 MiB 00:07:28.726 element at address: 0x2000002012c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000201380 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000201440 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000201500 with size: 0.000183 MiB 00:07:28.726 element at address: 0x2000002015c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000201680 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000201740 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000201800 with size: 0.000183 MiB 00:07:28.726 element at address: 0x2000002018c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000201980 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000201a40 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000201b00 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000201bc0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000201c80 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000201d40 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000201e00 
with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000201ec0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000201f80 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000202040 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000202100 with size: 0.000183 MiB 00:07:28.726 element at address: 0x2000002021c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000202280 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000202340 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000202400 with size: 0.000183 MiB 00:07:28.726 element at address: 0x2000002024c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000202580 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000202640 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000202700 with size: 0.000183 MiB 00:07:28.726 element at address: 0x2000002027c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000202880 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000202940 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000202a00 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000202ac0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000202b80 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000202c40 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000202d00 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000202dc0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000202e80 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000202f40 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000203000 with size: 0.000183 MiB 00:07:28.726 element at address: 0x2000002030c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000203180 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000203240 with size: 0.000183 MiB 00:07:28.726 element at 
address: 0x200000203300 with size: 0.000183 MiB 00:07:28.726 element at address: 0x2000002033c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000203480 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000203540 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000203600 with size: 0.000183 MiB 00:07:28.726 element at address: 0x2000002036c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000203780 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000203840 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000203900 with size: 0.000183 MiB 00:07:28.726 element at address: 0x2000002039c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000203a80 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000203b40 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000203c00 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000203cc0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000203d80 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000203e40 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000203f00 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000203fc0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000204080 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000204140 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000204200 with size: 0.000183 MiB 00:07:28.726 element at address: 0x2000002042c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000204380 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000204440 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000204500 with size: 0.000183 MiB 00:07:28.726 element at address: 0x2000002045c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000204680 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000204740 with size: 0.000183 MiB 
00:07:28.726 element at address: 0x200000204800 with size: 0.000183 MiB 00:07:28.726 element at address: 0x2000002048c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000204980 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000204a40 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000204b00 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000204d00 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000204dc0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000204e80 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000204f40 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000205000 with size: 0.000183 MiB 00:07:28.726 element at address: 0x2000002050c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000205180 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000205240 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000205300 with size: 0.000183 MiB 00:07:28.726 element at address: 0x2000002053c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000205480 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000205540 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000205600 with size: 0.000183 MiB 00:07:28.726 element at address: 0x2000002056c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000205780 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000205840 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000205900 with size: 0.000183 MiB 00:07:28.726 element at address: 0x2000002059c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000205a80 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000205b40 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000205c00 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000205cc0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000205d80 with 
size: 0.000183 MiB 00:07:28.726 element at address: 0x200000205e40 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000205f00 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000205fc0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000206080 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000206140 with size: 0.000183 MiB 00:07:28.726 element at address: 0x200000206200 with size: 0.000183 MiB 00:07:28.726 element at address: 0x2000002062c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x2000002064c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000020a780 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022aa40 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022ab00 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022abc0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022ac80 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022ad40 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022ae00 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022aec0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022af80 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022b040 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022b100 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022b1c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022b280 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022b340 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022b400 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022b4c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022b580 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022b640 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022b700 with size: 0.000183 MiB 00:07:28.726 element at address: 
0x20000022b7c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022b9c0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022ba80 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022bb40 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022bc00 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022bcc0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022bd80 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022be40 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022bf00 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022bfc0 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022c080 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022c140 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022c200 with size: 0.000183 MiB 00:07:28.726 element at address: 0x20000022c2c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000022c380 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000022c440 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000022c500 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000032e700 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000032e7c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000331d40 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003352c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000338840 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000033bdc0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000033f340 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003428c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000345e40 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003493c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000034c940 with size: 0.000183 MiB 00:07:28.727 
element at address: 0x20000034fec0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000353440 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003569c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000359f40 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000035d4c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000360a40 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000363fc0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000364180 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000364240 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000364400 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000367a80 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000367c40 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000367d00 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000367ec0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000036b540 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000036b700 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000036b7c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000036b980 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000036f000 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000036f1c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000036f280 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000036f440 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000372ac0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000372c80 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000372d40 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000372f00 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000376580 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000376740 with size: 0.000183 
MiB 00:07:28.727 element at address: 0x200000376800 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003769c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000037a040 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000037a200 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000037a2c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000037a480 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000037db00 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000037dcc0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000037dd80 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000037df40 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003815c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000381780 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000381840 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000381a00 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000385080 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000385240 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000385300 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003854c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000388b40 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000388d00 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000388dc0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000388f80 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000038c600 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000038c7c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000038c880 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000038ca40 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003900c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000390280 
with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000390340 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000390500 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000393b80 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000393d40 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000393e00 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000393fc0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000397640 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000397800 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003978c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200000397a80 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000039b100 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000039b2c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000039b380 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000039b540 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000039ebc0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000039ed80 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000039ee40 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000039f000 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003a2680 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003a2840 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003a2900 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003a2ac0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003a6140 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003a6300 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003a63c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003a6580 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003a9c00 with size: 0.000183 MiB 00:07:28.727 element at 
address: 0x2000003a9dc0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003a9e80 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003aa040 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003ad6c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003ad880 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003ad940 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003adb00 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003b1180 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003b1340 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003b1400 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003b15c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003b4c40 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003b4e00 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003b4ec0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003b5080 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003b8700 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003b88c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003b8980 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003b8b40 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003bc1c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003bc380 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003bc440 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003bc600 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003bfc80 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003bfe40 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003bff00 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003c00c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003c3740 with size: 0.000183 MiB 
00:07:28.727 element at address: 0x2000003c3900 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003c39c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003c3b80 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003c7200 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003c73c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003c7480 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003c7640 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003cacc0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003cae80 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003caf40 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003cb100 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003ce780 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003ce940 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003cea00 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003cebc0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003d2240 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003d2400 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003d24c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003d2680 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003d5dc0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003d64c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003d6580 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000003d6880 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000087cd40 with 
size: 0.000183 MiB 00:07:28.727 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:07:28.727 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20001aa95200 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20001aa952c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:07:28.727 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200027e670c0 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200027e67180 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200027e6dd80 with size: 0.000183 MiB 00:07:28.727 element at address: 0x200027e6df80 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6e040 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6e100 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6e1c0 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6eac0 with size: 0.000183 MiB 00:07:28.728 element at address: 
0x200027e6eb80 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6ed00 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:07:28.728 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:07:28.728 list of memzone associated elements. 
size: 602.308289 MiB
00:07:28.728 element at address: 0x20001aa95500 with size: 211.416748 MiB
00:07:28.728 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0
00:07:28.728 element at address: 0x200027e6ffc0 with size: 157.562561 MiB
00:07:28.728 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0
00:07:28.728 element at address: 0x2000139fab80 with size: 84.020630 MiB
00:07:28.728 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_2664492_0
00:07:28.728 element at address: 0x2000009ff380 with size: 48.003052 MiB
00:07:28.728 associated memzone info: size: 48.002930 MiB name: MP_evtpool_2664492_0
00:07:28.728 element at address: 0x200003fff380 with size: 48.003052 MiB
00:07:28.728 associated memzone info: size: 48.002930 MiB name: MP_msgpool_2664492_0
00:07:28.728 element at address: 0x2000195be940 with size: 20.255554 MiB
00:07:28.728 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0
00:07:28.728 element at address: 0x200031dfeb40 with size: 18.005066 MiB
00:07:28.728 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0
00:07:28.728 element at address: 0x2000005ffe00 with size: 2.000488 MiB
00:07:28.728 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_2664492
00:07:28.728 element at address: 0x200003bffe00 with size: 2.000488 MiB
00:07:28.728 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_2664492
00:07:28.728 element at address: 0x20000022c5c0 with size: 1.008118 MiB
00:07:28.728 associated memzone info: size: 1.007996 MiB name: MP_evtpool_2664492
00:07:28.728 element at address: 0x20000b2fde40 with size: 1.008118 MiB
00:07:28.728 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool
00:07:28.728 element at address: 0x2000194bc800 with size: 1.008118 MiB
00:07:28.728 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool
00:07:28.728 element at address: 0x2000070fde40 with size: 1.008118 MiB
00:07:28.728 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool
00:07:28.728 element at address: 0x2000008fd240 with size: 1.008118 MiB
00:07:28.728 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool
00:07:28.728 element at address: 0x200003eff180 with size: 1.000488 MiB
00:07:28.728 associated memzone info: size: 1.000366 MiB name: RG_ring_0_2664492
00:07:28.728 element at address: 0x200003affc00 with size: 1.000488 MiB
00:07:28.728 associated memzone info: size: 1.000366 MiB name: RG_ring_1_2664492
00:07:28.728 element at address: 0x2000138fa980 with size: 1.000488 MiB
00:07:28.728 associated memzone info: size: 1.000366 MiB name: RG_ring_4_2664492
00:07:28.728 element at address: 0x200031cfe940 with size: 1.000488 MiB
00:07:28.728 associated memzone info: size: 1.000366 MiB name: RG_ring_5_2664492
00:07:28.728 element at address: 0x200003a7fa00 with size: 0.500488 MiB
00:07:28.728 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_2664492
00:07:28.728 element at address: 0x20000b27dc40 with size: 0.500488 MiB
00:07:28.728 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool
00:07:28.728 element at address: 0x20000087cf80 with size: 0.500488 MiB
00:07:28.728 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool
00:07:28.728 element at address: 0x20001947c600 with size: 0.250488 MiB
00:07:28.728 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool
00:07:28.728 element at address: 0x20000020a840 with size: 0.125488 MiB
00:07:28.728 associated memzone info: size: 0.125366 MiB name: RG_ring_2_2664492
00:07:28.728 element at address: 0x2000070f5b80 with size: 0.031738 MiB
00:07:28.728 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool
00:07:28.728 element at address: 0x200027e67240 with size: 0.023743 MiB
00:07:28.728 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0
00:07:28.728 element at address: 0x200000206580 with size: 0.016113 MiB
00:07:28.728 associated memzone info: size: 0.015991 MiB name: RG_ring_3_2664492
00:07:28.728 element at address: 0x200027e6d380 with size: 0.002441 MiB
00:07:28.728 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool
00:07:28.728 element at address: 0x2000003d5f80 with size: 0.001282 MiB
00:07:28.728 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1
00:07:28.728 element at address: 0x2000003d6a40 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.0_qat
00:07:28.728 element at address: 0x2000003d2840 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.1_qat
00:07:28.728 element at address: 0x2000003ced80 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.2_qat
00:07:28.728 element at address: 0x2000003cb2c0 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.3_qat
00:07:28.728 element at address: 0x2000003c7800 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.4_qat
00:07:28.728 element at address: 0x2000003c3d40 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.5_qat
00:07:28.728 element at address: 0x2000003c0280 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.6_qat
00:07:28.728 element at address: 0x2000003bc7c0 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.7_qat
00:07:28.728 element at address: 0x2000003b8d00 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.0_qat
00:07:28.728 element at address: 0x2000003b5240 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.1_qat
00:07:28.728 element at address: 0x2000003b1780 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.2_qat
00:07:28.728 element at address: 0x2000003adcc0 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.3_qat
00:07:28.728 element at address: 0x2000003aa200 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.4_qat
00:07:28.728 element at address: 0x2000003a6740 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.5_qat
00:07:28.728 element at address: 0x2000003a2c80 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.6_qat
00:07:28.728 element at address: 0x20000039f1c0 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.7_qat
00:07:28.728 element at address: 0x20000039b700 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.0_qat
00:07:28.728 element at address: 0x200000397c40 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.1_qat
00:07:28.728 element at address: 0x200000394180 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.2_qat
00:07:28.728 element at address: 0x2000003906c0 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.3_qat
00:07:28.728 element at address: 0x20000038cc00 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.4_qat
00:07:28.728 element at address: 0x200000389140 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.5_qat
00:07:28.728 element at address: 0x200000385680 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.6_qat
00:07:28.728 element at address: 0x200000381bc0 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.7_qat
00:07:28.728 element at address: 0x20000037e100 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.0_qat
00:07:28.728 element at address: 0x20000037a640 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.1_qat
00:07:28.728 element at address: 0x200000376b80 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.2_qat
00:07:28.728 element at address: 0x2000003730c0 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.3_qat
00:07:28.728 element at address: 0x20000036f600 with size: 0.000427 MiB
00:07:28.728 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.4_qat
00:07:28.729 element at address: 0x20000036bb40 with size: 0.000427 MiB
00:07:28.729 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.5_qat
00:07:28.729 element at address: 0x200000368080 with size: 0.000427 MiB
00:07:28.729 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.6_qat
00:07:28.729 element at address: 0x2000003645c0 with size: 0.000427 MiB
00:07:28.729 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.7_qat
00:07:28.729 element at address: 0x200000360b00 with size: 0.000427 MiB
00:07:28.729 associated memzone info: size: 0.000305 MiB name: 0000:da:01.0_qat
00:07:28.729 element at address: 0x20000035d580 with size: 0.000427 MiB
00:07:28.729 associated memzone info: size: 0.000305 MiB name: 0000:da:01.1_qat
00:07:28.729 element at address: 0x20000035a000 with size: 0.000427 MiB
00:07:28.729 associated memzone info: size: 0.000305 MiB name: 0000:da:01.2_qat
00:07:28.729 element at address: 0x200000356a80 with size: 0.000427 MiB
00:07:28.729 associated memzone info: size: 0.000305 MiB name: 0000:da:01.3_qat
00:07:28.729 element at address: 0x200000353500 with size: 0.000427 MiB
00:07:28.729 associated memzone info: size: 0.000305 MiB name: 0000:da:01.4_qat
00:07:28.729 element at address: 0x20000034ff80 with size: 0.000427 MiB
00:07:28.729 associated memzone info: size: 0.000305 MiB name: 0000:da:01.5_qat
00:07:28.729 element at address: 0x20000034ca00 with size: 0.000427 MiB
00:07:28.729 associated memzone info: size: 0.000305 MiB name: 0000:da:01.6_qat
00:07:28.729 element at address: 0x200000349480 with size: 0.000427 MiB
00:07:28.729 associated memzone info: size: 0.000305 MiB name: 0000:da:01.7_qat
00:07:28.729 element at address: 0x200000345f00 with size: 0.000427 MiB
00:07:28.729 associated memzone info: size: 0.000305 MiB name: 0000:da:02.0_qat
00:07:28.729 element at address: 0x200000342980 with size: 0.000427 MiB
00:07:28.729 associated memzone info: size: 0.000305 MiB name: 0000:da:02.1_qat
00:07:28.729 element at address: 0x20000033f400 with size: 0.000427 MiB
00:07:28.729 associated memzone info: size: 0.000305 MiB name: 0000:da:02.2_qat
00:07:28.729 element at address: 0x20000033be80 with size: 0.000427 MiB
00:07:28.729 associated memzone info: size: 0.000305 MiB name: 0000:da:02.3_qat
00:07:28.729 element at address: 0x200000338900 with size: 0.000427 MiB
00:07:28.729 associated memzone info: size: 0.000305 MiB name: 0000:da:02.4_qat
00:07:28.729 element at address: 0x200000335380 with size: 0.000427 MiB
00:07:28.729 associated memzone info: size: 0.000305 MiB name: 0000:da:02.5_qat
00:07:28.729 element at address: 0x200000331e00 with size: 0.000427 MiB
00:07:28.729 associated memzone info: size: 0.000305 MiB name: 0000:da:02.6_qat
00:07:28.729 element at address: 0x20000032e880 with size: 0.000427 MiB
00:07:28.729 associated memzone info: size: 0.000305 MiB name: 0000:da:02.7_qat
00:07:28.729 element at address: 0x2000003d6740 with size: 0.000305 MiB
00:07:28.729 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1
00:07:28.729 element at address: 0x20000022b880 with size: 0.000305 MiB
00:07:28.729 associated memzone info: size: 0.000183 MiB name: MP_msgpool_2664492
00:07:28.729 element at address: 0x200000206380 with size: 0.000305 MiB
00:07:28.729 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_2664492
00:07:28.729 element at address: 0x200027e6de40 with size: 0.000305 MiB
00:07:28.729 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool
00:07:28.729 element at address: 0x2000003d6940 with size: 0.000244 MiB
00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0
00:07:28.729 element at address: 0x2000003d6640 with size: 0.000244 MiB
00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1
00:07:28.729 element at address: 0x2000003d5e80 with size: 0.000244 MiB
00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0
00:07:28.729 element at address: 0x2000003d2740 with size: 0.000244 MiB
00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2
00:07:28.729 element at address: 0x2000003d2580 with size: 0.000244 MiB
00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3
00:07:28.729 element at address: 0x2000003d2300 with size: 0.000244 MiB
00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1
00:07:28.729 element at address: 0x2000003cec80 with size: 0.000244 MiB
00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4
00:07:28.729 element at address: 0x2000003ceac0 with size: 0.000244 MiB
00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5
00:07:28.729 element at address: 0x2000003ce840 with size: 0.000244 MiB
00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2
00:07:28.729 element at address: 0x2000003cb1c0 with size: 0.000244 MiB
00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_6
00:07:28.729 element at address: 0x2000003cb000 with size: 0.000244 MiB
00:07:28.729 associated memzone info:
size: 0.000122 MiB name: rte_cryptodev_data_7 00:07:28.729 element at address: 0x2000003cad80 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3 00:07:28.729 element at address: 0x2000003c7700 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8 00:07:28.729 element at address: 0x2000003c7540 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9 00:07:28.729 element at address: 0x2000003c72c0 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4 00:07:28.729 element at address: 0x2000003c3c40 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10 00:07:28.729 element at address: 0x2000003c3a80 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11 00:07:28.729 element at address: 0x2000003c3800 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5 00:07:28.729 element at address: 0x2000003c0180 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12 00:07:28.729 element at address: 0x2000003bffc0 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_13 00:07:28.729 element at address: 0x2000003bfd40 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6 00:07:28.729 element at address: 0x2000003bc6c0 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14 00:07:28.729 element at address: 0x2000003bc500 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15 00:07:28.729 element at address: 0x2000003bc280 with size: 0.000244 
MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7 00:07:28.729 element at address: 0x2000003b8c00 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16 00:07:28.729 element at address: 0x2000003b8a40 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17 00:07:28.729 element at address: 0x2000003b87c0 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8 00:07:28.729 element at address: 0x2000003b5140 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18 00:07:28.729 element at address: 0x2000003b4f80 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19 00:07:28.729 element at address: 0x2000003b4d00 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9 00:07:28.729 element at address: 0x2000003b1680 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20 00:07:28.729 element at address: 0x2000003b14c0 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21 00:07:28.729 element at address: 0x2000003b1240 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10 00:07:28.729 element at address: 0x2000003adbc0 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22 00:07:28.729 element at address: 0x2000003ada00 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_23 00:07:28.729 element at address: 0x2000003ad780 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_11 00:07:28.729 
element at address: 0x2000003aa100 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24 00:07:28.729 element at address: 0x2000003a9f40 with size: 0.000244 MiB 00:07:28.729 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25 00:07:28.729 element at address: 0x2000003a9cc0 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12 00:07:28.730 element at address: 0x2000003a6640 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26 00:07:28.730 element at address: 0x2000003a6480 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27 00:07:28.730 element at address: 0x2000003a6200 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13 00:07:28.730 element at address: 0x2000003a2b80 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28 00:07:28.730 element at address: 0x2000003a29c0 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29 00:07:28.730 element at address: 0x2000003a2740 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14 00:07:28.730 element at address: 0x20000039f0c0 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30 00:07:28.730 element at address: 0x20000039ef00 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31 00:07:28.730 element at address: 0x20000039ec80 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15 00:07:28.730 element at address: 0x20000039b600 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 
0.000122 MiB name: rte_cryptodev_data_32 00:07:28.730 element at address: 0x20000039b440 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33 00:07:28.730 element at address: 0x20000039b1c0 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16 00:07:28.730 element at address: 0x200000397b40 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34 00:07:28.730 element at address: 0x200000397980 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35 00:07:28.730 element at address: 0x200000397700 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17 00:07:28.730 element at address: 0x200000394080 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36 00:07:28.730 element at address: 0x200000393ec0 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37 00:07:28.730 element at address: 0x200000393c40 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18 00:07:28.730 element at address: 0x2000003905c0 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38 00:07:28.730 element at address: 0x200000390400 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39 00:07:28.730 element at address: 0x200000390180 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19 00:07:28.730 element at address: 0x20000038cb00 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40 00:07:28.730 element at address: 0x20000038c940 with size: 
0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41 00:07:28.730 element at address: 0x20000038c6c0 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20 00:07:28.730 element at address: 0x200000389040 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42 00:07:28.730 element at address: 0x200000388e80 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43 00:07:28.730 element at address: 0x200000388c00 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21 00:07:28.730 element at address: 0x200000385580 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44 00:07:28.730 element at address: 0x2000003853c0 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45 00:07:28.730 element at address: 0x200000385140 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22 00:07:28.730 element at address: 0x200000381ac0 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46 00:07:28.730 element at address: 0x200000381900 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47 00:07:28.730 element at address: 0x200000381680 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23 00:07:28.730 element at address: 0x20000037e000 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48 00:07:28.730 element at address: 0x20000037de40 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_49 
00:07:28.730 element at address: 0x20000037dbc0 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24 00:07:28.730 element at address: 0x20000037a540 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50 00:07:28.730 element at address: 0x20000037a380 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51 00:07:28.730 element at address: 0x20000037a100 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25 00:07:28.730 element at address: 0x200000376a80 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52 00:07:28.730 element at address: 0x2000003768c0 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_53 00:07:28.730 element at address: 0x200000376640 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_26 00:07:28.730 element at address: 0x200000372fc0 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54 00:07:28.730 element at address: 0x200000372e00 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55 00:07:28.730 element at address: 0x200000372b80 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27 00:07:28.730 element at address: 0x20000036f500 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56 00:07:28.730 element at address: 0x20000036f340 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57 00:07:28.730 element at address: 0x20000036f0c0 with size: 0.000244 MiB 00:07:28.730 associated memzone 
info: size: 0.000122 MiB name: rte_compressdev_data_28 00:07:28.730 element at address: 0x20000036ba40 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58 00:07:28.730 element at address: 0x20000036b880 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59 00:07:28.730 element at address: 0x20000036b600 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29 00:07:28.730 element at address: 0x200000367f80 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60 00:07:28.730 element at address: 0x200000367dc0 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61 00:07:28.730 element at address: 0x200000367b40 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30 00:07:28.730 element at address: 0x2000003644c0 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62 00:07:28.730 element at address: 0x200000364300 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63 00:07:28.730 element at address: 0x200000364080 with size: 0.000244 MiB 00:07:28.730 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31 00:07:28.730 element at address: 0x2000003d5d00 with size: 0.000183 MiB 00:07:28.730 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:07:28.730 22:36:13 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:07:28.730 22:36:13 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 2664492 00:07:28.730 22:36:13 dpdk_mem_utility -- common/autotest_common.sh@948 -- # '[' -z 2664492 ']' 00:07:28.730 22:36:13 dpdk_mem_utility -- 
common/autotest_common.sh@952 -- # kill -0 2664492 00:07:28.730 22:36:13 dpdk_mem_utility -- common/autotest_common.sh@953 -- # uname 00:07:28.730 22:36:13 dpdk_mem_utility -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:28.730 22:36:13 dpdk_mem_utility -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2664492 00:07:28.989 22:36:13 dpdk_mem_utility -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:28.989 22:36:13 dpdk_mem_utility -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:28.989 22:36:13 dpdk_mem_utility -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2664492' 00:07:28.989 killing process with pid 2664492 00:07:28.989 22:36:13 dpdk_mem_utility -- common/autotest_common.sh@967 -- # kill 2664492 00:07:28.989 22:36:13 dpdk_mem_utility -- common/autotest_common.sh@972 -- # wait 2664492 00:07:29.249 00:07:29.249 real 0m1.538s 00:07:29.249 user 0m1.939s 00:07:29.249 sys 0m0.578s 00:07:29.249 22:36:14 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:29.249 22:36:14 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:29.249 ************************************ 00:07:29.249 END TEST dpdk_mem_utility 00:07:29.249 ************************************ 00:07:29.249 22:36:14 -- common/autotest_common.sh@1142 -- # return 0 00:07:29.249 22:36:14 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:07:29.249 22:36:14 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:29.249 22:36:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:29.249 22:36:14 -- common/autotest_common.sh@10 -- # set +x 00:07:29.249 ************************************ 00:07:29.249 START TEST event 00:07:29.249 ************************************ 00:07:29.249 22:36:14 event -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:07:29.508 * 
Looking for test storage... 00:07:29.508 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event 00:07:29.508 22:36:14 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:07:29.508 22:36:14 event -- bdev/nbd_common.sh@6 -- # set -e 00:07:29.508 22:36:14 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:29.508 22:36:14 event -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:07:29.508 22:36:14 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:29.508 22:36:14 event -- common/autotest_common.sh@10 -- # set +x 00:07:29.508 ************************************ 00:07:29.508 START TEST event_perf 00:07:29.508 ************************************ 00:07:29.508 22:36:14 event.event_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:29.508 Running I/O for 1 seconds...[2024-07-15 22:36:14.304508] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:07:29.508 [2024-07-15 22:36:14.304587] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2664731 ]
00:07:29.767 [2024-07-15 22:36:14.433094] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4
00:07:29.767 [2024-07-15 22:36:14.535414] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:07:29.767 [2024-07-15 22:36:14.535516] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2
00:07:29.767 [2024-07-15 22:36:14.535618] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3
00:07:29.767 [2024-07-15 22:36:14.535620] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:07:31.142 Running I/O for 1 seconds...
00:07:31.142 lcore 0: 102484
00:07:31.142 lcore 1: 102488
00:07:31.142 lcore 2: 102490
00:07:31.142 lcore 3: 102487
00:07:31.142 done.
00:07:31.142
00:07:31.142 real 0m1.355s
00:07:31.142 user 0m4.205s
00:07:31.142 sys 0m0.137s
00:07:31.142 22:36:15 event.event_perf -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:31.142 22:36:15 event.event_perf -- common/autotest_common.sh@10 -- # set +x
00:07:31.142 ************************************
00:07:31.142 END TEST event_perf
00:07:31.142 ************************************
00:07:31.142 22:36:15 event -- common/autotest_common.sh@1142 -- # return 0
00:07:31.142 22:36:15 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1
00:07:31.142 22:36:15 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']'
00:07:31.142 22:36:15 event -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:31.142 22:36:15 event -- common/autotest_common.sh@10 -- # set +x
00:07:31.142 ************************************
00:07:31.142 START TEST event_reactor
00:07:31.142 ************************************
00:07:31.142 22:36:15 event.event_reactor -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1
00:07:31.142 [2024-07-15 22:36:15.739016] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
00:07:31.142 [2024-07-15 22:36:15.739077] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2664927 ]
00:07:31.142 [2024-07-15 22:36:15.867031] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:31.142 [2024-07-15 22:36:15.967297] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:07:32.515 test_start
00:07:32.515 oneshot
00:07:32.515 tick 100
00:07:32.515 tick 100
00:07:32.515 tick 250
00:07:32.515 tick 100
00:07:32.515 tick 100
00:07:32.515 tick 250
00:07:32.515 tick 100
00:07:32.515 tick 500
00:07:32.515 tick 100
00:07:32.515 tick 100
00:07:32.515 tick 250
00:07:32.515 tick 100
00:07:32.515 tick 100
00:07:32.515 test_end
00:07:32.515
00:07:32.515 real 0m1.349s
00:07:32.515 user 0m1.208s
00:07:32.515 sys 0m0.135s
00:07:32.515 22:36:17 event.event_reactor -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:32.515 22:36:17 event.event_reactor -- common/autotest_common.sh@10 -- # set +x
00:07:32.515 ************************************
00:07:32.515 END TEST event_reactor
00:07:32.515 ************************************
00:07:32.515 22:36:17 event -- common/autotest_common.sh@1142 -- # return 0
00:07:32.515 22:36:17 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1
00:07:32.515 22:36:17 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']'
00:07:32.515 22:36:17 event -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:32.515 22:36:17 event -- common/autotest_common.sh@10 -- # set +x
00:07:32.515 ************************************
00:07:32.515 START TEST event_reactor_perf
00:07:32.515 ************************************
00:07:32.515 22:36:17 event.event_reactor_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1
00:07:32.515 [2024-07-15 22:36:17.171472] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
00:07:32.515 [2024-07-15 22:36:17.171540] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2665122 ]
00:07:32.515 [2024-07-15 22:36:17.300792] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:32.515 [2024-07-15 22:36:17.405478] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:07:33.888 test_start
00:07:33.888 test_end
00:07:33.888 Performance: 328087 events per second
00:07:33.888
00:07:33.888 real 0m1.355s
00:07:33.888 user 0m1.209s
00:07:33.888 sys 0m0.140s
00:07:33.888 22:36:18 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:33.888 22:36:18 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x
00:07:33.888 ************************************
00:07:33.888 END TEST event_reactor_perf
00:07:33.888 ************************************
00:07:33.888 22:36:18 event -- common/autotest_common.sh@1142 -- # return 0
00:07:33.888 22:36:18 event -- event/event.sh@49 -- # uname -s
00:07:33.888 22:36:18 event -- event/event.sh@49 -- # '[' Linux = Linux ']'
00:07:33.888 22:36:18 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh
00:07:33.888 22:36:18 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:07:33.888 22:36:18 event -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:33.888 22:36:18 event -- common/autotest_common.sh@10 -- # set +x
00:07:33.888 ************************************
00:07:33.888 START TEST event_scheduler
00:07:33.888 ************************************
00:07:33.888 22:36:18 event.event_scheduler -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh
00:07:33.888 * Looking for test storage...
00:07:33.888 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler
00:07:33.888 22:36:18 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd
00:07:33.888 22:36:18 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=2665388
00:07:33.888 22:36:18 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT
00:07:33.888 22:36:18 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f
00:07:33.888 22:36:18 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 2665388
00:07:33.888 22:36:18 event.event_scheduler -- common/autotest_common.sh@829 -- # '[' -z 2665388 ']'
00:07:33.888 22:36:18 event.event_scheduler -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:07:33.888 22:36:18 event.event_scheduler -- common/autotest_common.sh@834 -- # local max_retries=100
00:07:33.888 22:36:18 event.event_scheduler -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:07:33.888 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:07:33.888 22:36:18 event.event_scheduler -- common/autotest_common.sh@838 -- # xtrace_disable
00:07:33.888 22:36:18 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:07:33.888 [2024-07-15 22:36:18.758694] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
00:07:33.888 [2024-07-15 22:36:18.758772] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2665388 ]
00:07:34.148 [2024-07-15 22:36:18.952558] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4
00:07:34.407 [2024-07-15 22:36:19.145794] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:07:34.407 [2024-07-15 22:36:19.145884] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:07:34.407 [2024-07-15 22:36:19.146001] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3
00:07:34.407 [2024-07-15 22:36:19.146013] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2
00:07:34.407 22:36:19 event.event_scheduler -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:07:34.407 22:36:19 event.event_scheduler -- common/autotest_common.sh@862 -- # return 0
00:07:34.407 22:36:19 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic
00:07:34.407 22:36:19 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.407 22:36:19 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:07:34.407 [2024-07-15 22:36:19.231647] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings
00:07:34.407 [2024-07-15 22:36:19.231699] scheduler_dynamic.c: 270:init: *NOTICE*: Unable to initialize dpdk governor
00:07:34.407 [2024-07-15 22:36:19.231733] scheduler_dynamic.c: 416:set_opts: *NOTICE*: Setting scheduler load limit to 20
00:07:34.407 [2024-07-15 22:36:19.231759] scheduler_dynamic.c: 418:set_opts: *NOTICE*: Setting scheduler core limit to 80
00:07:34.407 [2024-07-15 22:36:19.231783] scheduler_dynamic.c: 420:set_opts: *NOTICE*: Setting scheduler core busy to 95
00:07:34.407 22:36:19 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:34.407 22:36:19 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init
00:07:34.407 22:36:19 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.407 22:36:19 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:07:34.667 [2024-07-15 22:36:19.379339] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started.
00:07:34.667 22:36:19 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:34.667 22:36:19 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread
00:07:34.667 22:36:19 event.event_scheduler -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:07:34.667 22:36:19 event.event_scheduler -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:34.667 22:36:19 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:07:34.667 ************************************
00:07:34.667 START TEST scheduler_create_thread
00:07:34.667 ************************************
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1123 -- # scheduler_create_thread
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:34.667 2
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:34.667 3
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:34.667 4
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:34.667 5
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:34.667 6
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:34.667 7
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:34.667 8
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:34.667 9
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:34.667 10
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:34.667 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:35.235 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:35.235 22:36:19 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11
00:07:35.235 22:36:19 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50
00:07:35.235 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:35.235 22:36:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:36.172 22:36:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:07:36.172 22:36:20 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100
00:07:36.172 22:36:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:07:36.172 22:36:20 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:07:37.108 22:36:21 event.event_scheduler.scheduler_create_thread --
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:37.108 22:36:21 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:07:37.108 22:36:21 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:07:37.108 22:36:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:37.108 22:36:21 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:38.041 22:36:22 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:38.041 00:07:38.041 real 0m3.233s 00:07:38.041 user 0m0.026s 00:07:38.041 sys 0m0.006s 00:07:38.041 22:36:22 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:38.041 22:36:22 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:38.041 ************************************ 00:07:38.041 END TEST scheduler_create_thread 00:07:38.041 ************************************ 00:07:38.041 22:36:22 event.event_scheduler -- common/autotest_common.sh@1142 -- # return 0 00:07:38.041 22:36:22 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:07:38.041 22:36:22 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 2665388 00:07:38.041 22:36:22 event.event_scheduler -- common/autotest_common.sh@948 -- # '[' -z 2665388 ']' 00:07:38.041 22:36:22 event.event_scheduler -- common/autotest_common.sh@952 -- # kill -0 2665388 00:07:38.041 22:36:22 event.event_scheduler -- common/autotest_common.sh@953 -- # uname 00:07:38.041 22:36:22 event.event_scheduler -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:38.041 22:36:22 event.event_scheduler -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2665388 00:07:38.041 22:36:22 event.event_scheduler -- 
common/autotest_common.sh@954 -- # process_name=reactor_2 00:07:38.041 22:36:22 event.event_scheduler -- common/autotest_common.sh@958 -- # '[' reactor_2 = sudo ']' 00:07:38.041 22:36:22 event.event_scheduler -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2665388' 00:07:38.041 killing process with pid 2665388 00:07:38.041 22:36:22 event.event_scheduler -- common/autotest_common.sh@967 -- # kill 2665388 00:07:38.041 22:36:22 event.event_scheduler -- common/autotest_common.sh@972 -- # wait 2665388 00:07:38.300 [2024-07-15 22:36:23.034646] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:07:38.559 00:07:38.559 real 0m4.849s 00:07:38.559 user 0m8.242s 00:07:38.559 sys 0m0.630s 00:07:38.559 22:36:23 event.event_scheduler -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:38.559 22:36:23 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:38.559 ************************************ 00:07:38.559 END TEST event_scheduler 00:07:38.559 ************************************ 00:07:38.818 22:36:23 event -- common/autotest_common.sh@1142 -- # return 0 00:07:38.818 22:36:23 event -- event/event.sh@51 -- # modprobe -n nbd 00:07:38.818 22:36:23 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:07:38.818 22:36:23 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:38.818 22:36:23 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:38.818 22:36:23 event -- common/autotest_common.sh@10 -- # set +x 00:07:38.818 ************************************ 00:07:38.818 START TEST app_repeat 00:07:38.818 ************************************ 00:07:38.818 22:36:23 event.app_repeat -- common/autotest_common.sh@1123 -- # app_repeat_test 00:07:38.818 22:36:23 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:38.818 22:36:23 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:38.818 22:36:23 
event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:07:38.818 22:36:23 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:38.818 22:36:23 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:07:38.818 22:36:23 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:07:38.818 22:36:23 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:07:38.818 22:36:23 event.app_repeat -- event/event.sh@19 -- # repeat_pid=2666094 00:07:38.818 22:36:23 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:07:38.818 22:36:23 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:07:38.818 22:36:23 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 2666094' 00:07:38.818 Process app_repeat pid: 2666094 00:07:38.818 22:36:23 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:38.818 22:36:23 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:07:38.818 spdk_app_start Round 0 00:07:38.818 22:36:23 event.app_repeat -- event/event.sh@25 -- # waitforlisten 2666094 /var/tmp/spdk-nbd.sock 00:07:38.818 22:36:23 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 2666094 ']' 00:07:38.818 22:36:23 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:38.818 22:36:23 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:38.818 22:36:23 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:38.818 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
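The trace above shows app_repeat being launched and the harness blocking on waitforlisten until the target answers on /var/tmp/spdk-nbd.sock. A minimal sketch of that polling pattern follows; the helper name is a reconstruction (the real helper lives in common/autotest_common.sh and checks for a UNIX socket with -S, while -e is used here so the sketch can also be exercised against a plain file):

```shell
# Sketch of the waitforlisten pattern: poll until the rpc socket path
# appears, bailing out early if the target process has died. max_retries
# mirrors the default of 100 visible in the trace.
waitforlisten_sketch() {
    local pid=$1 rpc_addr=${2:-/var/tmp/spdk-nbd.sock}
    local max_retries=100 i
    echo "Waiting for process to start up and listen on UNIX domain socket $rpc_addr..."
    for ((i = 0; i < max_retries; i++)); do
        kill -0 "$pid" 2>/dev/null || return 1   # target died during startup
        [ -e "$rpc_addr" ] && return 0           # real code tests -S (socket)
        sleep 0.1
    done
    return 1
}
```

In the log this gate is what separates `spdk_app_start Round 0` from the first bdev_malloc_create RPC: no RPC is sent until the socket exists and the pid is still alive.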
00:07:38.818 22:36:23 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:38.818 22:36:23 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:38.818 [2024-07-15 22:36:23.573805] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:07:38.818 [2024-07-15 22:36:23.573873] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2666094 ] 00:07:38.818 [2024-07-15 22:36:23.706658] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:39.077 [2024-07-15 22:36:23.810367] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:39.077 [2024-07-15 22:36:23.810371] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.643 22:36:24 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:39.643 22:36:24 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:39.643 22:36:24 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:39.901 Malloc0 00:07:40.159 22:36:24 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:40.159 Malloc1 00:07:40.418 22:36:25 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:40.418 22:36:25 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:40.418 22:36:25 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:40.418 22:36:25 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:40.418 22:36:25 event.app_repeat -- bdev/nbd_common.sh@92 
-- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:40.418 22:36:25 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:40.418 22:36:25 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:40.418 22:36:25 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:40.418 22:36:25 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:40.418 22:36:25 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:40.418 22:36:25 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:40.418 22:36:25 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:40.418 22:36:25 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:40.418 22:36:25 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:40.418 22:36:25 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:40.418 22:36:25 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:40.418 /dev/nbd0 00:07:40.677 22:36:25 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:40.677 22:36:25 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:40.677 22:36:25 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:40.677 22:36:25 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:40.677 22:36:25 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:40.677 22:36:25 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:40.677 22:36:25 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:40.677 22:36:25 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:40.677 22:36:25 event.app_repeat -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:40.677 22:36:25 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:40.677 22:36:25 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:40.677 1+0 records in 00:07:40.677 1+0 records out 00:07:40.677 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000313463 s, 13.1 MB/s 00:07:40.677 22:36:25 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:40.677 22:36:25 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:40.677 22:36:25 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:40.677 22:36:25 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:40.677 22:36:25 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:40.677 22:36:25 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:40.677 22:36:25 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:40.677 22:36:25 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:40.967 /dev/nbd1 00:07:40.967 22:36:25 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:40.967 22:36:25 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:40.967 22:36:25 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:40.967 22:36:25 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:40.967 22:36:25 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:40.967 22:36:25 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:40.967 22:36:25 event.app_repeat -- 
common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:40.967 22:36:25 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:40.967 22:36:25 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:40.967 22:36:25 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:40.967 22:36:25 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:40.967 1+0 records in 00:07:40.967 1+0 records out 00:07:40.967 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000511268 s, 8.0 MB/s 00:07:40.967 22:36:25 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:40.967 22:36:25 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:40.968 22:36:25 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:40.968 22:36:25 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:40.968 22:36:25 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:40.968 22:36:25 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:40.968 22:36:25 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:40.968 22:36:25 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:40.968 22:36:25 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:40.968 22:36:25 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:41.243 22:36:25 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:41.243 { 00:07:41.243 "nbd_device": "/dev/nbd0", 00:07:41.243 "bdev_name": "Malloc0" 00:07:41.243 }, 00:07:41.243 { 00:07:41.243 
"nbd_device": "/dev/nbd1", 00:07:41.243 "bdev_name": "Malloc1" 00:07:41.243 } 00:07:41.243 ]' 00:07:41.243 22:36:25 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:41.243 { 00:07:41.243 "nbd_device": "/dev/nbd0", 00:07:41.243 "bdev_name": "Malloc0" 00:07:41.243 }, 00:07:41.243 { 00:07:41.243 "nbd_device": "/dev/nbd1", 00:07:41.243 "bdev_name": "Malloc1" 00:07:41.243 } 00:07:41.243 ]' 00:07:41.243 22:36:25 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:41.243 22:36:25 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:41.243 /dev/nbd1' 00:07:41.243 22:36:25 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:41.243 /dev/nbd1' 00:07:41.243 22:36:25 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:41.243 22:36:25 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:41.243 22:36:25 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:41.243 22:36:25 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:41.243 22:36:25 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:41.243 22:36:25 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:41.243 22:36:25 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:41.243 22:36:25 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:41.243 22:36:25 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:41.243 22:36:25 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:41.243 22:36:25 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:41.243 22:36:25 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:41.243 256+0 records in 00:07:41.243 256+0 
records out 00:07:41.243 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0108361 s, 96.8 MB/s 00:07:41.243 22:36:25 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:41.243 22:36:25 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:41.243 256+0 records in 00:07:41.243 256+0 records out 00:07:41.243 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.030045 s, 34.9 MB/s 00:07:41.243 22:36:26 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:41.243 22:36:26 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:41.243 256+0 records in 00:07:41.243 256+0 records out 00:07:41.243 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0316287 s, 33.2 MB/s 00:07:41.243 22:36:26 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:41.243 22:36:26 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:41.243 22:36:26 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:41.243 22:36:26 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:41.243 22:36:26 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:41.243 22:36:26 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:41.243 22:36:26 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:41.243 22:36:26 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:41.243 22:36:26 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:41.243 22:36:26 event.app_repeat -- bdev/nbd_common.sh@82 -- # for 
i in "${nbd_list[@]}" 00:07:41.243 22:36:26 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:41.243 22:36:26 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:41.243 22:36:26 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:41.243 22:36:26 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:41.243 22:36:26 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:41.243 22:36:26 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:41.243 22:36:26 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:41.243 22:36:26 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:41.243 22:36:26 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:41.502 22:36:26 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:41.502 22:36:26 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:41.502 22:36:26 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:41.502 22:36:26 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:41.502 22:36:26 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:41.502 22:36:26 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:41.502 22:36:26 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:41.502 22:36:26 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:41.502 22:36:26 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:41.502 22:36:26 event.app_repeat -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:41.761 22:36:26 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:41.761 22:36:26 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:41.761 22:36:26 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:41.761 22:36:26 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:41.761 22:36:26 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:41.761 22:36:26 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:41.761 22:36:26 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:41.761 22:36:26 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:41.761 22:36:26 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:41.761 22:36:26 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:41.761 22:36:26 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:42.021 22:36:26 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:42.021 22:36:26 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:42.021 22:36:26 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:42.280 22:36:26 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:42.280 22:36:26 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:42.280 22:36:26 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:42.280 22:36:26 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:42.280 22:36:26 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:42.280 22:36:26 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:42.280 22:36:26 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
count=0 00:07:42.280 22:36:26 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:42.280 22:36:26 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:42.280 22:36:26 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:42.280 22:36:27 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:42.539 [2024-07-15 22:36:27.406302] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:42.797 [2024-07-15 22:36:27.505469] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:42.797 [2024-07-15 22:36:27.505472] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.797 [2024-07-15 22:36:27.557567] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:42.797 [2024-07-15 22:36:27.557619] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:45.326 22:36:30 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:45.326 22:36:30 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:45.326 spdk_app_start Round 1 00:07:45.326 22:36:30 event.app_repeat -- event/event.sh@25 -- # waitforlisten 2666094 /var/tmp/spdk-nbd.sock 00:07:45.326 22:36:30 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 2666094 ']' 00:07:45.326 22:36:30 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:45.326 22:36:30 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:45.326 22:36:30 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:45.326 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
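The nbd_get_count sequence above (an empty `[]` once the devices are stopped, yielding count=0) parses the nbd_get_disks JSON with jq and counts matches with `grep -c`. A small reconstruction of that counting step, with an assumed function name (requires jq, as the harness does):

```shell
# Count attached nbd devices from nbd_get_disks JSON, as in the trace:
# jq pulls each .nbd_device field, grep -c counts /dev/nbd matches.
# grep -c exits non-zero when the count is 0, so `|| true` keeps the
# pipeline from tripping `set -e` while still printing "0".
nbd_count_from_json() {
    local nbd_disks_json=$1
    echo "$nbd_disks_json" | jq -r '.[] | .nbd_device' | grep -c /dev/nbd || true
}
```

This is why the log shows both `count=2` (two disks attached) and `count=0` (after nbd_stop_disk) from the same parsing path.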
00:07:45.326 22:36:30 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:45.326 22:36:30 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:45.585 22:36:30 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:45.585 22:36:30 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:45.585 22:36:30 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:45.844 Malloc0 00:07:45.844 22:36:30 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:46.101 Malloc1 00:07:46.101 22:36:30 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:46.101 22:36:30 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:46.101 22:36:30 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:46.101 22:36:30 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:46.101 22:36:30 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:46.101 22:36:30 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:46.101 22:36:30 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:46.101 22:36:30 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:46.101 22:36:30 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:46.101 22:36:30 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:46.101 22:36:30 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:46.101 22:36:30 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:07:46.101 22:36:30 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:46.101 22:36:30 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:46.101 22:36:30 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:46.101 22:36:30 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:46.358 /dev/nbd0 00:07:46.358 22:36:31 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:46.358 22:36:31 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:46.358 22:36:31 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:46.358 22:36:31 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:46.358 22:36:31 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:46.358 22:36:31 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:46.358 22:36:31 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:46.358 22:36:31 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:46.358 22:36:31 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:46.358 22:36:31 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:46.358 22:36:31 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:46.358 1+0 records in 00:07:46.358 1+0 records out 00:07:46.358 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000255311 s, 16.0 MB/s 00:07:46.358 22:36:31 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:46.616 22:36:31 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:46.616 22:36:31 event.app_repeat -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:46.616 22:36:31 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:46.616 22:36:31 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:46.616 22:36:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:46.616 22:36:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:46.616 22:36:31 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:46.616 /dev/nbd1 00:07:46.874 22:36:31 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:46.874 22:36:31 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:46.874 22:36:31 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:46.874 22:36:31 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:46.874 22:36:31 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:46.874 22:36:31 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:46.874 22:36:31 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:46.874 22:36:31 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:46.874 22:36:31 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:46.874 22:36:31 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:46.874 22:36:31 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:46.874 1+0 records in 00:07:46.874 1+0 records out 00:07:46.874 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000277787 s, 14.7 MB/s 00:07:46.874 22:36:31 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:46.874 22:36:31 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:46.874 22:36:31 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:46.874 22:36:31 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:46.874 22:36:31 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:46.874 22:36:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:46.874 22:36:31 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:46.874 22:36:31 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:46.874 22:36:31 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:46.874 22:36:31 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:47.133 22:36:31 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:47.133 { 00:07:47.133 "nbd_device": "/dev/nbd0", 00:07:47.133 "bdev_name": "Malloc0" 00:07:47.133 }, 00:07:47.133 { 00:07:47.133 "nbd_device": "/dev/nbd1", 00:07:47.133 "bdev_name": "Malloc1" 00:07:47.133 } 00:07:47.133 ]' 00:07:47.133 22:36:31 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:47.133 { 00:07:47.133 "nbd_device": "/dev/nbd0", 00:07:47.133 "bdev_name": "Malloc0" 00:07:47.133 }, 00:07:47.133 { 00:07:47.133 "nbd_device": "/dev/nbd1", 00:07:47.133 "bdev_name": "Malloc1" 00:07:47.133 } 00:07:47.133 ]' 00:07:47.133 22:36:31 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:47.133 22:36:31 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:47.133 /dev/nbd1' 00:07:47.133 22:36:31 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:47.133 /dev/nbd1' 00:07:47.133 
22:36:31 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:47.133 22:36:31 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:47.133 22:36:31 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:47.133 22:36:31 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:47.133 22:36:31 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:47.133 22:36:31 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:47.133 22:36:31 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:47.133 22:36:31 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:47.134 256+0 records in 00:07:47.134 256+0 records out 00:07:47.134 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103541 s, 101 MB/s 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:47.134 256+0 records in 00:07:47.134 256+0 records out 00:07:47.134 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.030135 s, 34.8 MB/s 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:47.134 256+0 records in 00:07:47.134 256+0 records out 00:07:47.134 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0316797 s, 33.1 MB/s 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:47.134 22:36:31 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:47.391 22:36:32 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:47.391 22:36:32 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:47.391 22:36:32 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:47.391 22:36:32 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:47.391 22:36:32 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:47.391 22:36:32 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:47.391 22:36:32 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:47.391 22:36:32 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:47.391 22:36:32 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:47.391 22:36:32 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:47.648 22:36:32 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:47.648 22:36:32 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:47.648 22:36:32 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:47.648 22:36:32 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:47.648 22:36:32 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:47.648 22:36:32 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:47.648 22:36:32 event.app_repeat -- bdev/nbd_common.sh@41 
-- # break 00:07:47.648 22:36:32 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:47.648 22:36:32 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:47.648 22:36:32 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:47.648 22:36:32 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:47.906 22:36:32 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:47.906 22:36:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:47.906 22:36:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:47.906 22:36:32 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:47.906 22:36:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:47.906 22:36:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:47.906 22:36:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:47.906 22:36:32 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:47.906 22:36:32 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:48.164 22:36:32 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:48.164 22:36:32 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:48.164 22:36:32 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:48.164 22:36:32 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:48.423 22:36:33 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:48.423 [2024-07-15 22:36:33.315386] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:48.682 [2024-07-15 22:36:33.412788] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:48.682 [2024-07-15 22:36:33.412791] reactor.c: 941:reactor_run: 
*NOTICE*: Reactor started on core 0 00:07:48.682 [2024-07-15 22:36:33.464415] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:48.682 [2024-07-15 22:36:33.464463] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:51.215 22:36:36 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:51.215 22:36:36 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:51.215 spdk_app_start Round 2 00:07:51.215 22:36:36 event.app_repeat -- event/event.sh@25 -- # waitforlisten 2666094 /var/tmp/spdk-nbd.sock 00:07:51.215 22:36:36 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 2666094 ']' 00:07:51.215 22:36:36 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:51.215 22:36:36 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:51.215 22:36:36 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:51.215 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:51.215 22:36:36 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:51.215 22:36:36 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:51.473 22:36:36 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:51.473 22:36:36 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:51.473 22:36:36 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:51.731 Malloc0 00:07:51.731 22:36:36 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:51.990 Malloc1 00:07:52.249 22:36:36 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:52.249 22:36:36 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:52.249 22:36:36 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:52.249 22:36:36 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:52.249 22:36:36 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:52.249 22:36:36 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:52.249 22:36:36 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:52.249 22:36:36 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:52.249 22:36:36 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:52.249 22:36:36 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:52.249 22:36:36 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:52.250 22:36:36 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:07:52.250 22:36:36 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:52.250 22:36:36 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:52.250 22:36:36 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:52.250 22:36:36 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:52.250 /dev/nbd0 00:07:52.509 22:36:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:52.509 22:36:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:52.509 22:36:37 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:07:52.509 22:36:37 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:52.509 22:36:37 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:52.509 22:36:37 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:52.509 22:36:37 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:07:52.509 22:36:37 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:52.509 22:36:37 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:52.509 22:36:37 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:52.509 22:36:37 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:52.509 1+0 records in 00:07:52.509 1+0 records out 00:07:52.509 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000353364 s, 11.6 MB/s 00:07:52.509 22:36:37 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:52.509 22:36:37 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:52.509 22:36:37 event.app_repeat -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:52.509 22:36:37 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:52.509 22:36:37 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:52.509 22:36:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:52.509 22:36:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:52.509 22:36:37 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:52.768 /dev/nbd1 00:07:52.768 22:36:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:52.768 22:36:37 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:52.768 22:36:37 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:07:52.768 22:36:37 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:07:52.768 22:36:37 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:07:52.768 22:36:37 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:07:52.768 22:36:37 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:07:52.768 22:36:37 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:07:52.768 22:36:37 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:07:52.768 22:36:37 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:07:52.768 22:36:37 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:52.768 1+0 records in 00:07:52.768 1+0 records out 00:07:52.768 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271538 s, 15.1 MB/s 00:07:52.768 22:36:37 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:52.768 22:36:37 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:07:52.768 22:36:37 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:52.768 22:36:37 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:07:52.768 22:36:37 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:07:52.768 22:36:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:52.768 22:36:37 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:52.768 22:36:37 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:52.768 22:36:37 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:52.768 22:36:37 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:53.027 { 00:07:53.027 "nbd_device": "/dev/nbd0", 00:07:53.027 "bdev_name": "Malloc0" 00:07:53.027 }, 00:07:53.027 { 00:07:53.027 "nbd_device": "/dev/nbd1", 00:07:53.027 "bdev_name": "Malloc1" 00:07:53.027 } 00:07:53.027 ]' 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:53.027 { 00:07:53.027 "nbd_device": "/dev/nbd0", 00:07:53.027 "bdev_name": "Malloc0" 00:07:53.027 }, 00:07:53.027 { 00:07:53.027 "nbd_device": "/dev/nbd1", 00:07:53.027 "bdev_name": "Malloc1" 00:07:53.027 } 00:07:53.027 ]' 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:53.027 /dev/nbd1' 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:53.027 /dev/nbd1' 00:07:53.027 
22:36:37 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:53.027 256+0 records in 00:07:53.027 256+0 records out 00:07:53.027 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0109399 s, 95.8 MB/s 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:53.027 256+0 records in 00:07:53.027 256+0 records out 00:07:53.027 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0303062 s, 34.6 MB/s 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:53.027 256+0 records in 00:07:53.027 256+0 records out 00:07:53.027 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0315794 s, 33.2 MB/s 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:53.027 22:36:37 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:53.286 22:36:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:53.286 22:36:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:53.286 22:36:38 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:53.286 22:36:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:53.286 22:36:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:53.286 22:36:38 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:53.286 22:36:38 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:53.286 22:36:38 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:53.286 22:36:38 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:53.286 22:36:38 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:53.852 22:36:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:53.852 22:36:38 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:53.852 22:36:38 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:53.852 22:36:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:53.852 22:36:38 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:53.852 22:36:38 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:53.852 22:36:38 event.app_repeat -- bdev/nbd_common.sh@41 
-- # break 00:07:53.852 22:36:38 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:53.852 22:36:38 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:53.852 22:36:38 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:53.852 22:36:38 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:53.852 22:36:38 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:53.852 22:36:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:53.852 22:36:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:54.111 22:36:38 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:54.111 22:36:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:54.111 22:36:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:54.111 22:36:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:54.111 22:36:38 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:54.111 22:36:38 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:54.111 22:36:38 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:54.111 22:36:38 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:54.111 22:36:38 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:54.111 22:36:38 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:54.369 22:36:39 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:54.628 [2024-07-15 22:36:39.296025] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:54.628 [2024-07-15 22:36:39.394966] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:54.628 [2024-07-15 22:36:39.394969] reactor.c: 941:reactor_run: 
*NOTICE*: Reactor started on core 0 00:07:54.628 [2024-07-15 22:36:39.445942] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:54.628 [2024-07-15 22:36:39.445995] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:57.912 22:36:42 event.app_repeat -- event/event.sh@38 -- # waitforlisten 2666094 /var/tmp/spdk-nbd.sock 00:07:57.912 22:36:42 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 2666094 ']' 00:07:57.912 22:36:42 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:57.912 22:36:42 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:57.912 22:36:42 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:57.912 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:07:57.912 22:36:42 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:57.912 22:36:42 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:57.912 22:36:42 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:57.912 22:36:42 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:07:57.912 22:36:42 event.app_repeat -- event/event.sh@39 -- # killprocess 2666094 00:07:57.912 22:36:42 event.app_repeat -- common/autotest_common.sh@948 -- # '[' -z 2666094 ']' 00:07:57.912 22:36:42 event.app_repeat -- common/autotest_common.sh@952 -- # kill -0 2666094 00:07:57.912 22:36:42 event.app_repeat -- common/autotest_common.sh@953 -- # uname 00:07:57.912 22:36:42 event.app_repeat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:57.912 22:36:42 event.app_repeat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2666094 00:07:57.912 22:36:42 event.app_repeat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:57.912 22:36:42 event.app_repeat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:57.912 22:36:42 event.app_repeat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2666094' 00:07:57.912 killing process with pid 2666094 00:07:57.912 22:36:42 event.app_repeat -- common/autotest_common.sh@967 -- # kill 2666094 00:07:57.912 22:36:42 event.app_repeat -- common/autotest_common.sh@972 -- # wait 2666094 00:07:57.912 spdk_app_start is called in Round 0. 00:07:57.912 Shutdown signal received, stop current app iteration 00:07:57.912 Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 reinitialization... 00:07:57.912 spdk_app_start is called in Round 1. 00:07:57.912 Shutdown signal received, stop current app iteration 00:07:57.912 Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 reinitialization... 00:07:57.912 spdk_app_start is called in Round 2. 
00:07:57.912 Shutdown signal received, stop current app iteration 00:07:57.912 Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 reinitialization... 00:07:57.912 spdk_app_start is called in Round 3. 00:07:57.912 Shutdown signal received, stop current app iteration 00:07:57.912 22:36:42 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:57.912 22:36:42 event.app_repeat -- event/event.sh@42 -- # return 0 00:07:57.912 00:07:57.912 real 0m19.059s 00:07:57.912 user 0m41.401s 00:07:57.912 sys 0m3.934s 00:07:57.912 22:36:42 event.app_repeat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:57.912 22:36:42 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:57.912 ************************************ 00:07:57.912 END TEST app_repeat 00:07:57.912 ************************************ 00:07:57.912 22:36:42 event -- common/autotest_common.sh@1142 -- # return 0 00:07:57.912 22:36:42 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:57.912 00:07:57.912 real 0m28.514s 00:07:57.912 user 0m56.462s 00:07:57.912 sys 0m5.367s 00:07:57.912 22:36:42 event -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:57.913 22:36:42 event -- common/autotest_common.sh@10 -- # set +x 00:07:57.913 ************************************ 00:07:57.913 END TEST event 00:07:57.913 ************************************ 00:07:57.913 22:36:42 -- common/autotest_common.sh@1142 -- # return 0 00:07:57.913 22:36:42 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:57.913 22:36:42 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:57.913 22:36:42 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:57.913 22:36:42 -- common/autotest_common.sh@10 -- # set +x 00:07:57.913 ************************************ 00:07:57.913 START TEST thread 00:07:57.913 ************************************ 00:07:57.913 22:36:42 thread -- common/autotest_common.sh@1123 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:07:57.913 * Looking for test storage... 00:07:57.913 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:07:57.913 22:36:42 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:57.913 22:36:42 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:57.913 22:36:42 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:57.913 22:36:42 thread -- common/autotest_common.sh@10 -- # set +x 00:07:58.172 ************************************ 00:07:58.172 START TEST thread_poller_perf 00:07:58.172 ************************************ 00:07:58.172 22:36:42 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:58.172 [2024-07-15 22:36:42.885982] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:07:58.172 [2024-07-15 22:36:42.886052] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2668813 ] 00:07:58.172 [2024-07-15 22:36:43.021142] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:58.431 [2024-07-15 22:36:43.132172] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.431 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:07:59.366 ====================================== 00:07:59.367 busy:2314563186 (cyc) 00:07:59.367 total_run_count: 266000 00:07:59.367 tsc_hz: 2300000000 (cyc) 00:07:59.367 ====================================== 00:07:59.367 poller_cost: 8701 (cyc), 3783 (nsec) 00:07:59.367 00:07:59.367 real 0m1.380s 00:07:59.367 user 0m1.233s 00:07:59.367 sys 0m0.141s 00:07:59.367 22:36:44 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:59.367 22:36:44 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:59.367 ************************************ 00:07:59.367 END TEST thread_poller_perf 00:07:59.367 ************************************ 00:07:59.636 22:36:44 thread -- common/autotest_common.sh@1142 -- # return 0 00:07:59.636 22:36:44 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:59.636 22:36:44 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:07:59.636 22:36:44 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:59.636 22:36:44 thread -- common/autotest_common.sh@10 -- # set +x 00:07:59.636 ************************************ 00:07:59.636 START TEST thread_poller_perf 00:07:59.636 ************************************ 00:07:59.636 22:36:44 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:59.636 [2024-07-15 22:36:44.351181] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:07:59.636 [2024-07-15 22:36:44.351244] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2669048 ] 00:07:59.636 [2024-07-15 22:36:44.478473] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:59.895 [2024-07-15 22:36:44.576641] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:59.895 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:08:00.888 ====================================== 00:08:00.888 busy:2302362014 (cyc) 00:08:00.888 total_run_count: 3495000 00:08:00.888 tsc_hz: 2300000000 (cyc) 00:08:00.888 ====================================== 00:08:00.888 poller_cost: 658 (cyc), 286 (nsec) 00:08:00.888 00:08:00.888 real 0m1.346s 00:08:00.888 user 0m1.213s 00:08:00.888 sys 0m0.128s 00:08:00.888 22:36:45 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:00.888 22:36:45 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:08:00.888 ************************************ 00:08:00.888 END TEST thread_poller_perf 00:08:00.888 ************************************ 00:08:00.888 22:36:45 thread -- common/autotest_common.sh@1142 -- # return 0 00:08:00.888 22:36:45 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:08:00.888 00:08:00.888 real 0m2.993s 00:08:00.888 user 0m2.532s 00:08:00.888 sys 0m0.469s 00:08:00.888 22:36:45 thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:00.888 22:36:45 thread -- common/autotest_common.sh@10 -- # set +x 00:08:00.888 ************************************ 00:08:00.888 END TEST thread 00:08:00.888 ************************************ 00:08:00.888 22:36:45 -- common/autotest_common.sh@1142 -- # return 0 00:08:00.888 22:36:45 -- spdk/autotest.sh@183 -- # run_test accel 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:08:00.888 22:36:45 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:00.888 22:36:45 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:00.888 22:36:45 -- common/autotest_common.sh@10 -- # set +x 00:08:00.888 ************************************ 00:08:00.888 START TEST accel 00:08:00.889 ************************************ 00:08:00.889 22:36:45 accel -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:08:01.148 * Looking for test storage... 00:08:01.148 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:08:01.148 22:36:45 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:08:01.148 22:36:45 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:08:01.148 22:36:45 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:01.148 22:36:45 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=2669404 00:08:01.148 22:36:45 accel -- accel/accel.sh@63 -- # waitforlisten 2669404 00:08:01.148 22:36:45 accel -- common/autotest_common.sh@829 -- # '[' -z 2669404 ']' 00:08:01.148 22:36:45 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:01.148 22:36:45 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:01.148 22:36:45 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:08:01.148 22:36:45 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:01.148 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
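The two thread_poller_perf runs above print raw TSC counters (busy cycles, total_run_count, tsc_hz) and a derived poller_cost. A minimal sketch of that arithmetic, assuming poller_cost is simply busy cycles divided by run count and converted to nanoseconds via tsc_hz (the run-1 numbers reproduce the logged 8701 cyc / 3783 nsec):

```shell
# Sketch of the poller_cost derivation, using the raw counters from
# the first poller_perf run (-l 1) logged above. The exact formula is
# an assumption about how poller_perf builds its summary line.
busy=2314563186      # busy: cycles spent running pollers (run 1)
runs=266000          # total_run_count (run 1)
tsc_hz=2300000000    # tsc_hz: TSC ticks per second

cost_cyc=$(( busy / runs ))                      # cycles per poller call
cost_nsec=$(( cost_cyc * 1000000000 / tsc_hz ))  # same cost in nanoseconds
echo "poller_cost: ${cost_cyc} (cyc), ${cost_nsec} (nsec)"
```

The run-2 counters (busy 2302362014, total_run_count 3495000) give 658 cyc / 286 nsec by the same arithmetic, matching the 0-period summary line below.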
00:08:01.148 22:36:45 accel -- accel/accel.sh@61 -- # build_accel_config 00:08:01.148 22:36:45 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:01.148 22:36:45 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:01.148 22:36:45 accel -- common/autotest_common.sh@10 -- # set +x 00:08:01.148 22:36:45 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:01.148 22:36:45 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:01.148 22:36:45 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:01.148 22:36:45 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:01.148 22:36:45 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:01.148 22:36:45 accel -- accel/accel.sh@41 -- # jq -r . 00:08:01.148 [2024-07-15 22:36:45.971256] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:08:01.148 [2024-07-15 22:36:45.971329] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2669404 ] 00:08:01.407 [2024-07-15 22:36:46.100726] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:01.407 [2024-07-15 22:36:46.202941] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:02.345 22:36:46 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:02.345 22:36:46 accel -- common/autotest_common.sh@862 -- # return 0 00:08:02.345 22:36:46 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:08:02.345 22:36:46 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:08:02.345 22:36:46 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:08:02.345 22:36:46 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:08:02.345 22:36:46 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:08:02.345 22:36:46 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:08:02.345 22:36:46 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:08:02.345 22:36:46 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:02.345 22:36:46 accel -- common/autotest_common.sh@10 -- # set +x 00:08:02.345 22:36:46 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:02.345 22:36:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:02.345 22:36:46 accel -- accel/accel.sh@72 -- # IFS== 00:08:02.345 22:36:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:02.346 22:36:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:02.346 22:36:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # IFS== 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:02.346 22:36:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:02.346 22:36:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # IFS== 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:02.346 22:36:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:02.346 22:36:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # IFS== 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:02.346 22:36:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:02.346 22:36:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # IFS== 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:02.346 22:36:46 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:08:02.346 22:36:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # IFS== 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:02.346 22:36:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:02.346 22:36:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # IFS== 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:02.346 22:36:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:02.346 22:36:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # IFS== 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:02.346 22:36:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:02.346 22:36:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # IFS== 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:02.346 22:36:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:02.346 22:36:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # IFS== 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:02.346 22:36:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:02.346 22:36:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # IFS== 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:02.346 22:36:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:02.346 22:36:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # 
IFS== 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:02.346 22:36:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:02.346 22:36:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # IFS== 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:02.346 22:36:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:02.346 22:36:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # IFS== 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:02.346 22:36:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:02.346 22:36:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # IFS== 00:08:02.346 22:36:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:02.346 22:36:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:02.346 22:36:46 accel -- accel/accel.sh@75 -- # killprocess 2669404 00:08:02.346 22:36:46 accel -- common/autotest_common.sh@948 -- # '[' -z 2669404 ']' 00:08:02.346 22:36:46 accel -- common/autotest_common.sh@952 -- # kill -0 2669404 00:08:02.346 22:36:46 accel -- common/autotest_common.sh@953 -- # uname 00:08:02.346 22:36:46 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:02.346 22:36:46 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2669404 00:08:02.346 22:36:46 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:02.346 22:36:46 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:02.346 22:36:46 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2669404' 00:08:02.346 killing process with pid 2669404 00:08:02.346 22:36:46 accel -- common/autotest_common.sh@967 -- # kill 2669404 00:08:02.346 
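The expected-opcode table built at accel/accel.sh@70-73 above comes from piping the accel_get_opc_assignments RPC output through jq and splitting each `opc=module` pair on `=`. A sketch against a hypothetical two-opcode JSON map (the JSON literal here is invented; the jq filter and the `IFS==` split are the ones visible in the log):

```shell
# Hypothetical opcode map standing in for the accel_get_opc_assignments
# RPC reply; the jq filter is the one used at accel.sh@70 above.
json='{"copy":"software","crc32c":"software"}'
exp_opcs=($(echo "$json" | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'))

declare -A expected_opcs
for opc_opt in "${exp_opcs[@]}"; do
  # Split "copy=software" into opc and module, as accel.sh@72 does.
  IFS== read -r opc module <<< "$opc_opt"
  expected_opcs["$opc"]=$module
done
echo "${expected_opcs[copy]}"
```

This is why the log shows one `IFS==` / `read -r opc module` pair per opcode: the loop walks every entry the RPC returned and records its assigned module (all `software` in this run).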
22:36:46 accel -- common/autotest_common.sh@972 -- # wait 2669404 00:08:02.606 22:36:47 accel -- accel/accel.sh@76 -- # trap - ERR 00:08:02.606 22:36:47 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:08:02.606 22:36:47 accel -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:08:02.606 22:36:47 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:02.606 22:36:47 accel -- common/autotest_common.sh@10 -- # set +x 00:08:02.606 22:36:47 accel.accel_help -- common/autotest_common.sh@1123 -- # accel_perf -h 00:08:02.606 22:36:47 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:08:02.606 22:36:47 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:08:02.606 22:36:47 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:02.606 22:36:47 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:02.606 22:36:47 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:02.606 22:36:47 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:02.606 22:36:47 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:02.606 22:36:47 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:08:02.606 22:36:47 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
00:08:02.606 22:36:47 accel.accel_help -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:02.606 22:36:47 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:08:02.606 22:36:47 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:02.606 22:36:47 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:08:02.606 22:36:47 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:02.606 22:36:47 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:02.606 22:36:47 accel -- common/autotest_common.sh@10 -- # set +x 00:08:02.606 ************************************ 00:08:02.606 START TEST accel_missing_filename 00:08:02.606 ************************************ 00:08:02.606 22:36:47 accel.accel_missing_filename -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress 00:08:02.606 22:36:47 accel.accel_missing_filename -- common/autotest_common.sh@648 -- # local es=0 00:08:02.606 22:36:47 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress 00:08:02.606 22:36:47 accel.accel_missing_filename -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:08:02.606 22:36:47 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:02.606 22:36:47 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # type -t accel_perf 00:08:02.866 22:36:47 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:02.866 22:36:47 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress 00:08:02.866 22:36:47 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:08:02.866 22:36:47 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:08:02.866 22:36:47 
accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:02.866 22:36:47 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:02.866 22:36:47 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:02.866 22:36:47 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:02.866 22:36:47 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:02.866 22:36:47 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:08:02.866 22:36:47 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:08:02.866 [2024-07-15 22:36:47.551021] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:08:02.866 [2024-07-15 22:36:47.551096] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2669627 ] 00:08:02.866 [2024-07-15 22:36:47.683474] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:03.125 [2024-07-15 22:36:47.789748] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:03.125 [2024-07-15 22:36:47.859282] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:03.125 [2024-07-15 22:36:47.933619] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:08:03.125 A filename is required. 
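The `es=234` / `es=106` / `es=1` sequence logged just below is the exit-status normalization in common/autotest_common.sh: a status above 128 means the process died on a signal, so the 128 offset is stripped before any remaining non-zero status collapses to a plain failure. A simplified sketch of that logic (an assumption based on the logged values; the real case statement distinguishes more statuses):

```shell
# Simplified sketch of the exit-status normalization seen in the log
# (234 -> 106 -> 1); the real autotest_common.sh case statement is fuller.
normalize_es() {
  local es=$1
  if (( es > 128 )); then
    es=$(( es - 128 ))   # strip the killed-by-signal offset: 234 -> 106
  fi
  if (( es != 0 )); then
    es=1                 # any remaining failure becomes a plain 1
  fi
  echo "$es"
}
```

Both negative compress tests in this run follow the pattern: 234 becomes 106 and then 1, and 161 (in accel_compress_verify below) becomes 33 and then 1.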
00:08:03.125 22:36:48 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # es=234 00:08:03.125 22:36:48 accel.accel_missing_filename -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:03.125 22:36:48 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # es=106 00:08:03.125 22:36:48 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # case "$es" in 00:08:03.125 22:36:48 accel.accel_missing_filename -- common/autotest_common.sh@668 -- # es=1 00:08:03.125 22:36:48 accel.accel_missing_filename -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:03.125 00:08:03.125 real 0m0.517s 00:08:03.125 user 0m0.351s 00:08:03.125 sys 0m0.197s 00:08:03.125 22:36:48 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:03.125 22:36:48 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:08:03.125 ************************************ 00:08:03.125 END TEST accel_missing_filename 00:08:03.125 ************************************ 00:08:03.385 22:36:48 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:03.385 22:36:48 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:03.385 22:36:48 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:08:03.385 22:36:48 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:03.385 22:36:48 accel -- common/autotest_common.sh@10 -- # set +x 00:08:03.385 ************************************ 00:08:03.385 START TEST accel_compress_verify 00:08:03.385 ************************************ 00:08:03.385 22:36:48 accel.accel_compress_verify -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:03.385 22:36:48 accel.accel_compress_verify -- common/autotest_common.sh@648 -- # local es=0 00:08:03.385 22:36:48 
accel.accel_compress_verify -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:03.385 22:36:48 accel.accel_compress_verify -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:08:03.385 22:36:48 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:03.385 22:36:48 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # type -t accel_perf 00:08:03.385 22:36:48 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:03.385 22:36:48 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:03.385 22:36:48 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:08:03.385 22:36:48 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:03.385 22:36:48 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:03.385 22:36:48 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:03.385 22:36:48 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:03.385 22:36:48 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:03.385 22:36:48 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:03.385 22:36:48 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:08:03.385 22:36:48 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:08:03.385 [2024-07-15 22:36:48.150435] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:08:03.385 [2024-07-15 22:36:48.150506] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2669658 ] 00:08:03.385 [2024-07-15 22:36:48.281080] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:03.644 [2024-07-15 22:36:48.388889] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:03.644 [2024-07-15 22:36:48.459422] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:03.644 [2024-07-15 22:36:48.533736] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:08:03.904 00:08:03.904 Compression does not support the verify option, aborting. 00:08:03.904 22:36:48 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # es=161 00:08:03.904 22:36:48 accel.accel_compress_verify -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:03.904 22:36:48 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # es=33 00:08:03.904 22:36:48 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # case "$es" in 00:08:03.904 22:36:48 accel.accel_compress_verify -- common/autotest_common.sh@668 -- # es=1 00:08:03.904 22:36:48 accel.accel_compress_verify -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:03.904 00:08:03.904 real 0m0.516s 00:08:03.904 user 0m0.342s 00:08:03.904 sys 0m0.200s 00:08:03.904 22:36:48 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:03.904 22:36:48 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:08:03.904 ************************************ 00:08:03.904 END TEST accel_compress_verify 00:08:03.904 ************************************ 00:08:03.904 22:36:48 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:03.904 22:36:48 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT 
accel_perf -t 1 -w foobar 00:08:03.904 22:36:48 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:03.904 22:36:48 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:03.904 22:36:48 accel -- common/autotest_common.sh@10 -- # set +x 00:08:03.904 ************************************ 00:08:03.904 START TEST accel_wrong_workload 00:08:03.904 ************************************ 00:08:03.904 22:36:48 accel.accel_wrong_workload -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w foobar 00:08:03.904 22:36:48 accel.accel_wrong_workload -- common/autotest_common.sh@648 -- # local es=0 00:08:03.904 22:36:48 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:08:03.904 22:36:48 accel.accel_wrong_workload -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:08:03.904 22:36:48 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:03.904 22:36:48 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # type -t accel_perf 00:08:03.904 22:36:48 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:03.904 22:36:48 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w foobar 00:08:03.904 22:36:48 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:08:03.904 22:36:48 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:08:03.904 22:36:48 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:03.904 22:36:48 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:03.904 22:36:48 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:03.904 22:36:48 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:03.904 22:36:48 accel.accel_wrong_workload -- 
accel/accel.sh@36 -- # [[ -n '' ]] 00:08:03.904 22:36:48 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:08:03.904 22:36:48 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:08:03.904 Unsupported workload type: foobar 00:08:03.904 [2024-07-15 22:36:48.749569] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:08:03.904 accel_perf options: 00:08:03.904 [-h help message] 00:08:03.904 [-q queue depth per core] 00:08:03.904 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:08:03.904 [-T number of threads per core 00:08:03.904 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:08:03.904 [-t time in seconds] 00:08:03.904 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:08:03.904 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:08:03.904 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:08:03.904 [-l for compress/decompress workloads, name of uncompressed input file 00:08:03.904 [-S for crc32c workload, use this seed value (default 0) 00:08:03.904 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:08:03.904 [-f for fill workload, use this BYTE value (default 255) 00:08:03.904 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:08:03.904 [-y verify result if this switch is on] 00:08:03.904 [-a tasks to allocate per core (default: same value as -q)] 00:08:03.904 Can be used to spread operations across a wider range of memory. 
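The negative tests in this section (accel_missing_filename, accel_wrong_workload, accel_negative_buffers) all invoke accel_perf through a NOT wrapper, so the test passes exactly when accel_perf fails, as it does here on the unsupported `foobar` workload. A minimal sketch of that inversion (the real helper in autotest_common.sh also records the wrapped command's exit status):

```shell
# Minimal sketch of the NOT wrapper used by the negative tests above;
# the real autotest_common.sh helper also captures the exit status.
NOT() {
  if "$@"; then
    return 1   # wrapped command succeeded: the negative test fails
  fi
  return 0     # wrapped command failed: the negative test passes
}

NOT false && echo "negative test passed"
```

So `run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar` succeeds precisely because accel_perf rejects the workload and the usage text above is printed to stderr.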
00:08:03.904 22:36:48 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # es=1
00:08:03.904 22:36:48 accel.accel_wrong_workload -- common/autotest_common.sh@659 -- # (( es > 128 ))
00:08:03.904 22:36:48 accel.accel_wrong_workload -- common/autotest_common.sh@670 -- # [[ -n '' ]]
00:08:03.904 22:36:48 accel.accel_wrong_workload -- common/autotest_common.sh@675 -- # (( !es == 0 ))
00:08:03.904
00:08:03.904 real 0m0.043s
00:08:03.904 user 0m0.024s
00:08:03.904 sys 0m0.019s
00:08:03.904 22:36:48 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:03.904 22:36:48 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x
00:08:03.904 ************************************
00:08:03.904 END TEST accel_wrong_workload
00:08:03.904 ************************************
00:08:03.904 Error: writing output failed: Broken pipe
00:08:03.904 22:36:48 accel -- common/autotest_common.sh@1142 -- # return 0
00:08:03.904 22:36:48 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1
00:08:03.904 22:36:48 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']'
00:08:03.904 22:36:48 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:03.904 22:36:48 accel -- common/autotest_common.sh@10 -- # set +x
00:08:04.164 ************************************
00:08:04.165 START TEST accel_negative_buffers
00:08:04.165 ************************************
00:08:04.165 22:36:48 accel.accel_negative_buffers -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w xor -y -x -1
00:08:04.165 22:36:48 accel.accel_negative_buffers -- common/autotest_common.sh@648 -- # local es=0
00:08:04.165 22:36:48 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1
00:08:04.165 22:36:48 accel.accel_negative_buffers -- common/autotest_common.sh@636 -- # local arg=accel_perf
00:08:04.165 22:36:48 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in
00:08:04.165 22:36:48 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # type -t accel_perf
00:08:04.165 22:36:48 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in
00:08:04.165 22:36:48 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w xor -y -x -1
00:08:04.165 22:36:48 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1
00:08:04.165 22:36:48 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config
00:08:04.165 22:36:48 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=()
00:08:04.165 22:36:48 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:08:04.165 22:36:48 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:04.165 22:36:48 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:04.165 22:36:48 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]]
00:08:04.165 22:36:48 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=,
00:08:04.165 22:36:48 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r .
00:08:04.165 -x option must be non-negative.
00:08:04.165 [2024-07-15 22:36:48.867780] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1
00:08:04.165 accel_perf options:
00:08:04.165 [-h help message]
00:08:04.165 [-q queue depth per core]
00:08:04.165 [-C for supported workloads, use this value to configure the io vector size to test (default 1)
00:08:04.165 [-T number of threads per core
00:08:04.165 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)]
00:08:04.165 [-t time in seconds]
00:08:04.165 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor,
00:08:04.165 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy
00:08:04.165 [-M assign module to the operation, not compatible with accel_assign_opc RPC
00:08:04.165 [-l for compress/decompress workloads, name of uncompressed input file
00:08:04.165 [-S for crc32c workload, use this seed value (default 0)
00:08:04.165 [-P for compare workload, percentage of operations that should miscompare (percent, default 0)
00:08:04.165 [-f for fill workload, use this BYTE value (default 255)
00:08:04.165 [-x for xor workload, use this number of source buffers (default, minimum: 2)]
00:08:04.165 [-y verify result if this switch is on]
00:08:04.165 [-a tasks to allocate per core (default: same value as -q)]
00:08:04.165 Can be used to spread operations across a wider range of memory.
00:08:04.165 22:36:48 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # es=1
00:08:04.165 22:36:48 accel.accel_negative_buffers -- common/autotest_common.sh@659 -- # (( es > 128 ))
00:08:04.165 22:36:48 accel.accel_negative_buffers -- common/autotest_common.sh@670 -- # [[ -n '' ]]
00:08:04.165 22:36:48 accel.accel_negative_buffers -- common/autotest_common.sh@675 -- # (( !es == 0 ))
00:08:04.165
00:08:04.165 real 0m0.041s
00:08:04.165 user 0m0.025s
00:08:04.165 sys 0m0.016s
00:08:04.165 22:36:48 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:04.165 22:36:48 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x
00:08:04.165 ************************************
00:08:04.165 END TEST accel_negative_buffers
00:08:04.165 ************************************
00:08:04.165 Error: writing output failed: Broken pipe
00:08:04.165 22:36:48 accel -- common/autotest_common.sh@1142 -- # return 0
00:08:04.165 22:36:48 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y
00:08:04.165 22:36:48 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']'
00:08:04.165 22:36:48 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:04.165 22:36:48 accel -- common/autotest_common.sh@10 -- # set +x
00:08:04.165 ************************************
00:08:04.165 START TEST accel_crc32c
00:08:04.165 ************************************
00:08:04.165 22:36:48 accel.accel_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -S 32 -y
00:08:04.165 22:36:48 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc
00:08:04.165 22:36:48 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module
00:08:04.165 22:36:48 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:04.165 22:36:48 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
00:08:04.165 22:36:48 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y
00:08:04.165 22:36:48 accel.accel_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y
00:08:04.165 22:36:48 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config
00:08:04.165 22:36:48 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=()
00:08:04.165 22:36:48 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:08:04.165 22:36:48 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:04.165 22:36:48 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:04.165 22:36:48 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]]
00:08:04.165 22:36:48 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=,
00:08:04.165 22:36:48 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r .
00:08:04.165 [2024-07-15 22:36:48.994328] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
00:08:04.165 [2024-07-15 22:36:48.994394] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2669884 ]
00:08:04.426 [2024-07-15 22:36:49.124049] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:04.426 [2024-07-15 22:36:49.228731] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@20 -- # val=
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@20 -- # val=
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@20 -- # val=
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@20 -- # val=
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes'
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@20 -- # val=
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds'
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@20 -- # val=
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@20 -- # val=
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:04.426 22:36:49 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@20 -- # val=
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@20 -- # val=
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@20 -- # val=
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@20 -- # val=
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@20 -- # val=
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@20 -- # val=
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]]
00:08:05.805 22:36:50 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:05.805
00:08:05.805 real 0m1.511s
00:08:05.805 user 0m1.323s
00:08:05.805 sys 0m0.190s
00:08:05.805 22:36:50 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:05.805 22:36:50 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x
00:08:05.805 ************************************
00:08:05.805 END TEST accel_crc32c
00:08:05.805 ************************************
00:08:05.805 22:36:50 accel -- common/autotest_common.sh@1142 -- # return 0
00:08:05.805 22:36:50 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2
00:08:05.805 22:36:50 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']'
00:08:05.805 22:36:50 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:05.805 22:36:50 accel -- common/autotest_common.sh@10 -- # set +x
00:08:05.805 ************************************
00:08:05.805 START TEST accel_crc32c_C2
00:08:05.805 ************************************
00:08:05.805 22:36:50 accel.accel_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -y -C 2
00:08:05.805 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc
00:08:05.805 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module
00:08:05.805 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:08:05.805 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:08:05.805 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2
00:08:05.805 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2
00:08:05.805 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config
00:08:05.805 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=()
00:08:05.805 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:08:05.805 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:05.805 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:05.805 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]]
00:08:05.805 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=,
00:08:05.805 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r .
00:08:05.805 [2024-07-15 22:36:50.587905] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
00:08:05.805 [2024-07-15 22:36:50.587977] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2670077 ]
00:08:06.064 [2024-07-15 22:36:50.718435] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:06.064 [2024-07-15 22:36:50.828413] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes'
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds'
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:08:06.064 22:36:50 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=:
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]]
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:07.455
00:08:07.455 real 0m1.515s
00:08:07.455 user 0m1.320s
00:08:07.455 sys 0m0.200s
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:07.455 22:36:52 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x
00:08:07.455 ************************************
00:08:07.455 END TEST accel_crc32c_C2
00:08:07.455 ************************************
00:08:07.455 22:36:52 accel -- common/autotest_common.sh@1142 -- # return 0
00:08:07.455 22:36:52 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y
00:08:07.455 22:36:52 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']'
00:08:07.455 22:36:52 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:07.455 22:36:52 accel -- common/autotest_common.sh@10 -- # set +x
00:08:07.455 ************************************
00:08:07.455 START TEST accel_copy
00:08:07.455 ************************************
00:08:07.455 22:36:52 accel.accel_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy -y
00:08:07.455 22:36:52 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc
00:08:07.455 22:36:52 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module
00:08:07.455 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:08:07.455 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:08:07.455 22:36:52 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y
00:08:07.455 22:36:52 accel.accel_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y
00:08:07.455 22:36:52 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config
00:08:07.455 22:36:52 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=()
00:08:07.455 22:36:52 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:08:07.455 22:36:52 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:07.455 22:36:52 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:07.455 22:36:52 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]]
00:08:07.455 22:36:52 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=,
00:08:07.455 22:36:52 accel.accel_copy -- accel/accel.sh@41 -- # jq -r .
00:08:07.455 [2024-07-15 22:36:52.189136] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
00:08:07.455 [2024-07-15 22:36:52.189205] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2670281 ]
00:08:07.455 [2024-07-15 22:36:52.318591] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:07.713 [2024-07-15 22:36:52.426983] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@20 -- # val=0x1
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@20 -- # val=copy
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes'
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@20 -- # val=software
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@20 -- # val=32
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@20 -- # val=32
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@20 -- # val=1
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds'
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:08:07.713 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:08:07.714 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:08:07.714 22:36:52 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:08:07.714 22:36:52 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:08:07.714 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:08:07.714 22:36:52 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@20 -- # val=
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@19 -- # IFS=:
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]]
00:08:09.089 22:36:53 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:09.089
00:08:09.089 real 0m1.519s
00:08:09.089 user 0m1.317s
00:08:09.089 sys 0m0.206s
00:08:09.089 22:36:53 accel.accel_copy -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:09.089 22:36:53 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x
00:08:09.089 ************************************
00:08:09.089 END TEST accel_copy
00:08:09.089 ************************************
00:08:09.089 22:36:53 accel -- common/autotest_common.sh@1142 -- # return 0
00:08:09.089 22:36:53 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y
00:08:09.089 22:36:53 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:08:09.089 22:36:53 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:09.089 22:36:53 accel -- common/autotest_common.sh@10 -- # set +x
00:08:09.089 ************************************
00:08:09.089 START TEST accel_fill
00:08:09.089 ************************************
00:08:09.089 22:36:53 accel.accel_fill -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y
00:08:09.089 22:36:53 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc
00:08:09.089 22:36:53 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module
00:08:09.089 22:36:53 accel.accel_fill -- accel/accel.sh@19 -- # IFS=:
00:08:09.089 22:36:53 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val
00:08:09.089 22:36:53 accel.accel_fill -- accel/accel.sh@15 -- #
accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:09.089 22:36:53 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:09.089 22:36:53 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:08:09.089 22:36:53 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:09.089 22:36:53 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:09.089 22:36:53 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:09.089 22:36:53 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:09.089 22:36:53 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:09.089 22:36:53 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:08:09.089 22:36:53 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:08:09.089 [2024-07-15 22:36:53.795318] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:08:09.089 [2024-07-15 22:36:53.795386] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2670489 ] 00:08:09.089 [2024-07-15 22:36:53.926357] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:09.348 [2024-07-15 22:36:54.030135] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.348 22:36:54 accel.accel_fill -- 
accel/accel.sh@19 -- # IFS=: 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.348 22:36:54 accel.accel_fill -- 
accel/accel.sh@19 -- # read -r var val 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.348 22:36:54 
accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:09.348 22:36:54 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:10.724 22:36:55 accel.accel_fill -- 
accel/accel.sh@19 -- # IFS=: 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:08:10.724 22:36:55 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:10.724 00:08:10.724 real 0m1.518s 00:08:10.724 user 0m1.311s 00:08:10.724 sys 0m0.205s 00:08:10.724 22:36:55 accel.accel_fill -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:10.724 22:36:55 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:08:10.724 ************************************ 00:08:10.724 END TEST accel_fill 00:08:10.724 ************************************ 00:08:10.724 22:36:55 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:10.724 22:36:55 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:08:10.724 22:36:55 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:10.724 22:36:55 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:10.724 22:36:55 accel -- common/autotest_common.sh@10 -- # set +x 00:08:10.724 ************************************ 00:08:10.724 START TEST accel_copy_crc32c 00:08:10.724 ************************************ 00:08:10.724 22:36:55 accel.accel_copy_crc32c -- 
common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y 00:08:10.724 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:08:10.724 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:08:10.724 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.724 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.724 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:08:10.724 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:08:10.724 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:08:10.724 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:10.724 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:10.724 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:10.724 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:10.724 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:10.724 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:08:10.724 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:08:10.724 [2024-07-15 22:36:55.393466] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:08:10.724 [2024-07-15 22:36:55.393528] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2670813 ] 00:08:10.724 [2024-07-15 22:36:55.521450] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.724 [2024-07-15 22:36:55.621986] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.983 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:10.983 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 
00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- 
accel/accel.sh@20 -- # val=software 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.984 22:36:55 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:10.984 22:36:55 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:12.364 22:36:56 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:12.364 00:08:12.364 real 0m1.512s 00:08:12.364 user 0m1.307s 00:08:12.364 sys 0m0.207s 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:12.364 22:36:56 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:08:12.364 ************************************ 00:08:12.364 END TEST accel_copy_crc32c 00:08:12.364 ************************************ 00:08:12.364 22:36:56 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:12.364 22:36:56 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:08:12.364 22:36:56 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:12.364 22:36:56 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:12.364 22:36:56 accel -- common/autotest_common.sh@10 -- # set +x 00:08:12.364 ************************************ 00:08:12.364 START TEST accel_copy_crc32c_C2 00:08:12.364 
************************************ 00:08:12.364 22:36:56 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:08:12.364 22:36:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:08:12.364 22:36:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:08:12.364 22:36:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.364 22:36:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.364 22:36:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:08:12.364 22:36:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:08:12.364 22:36:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:08:12.364 22:36:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:12.364 22:36:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:12.364 22:36:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:12.364 22:36:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:12.364 22:36:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:12.364 22:36:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:08:12.364 22:36:56 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:08:12.364 [2024-07-15 22:36:56.981452] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:08:12.364 [2024-07-15 22:36:56.981513] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2671025 ] 00:08:12.364 [2024-07-15 22:36:57.110400] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:12.364 [2024-07-15 22:36:57.211663] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:12.622 22:36:57 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 
00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.622 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # 
val=Yes 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:12.623 22:36:57 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:13.560 
22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:13.560 00:08:13.560 real 0m1.508s 00:08:13.560 user 0m1.318s 00:08:13.560 sys 0m0.195s 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:13.560 22:36:58 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:08:13.560 ************************************ 00:08:13.560 END TEST accel_copy_crc32c_C2 00:08:13.560 ************************************ 00:08:13.820 22:36:58 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:13.820 22:36:58 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:08:13.820 22:36:58 accel -- common/autotest_common.sh@1099 -- # 
'[' 7 -le 1 ']' 00:08:13.820 22:36:58 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:13.820 22:36:58 accel -- common/autotest_common.sh@10 -- # set +x 00:08:13.820 ************************************ 00:08:13.820 START TEST accel_dualcast 00:08:13.820 ************************************ 00:08:13.820 22:36:58 accel.accel_dualcast -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dualcast -y 00:08:13.820 22:36:58 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:08:13.820 22:36:58 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:08:13.820 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:13.820 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:13.820 22:36:58 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:08:13.820 22:36:58 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:08:13.820 22:36:58 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:08:13.820 22:36:58 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:13.820 22:36:58 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:13.820 22:36:58 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:13.820 22:36:58 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:13.820 22:36:58 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:13.820 22:36:58 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:08:13.820 22:36:58 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:08:13.820 [2024-07-15 22:36:58.574294] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:08:13.820 [2024-07-15 22:36:58.574357] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2671229 ] 00:08:13.820 [2024-07-15 22:36:58.703349] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:14.080 [2024-07-15 22:36:58.801247] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:14.080 22:36:58 accel.accel_dualcast -- 
accel/accel.sh@19 -- # IFS=: 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 
00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:14.080 22:36:58 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@21 -- # case 
"$var" in 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:15.459 22:37:00 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:15.460 22:37:00 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:08:15.460 22:37:00 accel.accel_dualcast -- accel/accel.sh@27 
-- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:15.460 00:08:15.460 real 0m1.488s 00:08:15.460 user 0m1.312s 00:08:15.460 sys 0m0.179s 00:08:15.460 22:37:00 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:15.460 22:37:00 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:08:15.460 ************************************ 00:08:15.460 END TEST accel_dualcast 00:08:15.460 ************************************ 00:08:15.460 22:37:00 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:15.460 22:37:00 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:08:15.460 22:37:00 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:15.460 22:37:00 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:15.460 22:37:00 accel -- common/autotest_common.sh@10 -- # set +x 00:08:15.460 ************************************ 00:08:15.460 START TEST accel_compare 00:08:15.460 ************************************ 00:08:15.460 22:37:00 accel.accel_compare -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compare -y 00:08:15.460 22:37:00 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:08:15.460 22:37:00 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:08:15.460 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.460 22:37:00 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:08:15.460 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.460 22:37:00 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:08:15.460 22:37:00 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:08:15.460 22:37:00 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:15.460 22:37:00 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:15.460 
22:37:00 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:15.460 22:37:00 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:15.460 22:37:00 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:15.460 22:37:00 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:08:15.460 22:37:00 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:08:15.460 [2024-07-15 22:37:00.132635] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:08:15.460 [2024-07-15 22:37:00.132693] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2671423 ] 00:08:15.460 [2024-07-15 22:37:00.259883] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.460 [2024-07-15 22:37:00.360938] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.719 22:37:00 
accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.719 
22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:15.719 22:37:00 
accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:15.719 22:37:00 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@21 
-- # case "$var" in 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:08:17.098 22:37:01 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:17.098 00:08:17.098 real 0m1.501s 00:08:17.098 user 0m1.313s 00:08:17.098 sys 0m0.189s 00:08:17.098 22:37:01 accel.accel_compare -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:17.098 22:37:01 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:08:17.098 ************************************ 00:08:17.098 END TEST accel_compare 00:08:17.098 ************************************ 00:08:17.098 22:37:01 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:17.098 22:37:01 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:08:17.098 22:37:01 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:17.098 22:37:01 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:17.098 22:37:01 accel -- common/autotest_common.sh@10 -- # set +x 00:08:17.098 ************************************ 00:08:17.098 START TEST accel_xor 00:08:17.098 ************************************ 00:08:17.098 22:37:01 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y 00:08:17.098 22:37:01 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:08:17.098 22:37:01 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:08:17.098 22:37:01 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.098 22:37:01 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.098 22:37:01 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:08:17.098 22:37:01 accel.accel_xor -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:08:17.098 22:37:01 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:08:17.098 22:37:01 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:17.098 22:37:01 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:17.098 22:37:01 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:17.098 22:37:01 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:17.098 22:37:01 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:17.098 22:37:01 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:08:17.098 22:37:01 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:08:17.098 [2024-07-15 22:37:01.713133] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:08:17.098 [2024-07-15 22:37:01.713193] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2671624 ] 00:08:17.098 [2024-07-15 22:37:01.842891] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:17.098 [2024-07-15 22:37:01.944329] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 22:37:02 accel.accel_xor -- 
accel/accel.sh@20 -- # val=0x1 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 22:37:02 
accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:17.358 22:37:02 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:08:18.296 22:37:03 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:18.296 00:08:18.296 real 0m1.518s 00:08:18.296 user 0m1.325s 00:08:18.296 sys 0m0.190s 00:08:18.296 22:37:03 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:18.296 22:37:03 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:08:18.296 ************************************ 00:08:18.296 END TEST accel_xor 00:08:18.296 ************************************ 00:08:18.555 22:37:03 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:18.555 22:37:03 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:08:18.555 22:37:03 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:18.555 22:37:03 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:18.555 22:37:03 accel -- common/autotest_common.sh@10 -- # set +x 00:08:18.555 ************************************ 00:08:18.555 START TEST accel_xor 00:08:18.555 ************************************ 00:08:18.555 22:37:03 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y -x 3 00:08:18.555 22:37:03 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:08:18.555 22:37:03 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:08:18.555 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 
00:08:18.555 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:18.555 22:37:03 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3
00:08:18.555 22:37:03 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3
00:08:18.555 22:37:03 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config
00:08:18.555 22:37:03 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=()
00:08:18.555 22:37:03 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:08:18.555 22:37:03 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:18.555 22:37:03 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:18.555 22:37:03 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]]
00:08:18.555 22:37:03 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=,
00:08:18.555 22:37:03 accel.accel_xor -- accel/accel.sh@41 -- # jq -r .
00:08:18.555 [2024-07-15 22:37:03.306970] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
00:08:18.555 [2024-07-15 22:37:03.307030] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2671845 ] 00:08:18.555 [2024-07-15 22:37:03.435792] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.813 [2024-07-15 22:37:03.539145] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # read -r var 
val 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.813 
22:37:03 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:18.813 22:37:03 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:20.274 22:37:04 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:20.274 22:37:04 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:20.274 22:37:04 accel.accel_xor -- accel/accel.sh@19 
-- # IFS=: 00:08:20.274 22:37:04 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:20.274 22:37:04 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:20.274 22:37:04 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:20.274 22:37:04 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:20.274 22:37:04 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:20.274 22:37:04 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:20.274 22:37:04 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:20.274 22:37:04 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:20.274 22:37:04 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:20.274 22:37:04 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:20.274 22:37:04 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:20.274 22:37:04 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:20.274 22:37:04 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:20.274 22:37:04 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:20.274 22:37:04 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:20.275 22:37:04 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:20.275 22:37:04 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:20.275 22:37:04 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:20.275 22:37:04 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:20.275 22:37:04 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:20.275 22:37:04 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:20.275 22:37:04 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:20.275 22:37:04 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:08:20.275 22:37:04 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:20.275 00:08:20.275 real 0m1.507s 00:08:20.275 user 0m1.314s 00:08:20.275 sys 0m0.199s 00:08:20.275 22:37:04 accel.accel_xor -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:08:20.275 22:37:04 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:08:20.275 ************************************ 00:08:20.275 END TEST accel_xor 00:08:20.275 ************************************ 00:08:20.275 22:37:04 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:20.275 22:37:04 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:08:20.275 22:37:04 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:08:20.275 22:37:04 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:20.275 22:37:04 accel -- common/autotest_common.sh@10 -- # set +x 00:08:20.275 ************************************ 00:08:20.275 START TEST accel_dif_verify 00:08:20.275 ************************************ 00:08:20.275 22:37:04 accel.accel_dif_verify -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_verify 00:08:20.275 22:37:04 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:08:20.275 22:37:04 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:08:20.275 22:37:04 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.275 22:37:04 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:20.275 22:37:04 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:08:20.275 22:37:04 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:08:20.275 22:37:04 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:08:20.275 22:37:04 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:20.275 22:37:04 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:20.275 22:37:04 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:20.275 22:37:04 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 
00:08:20.275 22:37:04 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:20.275 22:37:04 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:08:20.275 22:37:04 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:08:20.275 [2024-07-15 22:37:04.918487] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:08:20.275 [2024-07-15 22:37:04.918631] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2672159 ] 00:08:20.275 [2024-07-15 22:37:05.114140] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.534 [2024-07-15 22:37:05.218727] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.534 22:37:05 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:20.534 22:37:05 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:20.534 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.534 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:20.534 22:37:05 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:20.534 22:37:05 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:20.534 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.534 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:20.534 22:37:05 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@21 
-- # case "$var" in 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:20.535 22:37:05 accel.accel_dif_verify -- 
accel/accel.sh@20 -- # val='8 bytes' 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.535 22:37:05 accel.accel_dif_verify -- 
accel/accel.sh@19 -- # read -r var val 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:20.535 22:37:05 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r 
var val 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:08:21.910 22:37:06 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:21.910 00:08:21.910 real 0m1.577s 00:08:21.910 user 0m1.328s 00:08:21.910 sys 0m0.251s 00:08:21.910 22:37:06 accel.accel_dif_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:21.910 22:37:06 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:08:21.910 ************************************ 00:08:21.910 END TEST accel_dif_verify 00:08:21.910 
************************************ 00:08:21.910 22:37:06 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:21.910 22:37:06 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:08:21.910 22:37:06 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:08:21.910 22:37:06 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:21.910 22:37:06 accel -- common/autotest_common.sh@10 -- # set +x 00:08:21.910 ************************************ 00:08:21.910 START TEST accel_dif_generate 00:08:21.910 ************************************ 00:08:21.910 22:37:06 accel.accel_dif_generate -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate 00:08:21.910 22:37:06 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:08:21.911 22:37:06 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:08:21.911 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:21.911 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:21.911 22:37:06 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:08:21.911 22:37:06 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:08:21.911 22:37:06 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:08:21.911 22:37:06 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:21.911 22:37:06 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:21.911 22:37:06 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:21.911 22:37:06 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:21.911 22:37:06 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:21.911 22:37:06 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:08:21.911 22:37:06 
accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:08:21.911 [2024-07-15 22:37:06.560052] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:08:21.911 [2024-07-15 22:37:06.560114] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2672380 ] 00:08:21.911 [2024-07-15 22:37:06.688711] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:21.911 [2024-07-15 22:37:06.789574] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.169 22:37:06 accel.accel_dif_generate -- 
accel/accel.sh@19 -- # read -r var val 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:08:22.169 22:37:06 accel.accel_dif_generate -- 
accel/accel.sh@21 -- # case "$var" in 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:22.169 
22:37:06 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:22.169 22:37:06 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var 
val 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:08:23.545 22:37:08 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:23.545 00:08:23.545 real 0m1.502s 00:08:23.545 user 0m1.315s 00:08:23.545 sys 0m0.195s 00:08:23.545 22:37:08 accel.accel_dif_generate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:23.545 22:37:08 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:08:23.545 ************************************ 00:08:23.545 END TEST 
accel_dif_generate 00:08:23.545 ************************************ 00:08:23.545 22:37:08 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:23.545 22:37:08 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:08:23.545 22:37:08 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:08:23.545 22:37:08 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:23.545 22:37:08 accel -- common/autotest_common.sh@10 -- # set +x 00:08:23.545 ************************************ 00:08:23.545 START TEST accel_dif_generate_copy 00:08:23.545 ************************************ 00:08:23.545 22:37:08 accel.accel_dif_generate_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate_copy 00:08:23.545 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:08:23.545 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:08:23.545 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:23.545 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:23.545 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:08:23.545 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:08:23.545 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:08:23.545 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:23.545 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:23.545 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:23.545 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:23.545 22:37:08 accel.accel_dif_generate_copy -- 
accel/accel.sh@36 -- # [[ -n '' ]] 00:08:23.545 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:08:23.545 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 00:08:23.545 [2024-07-15 22:37:08.139737] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:08:23.545 [2024-07-15 22:37:08.139798] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2672575 ] 00:08:23.545 [2024-07-15 22:37:08.280920] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:23.545 [2024-07-15 22:37:08.381859] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.545 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:23.545 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:23.545 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:23.545 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:23.545 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 
00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case 
"$var" in 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 
00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:23.804 22:37:08 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:24.739 22:37:09 
accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:24.739 00:08:24.739 real 0m1.525s 00:08:24.739 user 0m1.334s 00:08:24.739 sys 0m0.193s 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:24.739 22:37:09 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:08:24.739 ************************************ 00:08:24.739 END TEST 
accel_dif_generate_copy 00:08:24.739 ************************************ 00:08:24.998 22:37:09 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:24.998 22:37:09 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:08:24.998 22:37:09 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:24.998 22:37:09 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:08:24.998 22:37:09 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:24.998 22:37:09 accel -- common/autotest_common.sh@10 -- # set +x 00:08:24.998 ************************************ 00:08:24.998 START TEST accel_comp 00:08:24.998 ************************************ 00:08:24.998 22:37:09 accel.accel_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:24.998 22:37:09 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:08:24.998 22:37:09 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:08:24.998 22:37:09 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:24.998 22:37:09 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:24.998 22:37:09 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:24.998 22:37:09 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:24.998 22:37:09 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:08:24.998 22:37:09 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:24.998 22:37:09 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:24.998 22:37:09 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:24.998 22:37:09 
accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:24.998 22:37:09 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:24.998 22:37:09 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:08:24.998 22:37:09 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 00:08:24.998 [2024-07-15 22:37:09.743322] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:08:24.998 [2024-07-15 22:37:09.743384] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2672774 ] 00:08:24.998 [2024-07-15 22:37:09.871287] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:25.256 [2024-07-15 22:37:09.973222] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.256 22:37:10 
accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 
00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:25.256 22:37:10 
accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:25.256 22:37:10 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:26.630 22:37:11 accel.accel_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:08:26.630 22:37:11 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:26.630 00:08:26.630 real 0m1.516s 00:08:26.630 user 0m1.311s 00:08:26.630 sys 0m0.205s 00:08:26.630 22:37:11 accel.accel_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:26.630 22:37:11 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:08:26.630 ************************************ 00:08:26.630 END TEST accel_comp 00:08:26.630 ************************************ 00:08:26.630 22:37:11 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:26.630 22:37:11 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:26.630 22:37:11 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:26.630 22:37:11 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:26.630 22:37:11 accel -- common/autotest_common.sh@10 -- # set +x 00:08:26.630 ************************************ 00:08:26.630 START TEST accel_decomp 00:08:26.630 ************************************ 00:08:26.630 22:37:11 accel.accel_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:26.630 22:37:11 
accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:08:26.630 22:37:11 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:08:26.630 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:26.630 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:26.630 22:37:11 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:26.630 22:37:11 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:26.630 22:37:11 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:08:26.630 22:37:11 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:26.630 22:37:11 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:26.630 22:37:11 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:26.630 22:37:11 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:26.630 22:37:11 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:26.630 22:37:11 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:08:26.630 22:37:11 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:08:26.630 [2024-07-15 22:37:11.342745] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:08:26.630 [2024-07-15 22:37:11.342807] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2672970 ] 00:08:26.630 [2024-07-15 22:37:11.473789] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:26.889 [2024-07-15 22:37:11.580713] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:26.889 
22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:26.889 22:37:11 
accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:26.889 22:37:11 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:26.890 22:37:11 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.890 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:26.890 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:26.890 22:37:11 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:08:26.890 22:37:11 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.890 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:26.890 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:26.890 22:37:11 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:26.890 22:37:11 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.890 22:37:11 accel.accel_decomp -- 
accel/accel.sh@19 -- # IFS=: 00:08:26.890 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:26.890 22:37:11 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:26.890 22:37:11 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:26.890 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:26.890 22:37:11 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.266 22:37:12 
accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:28.266 22:37:12 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:28.266 00:08:28.266 real 0m1.517s 00:08:28.266 user 0m1.327s 00:08:28.266 sys 0m0.190s 00:08:28.266 22:37:12 accel.accel_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:28.266 22:37:12 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:08:28.266 ************************************ 00:08:28.266 END TEST accel_decomp 00:08:28.266 ************************************ 00:08:28.266 22:37:12 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:28.266 22:37:12 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:28.266 22:37:12 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:28.266 22:37:12 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:28.266 22:37:12 accel -- common/autotest_common.sh@10 -- # set +x 00:08:28.266 ************************************ 00:08:28.266 START TEST accel_decomp_full 00:08:28.266 ************************************ 00:08:28.266 22:37:12 accel.accel_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:28.266 22:37:12 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:08:28.266 22:37:12 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:08:28.266 
22:37:12 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.266 22:37:12 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.266 22:37:12 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:28.266 22:37:12 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:28.266 22:37:12 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:08:28.266 22:37:12 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:28.266 22:37:12 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:28.266 22:37:12 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:28.266 22:37:12 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:28.266 22:37:12 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:28.266 22:37:12 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:08:28.266 22:37:12 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:08:28.266 [2024-07-15 22:37:12.945477] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:08:28.266 [2024-07-15 22:37:12.945539] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2673267 ] 00:08:28.266 [2024-07-15 22:37:13.073367] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:28.266 [2024-07-15 22:37:13.170811] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 
00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.525 22:37:13 accel.accel_decomp_full -- 
accel/accel.sh@19 -- # read -r var val 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.525 22:37:13 accel.accel_decomp_full -- 
accel/accel.sh@19 -- # IFS=: 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:28.525 22:37:13 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:29.902 22:37:14 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:29.902 22:37:14 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:29.902 22:37:14 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:29.902 22:37:14 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:29.902 22:37:14 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:29.902 22:37:14 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:29.902 22:37:14 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:29.903 22:37:14 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:29.903 22:37:14 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:29.903 22:37:14 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:29.903 22:37:14 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:29.903 22:37:14 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:29.903 22:37:14 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:29.903 22:37:14 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:29.903 22:37:14 accel.accel_decomp_full -- accel/accel.sh@19 
-- # IFS=: 00:08:29.903 22:37:14 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:29.903 22:37:14 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:29.903 22:37:14 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:29.903 22:37:14 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:29.903 22:37:14 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:29.903 22:37:14 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:08:29.903 22:37:14 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:29.903 22:37:14 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:29.903 22:37:14 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:29.903 22:37:14 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:29.903 22:37:14 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:29.903 22:37:14 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:29.903 00:08:29.903 real 0m1.501s 00:08:29.903 user 0m1.313s 00:08:29.903 sys 0m0.193s 00:08:29.903 22:37:14 accel.accel_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:29.903 22:37:14 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:08:29.903 ************************************ 00:08:29.903 END TEST accel_decomp_full 00:08:29.903 ************************************ 00:08:29.903 22:37:14 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:29.903 22:37:14 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:29.903 22:37:14 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:29.903 22:37:14 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:29.903 22:37:14 accel -- common/autotest_common.sh@10 -- # set +x 00:08:29.903 
************************************ 00:08:29.903 START TEST accel_decomp_mcore 00:08:29.903 ************************************ 00:08:29.903 22:37:14 accel.accel_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:29.903 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:29.903 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:29.903 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:29.903 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:29.903 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:29.903 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:29.903 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:29.903 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:29.903 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:29.903 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:29.903 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:29.903 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:29.903 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:29.903 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:29.903 [2024-07-15 22:37:14.520582] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:08:29.903 [2024-07-15 22:37:14.520650] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2673524 ] 00:08:29.903 [2024-07-15 22:37:14.648258] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:29.903 [2024-07-15 22:37:14.753273] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:29.903 [2024-07-15 22:37:14.753373] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:29.903 [2024-07-15 22:37:14.753476] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:29.903 [2024-07-15 22:37:14.753477] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:30.162 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.162 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.162 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.162 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.162 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.162 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.162 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.163 22:37:14 
accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:30.163 22:37:14 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case 
"$var" in 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.101 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.360 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:31.360 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # 
case "$var" in 00:08:31.360 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.360 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.360 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:31.360 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:31.360 22:37:16 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:31.360 00:08:31.360 real 0m1.525s 00:08:31.360 user 0m4.778s 00:08:31.360 sys 0m0.203s 00:08:31.360 22:37:16 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:31.360 22:37:16 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:31.360 ************************************ 00:08:31.360 END TEST accel_decomp_mcore 00:08:31.360 ************************************ 00:08:31.360 22:37:16 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:31.360 22:37:16 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:31.360 22:37:16 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:31.360 22:37:16 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:31.360 22:37:16 accel -- common/autotest_common.sh@10 -- # set +x 00:08:31.360 ************************************ 00:08:31.360 START TEST accel_decomp_full_mcore 00:08:31.360 ************************************ 00:08:31.360 22:37:16 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:31.360 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:31.360 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:31.360 22:37:16 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.360 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.360 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:31.360 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:31.360 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:31.360 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:31.360 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:31.360 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:31.360 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:31.360 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:31.360 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:31.360 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:31.360 [2024-07-15 22:37:16.133945] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:08:31.360 [2024-07-15 22:37:16.134014] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2673725 ] 00:08:31.360 [2024-07-15 22:37:16.268396] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:31.620 [2024-07-15 22:37:16.377342] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:31.620 [2024-07-15 22:37:16.377444] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:31.620 [2024-07-15 22:37:16.377530] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:31.620 [2024-07-15 22:37:16.377529] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:31.620 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.620 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.620 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.620 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.620 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.620 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.620 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.620 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.620 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.620 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.620 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.620 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.620 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:31.620 22:37:16 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.620 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.620 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.620 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.620 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.621 22:37:16 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- 
# IFS=: 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:31.621 22:37:16 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:33.000 22:37:17 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@19 
-- # read -r var val 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:33.000 00:08:33.000 real 0m1.565s 00:08:33.000 user 0m4.888s 00:08:33.000 sys 0m0.214s 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:33.000 22:37:17 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:33.000 ************************************ 00:08:33.000 END TEST accel_decomp_full_mcore 00:08:33.000 ************************************ 00:08:33.000 22:37:17 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:33.000 22:37:17 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:33.000 22:37:17 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:33.000 22:37:17 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:33.000 22:37:17 accel -- common/autotest_common.sh@10 -- # set +x 00:08:33.000 
************************************ 00:08:33.000 START TEST accel_decomp_mthread 00:08:33.000 ************************************ 00:08:33.000 22:37:17 accel.accel_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:33.000 22:37:17 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:33.000 22:37:17 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:33.000 22:37:17 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.000 22:37:17 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.000 22:37:17 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:33.000 22:37:17 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:33.000 22:37:17 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:33.000 22:37:17 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:33.000 22:37:17 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:33.000 22:37:17 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:33.000 22:37:17 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:33.000 22:37:17 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:33.000 22:37:17 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:33.000 22:37:17 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:33.000 [2024-07-15 22:37:17.782067] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:08:33.000 [2024-07-15 22:37:17.782129] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2673930 ] 00:08:33.000 [2024-07-15 22:37:17.899720] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:33.260 [2024-07-15 22:37:18.001014] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:33.260 22:37:18 
accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # 
accel_module=software 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.260 
22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:33.260 22:37:18 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.639 22:37:19 accel.accel_decomp_mthread 
-- accel/accel.sh@19 -- # read -r var val 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:34.639 00:08:34.639 real 0m1.505s 00:08:34.639 user 0m1.319s 00:08:34.639 sys 0m0.191s 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:34.639 22:37:19 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 
00:08:34.639 ************************************ 00:08:34.639 END TEST accel_decomp_mthread 00:08:34.639 ************************************ 00:08:34.639 22:37:19 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:34.639 22:37:19 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:34.639 22:37:19 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:34.639 22:37:19 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:34.639 22:37:19 accel -- common/autotest_common.sh@10 -- # set +x 00:08:34.639 ************************************ 00:08:34.639 START TEST accel_decomp_full_mthread 00:08:34.639 ************************************ 00:08:34.639 22:37:19 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:34.639 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:34.639 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:34.639 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.639 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.639 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:34.639 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:34.639 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:34.639 22:37:19 accel.accel_decomp_full_mthread -- 
accel/accel.sh@31 -- # accel_json_cfg=() 00:08:34.639 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:34.639 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:34.639 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:34.639 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:34.639 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:34.639 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:34.639 [2024-07-15 22:37:19.370974] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:08:34.639 [2024-07-15 22:37:19.371044] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2674121 ] 00:08:34.639 [2024-07-15 22:37:19.484478] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:34.898 [2024-07-15 22:37:19.590472] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.898 22:37:19 
accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 
bytes' 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 
00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.898 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.899 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:34.899 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.899 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.899 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.899 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:34.899 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.899 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.899 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.899 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:34.899 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.899 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:34.899 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:34.899 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:34.899 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:34.899 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # 
IFS=: 00:08:34.899 22:37:19 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 
00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:36.277 00:08:36.277 real 0m1.530s 00:08:36.277 user 0m1.344s 00:08:36.277 sys 0m0.188s 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:36.277 22:37:20 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:36.277 ************************************ 00:08:36.277 END TEST accel_decomp_full_mthread 00:08:36.277 ************************************ 00:08:36.277 22:37:20 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:36.277 22:37:20 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:08:36.277 22:37:20 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:08:36.277 22:37:20 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:08:36.277 22:37:20 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:36.277 22:37:20 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=2674338 00:08:36.277 22:37:20 accel -- accel/accel.sh@63 -- # waitforlisten 2674338 00:08:36.277 22:37:20 accel -- common/autotest_common.sh@829 -- 
# '[' -z 2674338 ']' 00:08:36.277 22:37:20 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:36.277 22:37:20 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:36.277 22:37:20 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:08:36.277 22:37:20 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:36.277 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:36.277 22:37:20 accel -- accel/accel.sh@61 -- # build_accel_config 00:08:36.277 22:37:20 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:36.277 22:37:20 accel -- common/autotest_common.sh@10 -- # set +x 00:08:36.277 22:37:20 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:36.277 22:37:20 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:36.277 22:37:20 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:36.277 22:37:20 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:36.277 22:37:20 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:36.277 22:37:20 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:36.277 22:37:20 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:36.277 22:37:20 accel -- accel/accel.sh@41 -- # jq -r . 00:08:36.277 [2024-07-15 22:37:20.984988] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:08:36.277 [2024-07-15 22:37:20.985082] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2674338 ] 00:08:36.277 [2024-07-15 22:37:21.132332] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:36.535 [2024-07-15 22:37:21.244145] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:37.103 [2024-07-15 22:37:22.004650] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:37.362 22:37:22 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:37.362 22:37:22 accel -- common/autotest_common.sh@862 -- # return 0 00:08:37.362 22:37:22 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:08:37.362 22:37:22 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:08:37.362 22:37:22 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:08:37.362 22:37:22 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:08:37.362 22:37:22 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:08:37.362 22:37:22 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:08:37.362 22:37:22 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:37.362 22:37:22 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:08:37.362 22:37:22 accel -- common/autotest_common.sh@10 -- # set +x 00:08:37.362 22:37:22 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:08:37.621 22:37:22 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:37.621 "method": "compressdev_scan_accel_module", 00:08:37.621 22:37:22 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:08:37.621 22:37:22 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:08:37.621 22:37:22 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:08:37.621 22:37:22 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:37.621 22:37:22 accel -- common/autotest_common.sh@10 -- # set +x 00:08:37.621 22:37:22 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:37.621 22:37:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:37.621 22:37:22 accel -- accel/accel.sh@72 -- # IFS== 00:08:37.621 22:37:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:37.621 22:37:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:37.621 22:37:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:37.621 22:37:22 accel -- accel/accel.sh@72 -- # IFS== 00:08:37.621 22:37:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:37.621 22:37:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:37.621 22:37:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:37.621 22:37:22 accel -- accel/accel.sh@72 -- # IFS== 00:08:37.621 22:37:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:37.621 22:37:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:37.621 22:37:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:37.621 22:37:22 accel -- accel/accel.sh@72 -- # IFS== 00:08:37.621 22:37:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:37.621 22:37:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:37.621 22:37:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:37.621 22:37:22 accel -- accel/accel.sh@72 -- # IFS== 00:08:37.622 22:37:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:37.622 22:37:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:37.622 22:37:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:37.622 22:37:22 
accel -- accel/accel.sh@72 -- # IFS== 00:08:37.622 22:37:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:37.622 22:37:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:37.622 22:37:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:37.622 22:37:22 accel -- accel/accel.sh@72 -- # IFS== 00:08:37.622 22:37:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:37.622 22:37:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:08:37.622 22:37:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:37.622 22:37:22 accel -- accel/accel.sh@72 -- # IFS== 00:08:37.622 22:37:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:37.622 22:37:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:08:37.622 22:37:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:37.622 22:37:22 accel -- accel/accel.sh@72 -- # IFS== 00:08:37.622 22:37:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:37.622 22:37:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:37.622 22:37:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:37.622 22:37:22 accel -- accel/accel.sh@72 -- # IFS== 00:08:37.622 22:37:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:37.622 22:37:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:37.622 22:37:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:37.622 22:37:22 accel -- accel/accel.sh@72 -- # IFS== 00:08:37.622 22:37:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:37.622 22:37:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:37.622 22:37:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:37.622 22:37:22 accel -- accel/accel.sh@72 -- # IFS== 00:08:37.622 22:37:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:37.622 22:37:22 accel -- accel/accel.sh@73 
-- # expected_opcs["$opc"]=software 00:08:37.622 22:37:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:37.622 22:37:22 accel -- accel/accel.sh@72 -- # IFS== 00:08:37.622 22:37:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:37.622 22:37:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:37.622 22:37:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:37.622 22:37:22 accel -- accel/accel.sh@72 -- # IFS== 00:08:37.622 22:37:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:37.622 22:37:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:37.622 22:37:22 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:37.622 22:37:22 accel -- accel/accel.sh@72 -- # IFS== 00:08:37.622 22:37:22 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:37.622 22:37:22 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:37.622 22:37:22 accel -- accel/accel.sh@75 -- # killprocess 2674338 00:08:37.622 22:37:22 accel -- common/autotest_common.sh@948 -- # '[' -z 2674338 ']' 00:08:37.622 22:37:22 accel -- common/autotest_common.sh@952 -- # kill -0 2674338 00:08:37.622 22:37:22 accel -- common/autotest_common.sh@953 -- # uname 00:08:37.622 22:37:22 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:37.622 22:37:22 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2674338 00:08:37.622 22:37:22 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:37.622 22:37:22 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:37.622 22:37:22 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2674338' 00:08:37.622 killing process with pid 2674338 00:08:37.622 22:37:22 accel -- common/autotest_common.sh@967 -- # kill 2674338 00:08:37.622 22:37:22 accel -- common/autotest_common.sh@972 -- # wait 2674338 00:08:38.190 22:37:22 accel -- accel/accel.sh@76 -- # trap - 
ERR 00:08:38.190 22:37:22 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:38.190 22:37:22 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:08:38.190 22:37:22 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:38.190 22:37:22 accel -- common/autotest_common.sh@10 -- # set +x 00:08:38.190 ************************************ 00:08:38.190 START TEST accel_cdev_comp 00:08:38.190 ************************************ 00:08:38.190 22:37:22 accel.accel_cdev_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:38.190 22:37:22 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:08:38.190 22:37:22 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:08:38.190 22:37:22 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:38.190 22:37:22 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:38.190 22:37:22 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:38.190 22:37:22 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:38.190 22:37:22 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:08:38.190 22:37:22 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:38.190 22:37:22 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:38.190 22:37:22 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:38.190 22:37:22 accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:38.190 22:37:22 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:38.190 
22:37:22 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:38.190 22:37:22 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:08:38.190 22:37:22 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:08:38.190 [2024-07-15 22:37:22.918305] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:08:38.190 [2024-07-15 22:37:22.918362] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2674680 ] 00:08:38.190 [2024-07-15 22:37:23.044242] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:38.449 [2024-07-15 22:37:23.145146] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:39.098 [2024-07-15 22:37:23.902531] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:39.098 [2024-07-15 22:37:23.905091] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xda9080 PMD being used: compress_qat 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:39.098 [2024-07-15 22:37:23.909105] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xdade60 PMD being used: compress_qat 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:39.098 
22:37:23 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.098 22:37:23 accel.accel_cdev_comp -- 
accel/accel.sh@19 -- # read -r var val 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 
00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:39.098 22:37:23 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.476 22:37:25 accel.accel_cdev_comp -- 
accel/accel.sh@19 -- # IFS=: 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:08:40.476 22:37:25 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:40.476 00:08:40.476 real 0m2.203s 00:08:40.476 user 0m1.660s 00:08:40.476 sys 0m0.546s 00:08:40.476 22:37:25 accel.accel_cdev_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:40.476 22:37:25 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 
00:08:40.476 ************************************ 00:08:40.476 END TEST accel_cdev_comp 00:08:40.476 ************************************ 00:08:40.476 22:37:25 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:40.476 22:37:25 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:40.476 22:37:25 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:40.476 22:37:25 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:40.476 22:37:25 accel -- common/autotest_common.sh@10 -- # set +x 00:08:40.476 ************************************ 00:08:40.476 START TEST accel_cdev_decomp 00:08:40.476 ************************************ 00:08:40.476 22:37:25 accel.accel_cdev_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:40.476 22:37:25 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:08:40.476 22:37:25 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:08:40.476 22:37:25 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:40.476 22:37:25 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:40.476 22:37:25 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:40.476 22:37:25 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:40.476 22:37:25 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:08:40.476 22:37:25 accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:40.476 22:37:25 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 
00:08:40.476 22:37:25 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:40.476 22:37:25 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:40.476 22:37:25 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:40.476 22:37:25 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:40.476 22:37:25 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:08:40.476 22:37:25 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:08:40.476 [2024-07-15 22:37:25.210725] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:08:40.476 [2024-07-15 22:37:25.210788] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2674993 ] 00:08:40.476 [2024-07-15 22:37:25.340029] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:40.735 [2024-07-15 22:37:25.439717] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:41.302 [2024-07-15 22:37:26.207162] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:41.302 [2024-07-15 22:37:26.209787] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2689080 PMD being used: compress_qat 00:08:41.302 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:41.561 22:37:26 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:41.561 [2024-07-15 22:37:26.213987] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x268de60 PMD being used: compress_qat 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- 
accel/accel.sh@19 -- # read -r var val 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:41.561 22:37:26 
accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:41.561 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:41.562 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:41.562 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:41.562 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:41.562 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:41.562 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:41.562 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:41.562 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:41.562 22:37:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:42.499 22:37:27 
accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:42.499 
22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:42.499 00:08:42.499 real 0m2.219s 00:08:42.499 user 0m1.655s 00:08:42.499 sys 0m0.565s 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:42.499 22:37:27 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:08:42.499 ************************************ 00:08:42.499 END TEST accel_cdev_decomp 00:08:42.499 ************************************ 00:08:42.758 22:37:27 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:42.758 22:37:27 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:42.758 22:37:27 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:42.758 22:37:27 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:42.758 22:37:27 accel -- common/autotest_common.sh@10 -- # set +x 00:08:42.758 ************************************ 00:08:42.758 START TEST accel_cdev_decomp_full 00:08:42.758 ************************************ 00:08:42.758 22:37:27 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:42.758 22:37:27 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:08:42.758 22:37:27 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:08:42.758 22:37:27 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:42.758 22:37:27 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:42.758 22:37:27 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:42.758 22:37:27 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:42.758 22:37:27 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:08:42.758 22:37:27 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:42.758 22:37:27 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:42.758 22:37:27 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:42.758 22:37:27 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:42.758 22:37:27 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:42.759 22:37:27 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:42.759 22:37:27 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:08:42.759 22:37:27 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:08:42.759 [2024-07-15 22:37:27.517660] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:08:42.759 [2024-07-15 22:37:27.517721] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2675251 ] 00:08:42.759 [2024-07-15 22:37:27.648113] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:43.017 [2024-07-15 22:37:27.753339] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:43.954 [2024-07-15 22:37:28.548107] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:43.954 [2024-07-15 22:37:28.550712] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1868080 PMD being used: compress_qat 00:08:43.954 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:43.954 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.954 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.954 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.954 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:43.954 [2024-07-15 22:37:28.554084] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1867ce0 PMD being used: compress_qat 00:08:43.954 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.954 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.954 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.954 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:43.954 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.954 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.954 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.954 22:37:28 
accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:08:43.954 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.954 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.954 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.954 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:43.954 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.954 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.954 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.954 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:43.954 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.954 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.954 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.954 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- 
accel/accel.sh@21 -- # case "$var" in 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.955 22:37:28 
accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:43.955 22:37:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 
00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == 
\d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:44.892 00:08:44.892 real 0m2.254s 00:08:44.892 user 0m1.672s 00:08:44.892 sys 0m0.573s 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:44.892 22:37:29 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:08:44.892 ************************************ 00:08:44.892 END TEST accel_cdev_decomp_full 00:08:44.892 ************************************ 00:08:44.892 22:37:29 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:44.892 22:37:29 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:44.892 22:37:29 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:44.892 22:37:29 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:44.892 22:37:29 accel -- common/autotest_common.sh@10 -- # set +x 00:08:45.152 ************************************ 00:08:45.152 START TEST accel_cdev_decomp_mcore 00:08:45.152 ************************************ 00:08:45.152 22:37:29 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:45.152 22:37:29 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:45.152 22:37:29 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:45.152 22:37:29 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.152 22:37:29 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.152 22:37:29 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:45.152 22:37:29 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:45.152 22:37:29 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:45.152 22:37:29 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:45.152 22:37:29 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:45.152 22:37:29 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:45.152 22:37:29 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:45.152 22:37:29 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:45.152 22:37:29 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:45.152 22:37:29 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:45.152 22:37:29 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:45.152 [2024-07-15 22:37:29.855584] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:08:45.152 [2024-07-15 22:37:29.855651] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2675615 ] 00:08:45.152 [2024-07-15 22:37:29.976074] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:45.411 [2024-07-15 22:37:30.089453] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:45.411 [2024-07-15 22:37:30.089555] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:45.411 [2024-07-15 22:37:30.089654] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:45.411 [2024-07-15 22:37:30.089654] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:45.977 [2024-07-15 22:37:30.859786] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:45.977 [2024-07-15 22:37:30.862415] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x9c2720 PMD being used: compress_qat 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.977 
22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:45.977 [2024-07-15 22:37:30.868414] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7ff20819b8b0 PMD being used: compress_qat 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:45.977 [2024-07-15 22:37:30.870237] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x9c79f0 PMD being used: compress_qat 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.977 22:37:30 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.977 [2024-07-15 22:37:30.873146] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7ff20019b8b0 PMD being used: compress_qat 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:45.977 [2024-07-15 22:37:30.873462] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7ff1f819b8b0 PMD being used: compress_qat 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:45.977 22:37:30 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.977 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.978 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:45.978 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.978 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.978 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:45.978 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:45.978 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.978 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.978 22:37:30 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@19 -- # read -r var val 00:08:45.978 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:45.978 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:45.978 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:45.978 22:37:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.348 
22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:47.348 00:08:47.348 real 0m2.250s 00:08:47.348 user 0m7.257s 00:08:47.348 
sys 0m0.597s 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:47.348 22:37:32 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:47.348 ************************************ 00:08:47.348 END TEST accel_cdev_decomp_mcore 00:08:47.348 ************************************ 00:08:47.348 22:37:32 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:47.348 22:37:32 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:47.348 22:37:32 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:47.348 22:37:32 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:47.348 22:37:32 accel -- common/autotest_common.sh@10 -- # set +x 00:08:47.348 ************************************ 00:08:47.348 START TEST accel_cdev_decomp_full_mcore 00:08:47.348 ************************************ 00:08:47.348 22:37:32 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:47.348 22:37:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:47.348 22:37:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:47.348 22:37:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:47.348 22:37:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:47.348 22:37:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:47.348 22:37:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 
-w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:47.348 22:37:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:47.348 22:37:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:47.348 22:37:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:47.348 22:37:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:47.348 22:37:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:47.348 22:37:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:47.348 22:37:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:47.348 22:37:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:47.348 22:37:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:47.348 [2024-07-15 22:37:32.190086] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:08:47.348 [2024-07-15 22:37:32.190147] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2675986 ] 00:08:47.606 [2024-07-15 22:37:32.320401] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:47.606 [2024-07-15 22:37:32.427147] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:47.606 [2024-07-15 22:37:32.427248] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:47.606 [2024-07-15 22:37:32.427349] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:47.606 [2024-07-15 22:37:32.427350] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:48.540 [2024-07-15 22:37:33.193823] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:48.540 [2024-07-15 22:37:33.196462] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1ed4720 PMD being used: compress_qat 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:48.540 [2024-07-15 22:37:33.201605] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f37c819b8b0 PMD being used: compress_qat 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:48.540 [2024-07-15 22:37:33.203570] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1ed7a30 PMD being used: compress_qat 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:48.540 [2024-07-15 22:37:33.206649] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f37c019b8b0 PMD being used: compress_qat 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.540 [2024-07-15 22:37:33.206909] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f37b819b8b0 PMD being used: compress_qat 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- 
# read -r var val 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:48.540 22:37:33 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:49.915 22:37:34 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:49.915 22:37:34 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:49.915 00:08:49.915 real 0m2.249s 00:08:49.915 user 0m7.222s 00:08:49.915 sys 0m0.624s 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:49.915 22:37:34 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:49.915 ************************************ 00:08:49.915 END TEST accel_cdev_decomp_full_mcore 00:08:49.915 ************************************ 00:08:49.915 22:37:34 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:49.915 22:37:34 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:49.915 22:37:34 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:08:49.915 22:37:34 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:49.915 22:37:34 accel -- common/autotest_common.sh@10 -- # set +x 00:08:49.915 ************************************ 00:08:49.915 START TEST accel_cdev_decomp_mthread 00:08:49.915 ************************************ 00:08:49.915 22:37:34 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:49.915 22:37:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:49.915 22:37:34 accel.accel_cdev_decomp_mthread -- 
accel/accel.sh@17 -- # local accel_module 00:08:49.915 22:37:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:49.915 22:37:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:49.915 22:37:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:49.915 22:37:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:49.915 22:37:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:49.915 22:37:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:49.915 22:37:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:49.915 22:37:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:49.915 22:37:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:49.915 22:37:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:49.915 22:37:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:49.915 22:37:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:49.915 22:37:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:49.915 [2024-07-15 22:37:34.519198] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:08:49.915 [2024-07-15 22:37:34.519259] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2676208 ] 00:08:49.915 [2024-07-15 22:37:34.650989] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:49.915 [2024-07-15 22:37:34.753798] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:50.853 [2024-07-15 22:37:35.515522] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:50.853 [2024-07-15 22:37:35.518095] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xf70080 PMD being used: compress_qat 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:50.853 [2024-07-15 22:37:35.523040] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xf752a0 PMD being used: compress_qat 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var 
val 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.853 [2024-07-15 22:37:35.525589] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x10980f0 PMD being used: compress_qat 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.853 
22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.853 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.854 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:50.854 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.854 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.854 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:50.854 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:50.854 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:50.854 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:50.854 22:37:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- 
# val= 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:51.790 00:08:51.790 real 0m2.211s 00:08:51.790 user 0m1.634s 00:08:51.790 sys 0m0.576s 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:51.790 22:37:36 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:51.790 ************************************ 00:08:51.790 END TEST accel_cdev_decomp_mthread 00:08:51.790 ************************************ 00:08:52.049 22:37:36 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:52.049 22:37:36 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:52.049 22:37:36 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:52.049 22:37:36 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:52.049 22:37:36 accel -- common/autotest_common.sh@10 -- # set +x 00:08:52.049 ************************************ 00:08:52.049 START TEST accel_cdev_decomp_full_mthread 00:08:52.049 ************************************ 00:08:52.049 22:37:36 accel.accel_cdev_decomp_full_mthread -- 
common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:52.049 22:37:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:52.049 22:37:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:52.049 22:37:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:52.049 22:37:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:52.049 22:37:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:52.049 22:37:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:52.049 22:37:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:52.049 22:37:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:52.049 22:37:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:52.049 22:37:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:52.049 22:37:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:52.049 22:37:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:52.049 22:37:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:52.049 22:37:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:52.049 22:37:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 
00:08:52.049 [2024-07-15 22:37:36.812754] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:08:52.049 [2024-07-15 22:37:36.812820] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2676560 ] 00:08:52.049 [2024-07-15 22:37:36.942538] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:52.308 [2024-07-15 22:37:37.047753] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:53.243 [2024-07-15 22:37:37.808380] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:53.243 [2024-07-15 22:37:37.810943] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xdcb080 PMD being used: compress_qat 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:53.243 [2024-07-15 22:37:37.815076] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xdce3b0 PMD being used: compress_qat 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 
00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:53.243 [2024-07-15 22:37:37.817923] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xef2cc0 PMD being used: compress_qat 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.243 22:37:37 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:53.243 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 
00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.244 22:37:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:54.178 22:37:38 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:54.178 22:37:38 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.178 22:37:38 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:54.178 22:37:38 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:54.178 22:37:38 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:54.178 22:37:38 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.178 22:37:38 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:54.178 22:37:38 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:54.178 22:37:38 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:54.178 22:37:38 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.178 22:37:38 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:54.178 22:37:38 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:54.178 22:37:38 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:54.178 22:37:39 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.178 22:37:39 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:54.178 22:37:39 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:54.178 22:37:39 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:54.178 22:37:39 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.178 22:37:39 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:54.178 22:37:39 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:54.178 22:37:39 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:54.178 22:37:39 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.178 22:37:39 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:54.178 22:37:39 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:54.178 22:37:39 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:54.178 22:37:39 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.178 22:37:39 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:54.178 22:37:39 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:54.178 22:37:39 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:54.178 22:37:39 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:54.178 22:37:39 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:54.178 00:08:54.178 real 0m2.229s 00:08:54.178 user 0m1.662s 00:08:54.178 sys 0m0.571s 00:08:54.178 22:37:39 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:54.178 22:37:39 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:54.178 ************************************ 00:08:54.178 END TEST accel_cdev_decomp_full_mthread 00:08:54.178 ************************************ 00:08:54.178 22:37:39 accel -- 
common/autotest_common.sh@1142 -- # return 0 00:08:54.178 22:37:39 accel -- accel/accel.sh@134 -- # unset COMPRESSDEV 00:08:54.178 22:37:39 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:54.178 22:37:39 accel -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:08:54.178 22:37:39 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:54.178 22:37:39 accel -- common/autotest_common.sh@10 -- # set +x 00:08:54.178 22:37:39 accel -- accel/accel.sh@137 -- # build_accel_config 00:08:54.178 22:37:39 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:54.178 22:37:39 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:54.178 22:37:39 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:54.178 22:37:39 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:54.178 22:37:39 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:54.178 22:37:39 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:54.178 22:37:39 accel -- accel/accel.sh@41 -- # jq -r . 00:08:54.178 ************************************ 00:08:54.178 START TEST accel_dif_functional_tests 00:08:54.178 ************************************ 00:08:54.178 22:37:39 accel.accel_dif_functional_tests -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:54.436 [2024-07-15 22:37:39.143254] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:08:54.436 [2024-07-15 22:37:39.143316] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2676930 ] 00:08:54.436 [2024-07-15 22:37:39.270346] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:54.694 [2024-07-15 22:37:39.376603] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:54.694 [2024-07-15 22:37:39.376704] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:54.694 [2024-07-15 22:37:39.376707] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:54.694 00:08:54.694 00:08:54.694 CUnit - A unit testing framework for C - Version 2.1-3 00:08:54.694 http://cunit.sourceforge.net/ 00:08:54.694 00:08:54.694 00:08:54.694 Suite: accel_dif 00:08:54.694 Test: verify: DIF generated, GUARD check ...passed 00:08:54.694 Test: verify: DIF generated, APPTAG check ...passed 00:08:54.694 Test: verify: DIF generated, REFTAG check ...passed 00:08:54.694 Test: verify: DIF not generated, GUARD check ...[2024-07-15 22:37:39.472459] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:54.694 passed 00:08:54.694 Test: verify: DIF not generated, APPTAG check ...[2024-07-15 22:37:39.472540] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:54.694 passed 00:08:54.694 Test: verify: DIF not generated, REFTAG check ...[2024-07-15 22:37:39.472577] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:54.694 passed 00:08:54.694 Test: verify: APPTAG correct, APPTAG check ...passed 00:08:54.694 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-15 22:37:39.472653] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:08:54.694 passed 
00:08:54.694 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:08:54.694 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:08:54.694 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:08:54.694 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-15 22:37:39.472821] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:08:54.694 passed 00:08:54.694 Test: verify copy: DIF generated, GUARD check ...passed 00:08:54.694 Test: verify copy: DIF generated, APPTAG check ...passed 00:08:54.694 Test: verify copy: DIF generated, REFTAG check ...passed 00:08:54.694 Test: verify copy: DIF not generated, GUARD check ...[2024-07-15 22:37:39.473008] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:54.694 passed 00:08:54.694 Test: verify copy: DIF not generated, APPTAG check ...[2024-07-15 22:37:39.473047] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:54.694 passed 00:08:54.694 Test: verify copy: DIF not generated, REFTAG check ...[2024-07-15 22:37:39.473092] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:54.694 passed 00:08:54.694 Test: generate copy: DIF generated, GUARD check ...passed 00:08:54.694 Test: generate copy: DIF generated, APTTAG check ...passed 00:08:54.694 Test: generate copy: DIF generated, REFTAG check ...passed 00:08:54.694 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:08:54.694 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:08:54.694 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:08:54.694 Test: generate copy: iovecs-len validate ...[2024-07-15 22:37:39.473374] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:08:54.694 passed 00:08:54.694 Test: generate copy: buffer alignment validate ...passed 00:08:54.694 00:08:54.694 Run Summary: Type Total Ran Passed Failed Inactive 00:08:54.694 suites 1 1 n/a 0 0 00:08:54.694 tests 26 26 26 0 0 00:08:54.694 asserts 115 115 115 0 n/a 00:08:54.694 00:08:54.694 Elapsed time = 0.003 seconds 00:08:54.954 00:08:54.954 real 0m0.608s 00:08:54.954 user 0m0.855s 00:08:54.954 sys 0m0.224s 00:08:54.954 22:37:39 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:54.954 22:37:39 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:08:54.954 ************************************ 00:08:54.954 END TEST accel_dif_functional_tests 00:08:54.954 ************************************ 00:08:54.954 22:37:39 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:54.954 00:08:54.954 real 0m53.938s 00:08:54.954 user 1m2.217s 00:08:54.954 sys 0m12.071s 00:08:54.954 22:37:39 accel -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:54.954 22:37:39 accel -- common/autotest_common.sh@10 -- # set +x 00:08:54.954 ************************************ 00:08:54.954 END TEST accel 00:08:54.954 ************************************ 00:08:54.954 22:37:39 -- common/autotest_common.sh@1142 -- # return 0 00:08:54.954 22:37:39 -- spdk/autotest.sh@184 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:54.954 22:37:39 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:54.954 22:37:39 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:54.954 22:37:39 -- common/autotest_common.sh@10 -- # set +x 00:08:54.954 ************************************ 00:08:54.954 START TEST accel_rpc 00:08:54.954 ************************************ 00:08:54.954 22:37:39 accel_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:55.212 * Looking for test storage... 
00:08:55.212 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:08:55.212 22:37:39 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:55.212 22:37:39 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=2676999 00:08:55.212 22:37:39 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 2676999 00:08:55.212 22:37:39 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:08:55.212 22:37:39 accel_rpc -- common/autotest_common.sh@829 -- # '[' -z 2676999 ']' 00:08:55.212 22:37:39 accel_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:55.212 22:37:39 accel_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:55.212 22:37:39 accel_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:55.212 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:55.212 22:37:39 accel_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:55.212 22:37:39 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:55.212 [2024-07-15 22:37:40.000425] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:08:55.212 [2024-07-15 22:37:40.000501] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2676999 ] 00:08:55.470 [2024-07-15 22:37:40.129246] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:55.470 [2024-07-15 22:37:40.234170] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:56.407 22:37:41 accel_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:56.407 22:37:41 accel_rpc -- common/autotest_common.sh@862 -- # return 0 00:08:56.407 22:37:41 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:08:56.407 22:37:41 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:08:56.407 22:37:41 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:08:56.407 22:37:41 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:08:56.407 22:37:41 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:08:56.407 22:37:41 accel_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:56.407 22:37:41 accel_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:56.407 22:37:41 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:56.407 ************************************ 00:08:56.407 START TEST accel_assign_opcode 00:08:56.407 ************************************ 00:08:56.407 22:37:41 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1123 -- # accel_assign_opcode_test_suite 00:08:56.407 22:37:41 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:08:56.407 22:37:41 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:56.407 22:37:41 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:56.407 [2024-07-15 22:37:41.249227] accel_rpc.c: 
167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:08:56.407 22:37:41 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:56.407 22:37:41 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:08:56.407 22:37:41 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:56.407 22:37:41 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:56.407 [2024-07-15 22:37:41.257241] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:08:56.407 22:37:41 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:56.407 22:37:41 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:08:56.407 22:37:41 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:56.407 22:37:41 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:56.666 22:37:41 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:56.666 22:37:41 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:08:56.666 22:37:41 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:08:56.666 22:37:41 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:56.666 22:37:41 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:56.666 22:37:41 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:08:56.666 22:37:41 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:56.666 software 00:08:56.666 00:08:56.666 real 0m0.317s 00:08:56.666 user 0m0.053s 00:08:56.666 sys 0m0.011s 00:08:56.666 22:37:41 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # 
xtrace_disable 00:08:56.666 22:37:41 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:56.666 ************************************ 00:08:56.666 END TEST accel_assign_opcode 00:08:56.666 ************************************ 00:08:56.924 22:37:41 accel_rpc -- common/autotest_common.sh@1142 -- # return 0 00:08:56.924 22:37:41 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 2676999 00:08:56.924 22:37:41 accel_rpc -- common/autotest_common.sh@948 -- # '[' -z 2676999 ']' 00:08:56.925 22:37:41 accel_rpc -- common/autotest_common.sh@952 -- # kill -0 2676999 00:08:56.925 22:37:41 accel_rpc -- common/autotest_common.sh@953 -- # uname 00:08:56.925 22:37:41 accel_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:56.925 22:37:41 accel_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2676999 00:08:56.925 22:37:41 accel_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:56.925 22:37:41 accel_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:56.925 22:37:41 accel_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2676999' 00:08:56.925 killing process with pid 2676999 00:08:56.925 22:37:41 accel_rpc -- common/autotest_common.sh@967 -- # kill 2676999 00:08:56.925 22:37:41 accel_rpc -- common/autotest_common.sh@972 -- # wait 2676999 00:08:57.184 00:08:57.184 real 0m2.172s 00:08:57.184 user 0m2.461s 00:08:57.184 sys 0m0.635s 00:08:57.184 22:37:41 accel_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:57.184 22:37:41 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:57.184 ************************************ 00:08:57.184 END TEST accel_rpc 00:08:57.184 ************************************ 00:08:57.184 22:37:42 -- common/autotest_common.sh@1142 -- # return 0 00:08:57.184 22:37:42 -- spdk/autotest.sh@185 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:57.184 22:37:42 -- 
common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:57.184 22:37:42 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:57.184 22:37:42 -- common/autotest_common.sh@10 -- # set +x 00:08:57.184 ************************************ 00:08:57.184 START TEST app_cmdline 00:08:57.184 ************************************ 00:08:57.184 22:37:42 app_cmdline -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:57.446 * Looking for test storage... 00:08:57.446 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:57.446 22:37:42 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:08:57.446 22:37:42 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=2677420 00:08:57.446 22:37:42 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 2677420 00:08:57.446 22:37:42 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:08:57.446 22:37:42 app_cmdline -- common/autotest_common.sh@829 -- # '[' -z 2677420 ']' 00:08:57.446 22:37:42 app_cmdline -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:57.446 22:37:42 app_cmdline -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:57.446 22:37:42 app_cmdline -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:57.446 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:57.446 22:37:42 app_cmdline -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:57.446 22:37:42 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:57.446 [2024-07-15 22:37:42.258522] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:08:57.446 [2024-07-15 22:37:42.258600] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2677420 ] 00:08:57.704 [2024-07-15 22:37:42.389203] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:57.705 [2024-07-15 22:37:42.496757] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:58.272 22:37:43 app_cmdline -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:58.272 22:37:43 app_cmdline -- common/autotest_common.sh@862 -- # return 0 00:08:58.272 22:37:43 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:08:58.531 { 00:08:58.531 "version": "SPDK v24.09-pre git sha1 4903ec649", 00:08:58.531 "fields": { 00:08:58.531 "major": 24, 00:08:58.531 "minor": 9, 00:08:58.531 "patch": 0, 00:08:58.531 "suffix": "-pre", 00:08:58.531 "commit": "4903ec649" 00:08:58.531 } 00:08:58.531 } 00:08:58.531 22:37:43 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:08:58.531 22:37:43 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:08:58.531 22:37:43 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:08:58.531 22:37:43 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:08:58.531 22:37:43 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:08:58.531 22:37:43 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:08:58.531 22:37:43 app_cmdline -- app/cmdline.sh@26 -- # sort 00:08:58.531 22:37:43 app_cmdline -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:58.531 22:37:43 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:58.532 22:37:43 app_cmdline -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:58.532 22:37:43 app_cmdline -- 
app/cmdline.sh@27 -- # (( 2 == 2 )) 00:08:58.532 22:37:43 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:08:58.532 22:37:43 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:58.532 22:37:43 app_cmdline -- common/autotest_common.sh@648 -- # local es=0 00:08:58.532 22:37:43 app_cmdline -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:58.532 22:37:43 app_cmdline -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:58.532 22:37:43 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:58.532 22:37:43 app_cmdline -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:58.532 22:37:43 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:58.532 22:37:43 app_cmdline -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:58.532 22:37:43 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:58.532 22:37:43 app_cmdline -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:08:58.532 22:37:43 app_cmdline -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:08:58.532 22:37:43 app_cmdline -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:08:58.791 request: 00:08:58.791 { 00:08:58.791 "method": "env_dpdk_get_mem_stats", 00:08:58.791 "req_id": 1 00:08:58.791 } 00:08:58.791 Got JSON-RPC error response 00:08:58.791 response: 00:08:58.791 { 00:08:58.791 
"code": -32601, 00:08:58.791 "message": "Method not found" 00:08:58.791 } 00:08:58.791 22:37:43 app_cmdline -- common/autotest_common.sh@651 -- # es=1 00:08:58.791 22:37:43 app_cmdline -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:58.791 22:37:43 app_cmdline -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:08:58.791 22:37:43 app_cmdline -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:58.791 22:37:43 app_cmdline -- app/cmdline.sh@1 -- # killprocess 2677420 00:08:58.791 22:37:43 app_cmdline -- common/autotest_common.sh@948 -- # '[' -z 2677420 ']' 00:08:58.791 22:37:43 app_cmdline -- common/autotest_common.sh@952 -- # kill -0 2677420 00:08:58.791 22:37:43 app_cmdline -- common/autotest_common.sh@953 -- # uname 00:08:58.791 22:37:43 app_cmdline -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:58.791 22:37:43 app_cmdline -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2677420 00:08:58.791 22:37:43 app_cmdline -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:58.791 22:37:43 app_cmdline -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:58.791 22:37:43 app_cmdline -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2677420' 00:08:58.791 killing process with pid 2677420 00:08:58.791 22:37:43 app_cmdline -- common/autotest_common.sh@967 -- # kill 2677420 00:08:58.791 22:37:43 app_cmdline -- common/autotest_common.sh@972 -- # wait 2677420 00:08:59.357 00:08:59.357 real 0m2.008s 00:08:59.357 user 0m2.350s 00:08:59.357 sys 0m0.630s 00:08:59.357 22:37:44 app_cmdline -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:59.357 22:37:44 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:59.357 ************************************ 00:08:59.357 END TEST app_cmdline 00:08:59.357 ************************************ 00:08:59.357 22:37:44 -- common/autotest_common.sh@1142 -- # return 0 00:08:59.357 22:37:44 -- spdk/autotest.sh@186 -- # run_test version 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:08:59.357 22:37:44 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:59.357 22:37:44 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:59.357 22:37:44 -- common/autotest_common.sh@10 -- # set +x 00:08:59.357 ************************************ 00:08:59.357 START TEST version 00:08:59.357 ************************************ 00:08:59.357 22:37:44 version -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:08:59.617 * Looking for test storage... 00:08:59.617 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:59.617 22:37:44 version -- app/version.sh@17 -- # get_header_version major 00:08:59.617 22:37:44 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:59.617 22:37:44 version -- app/version.sh@14 -- # tr -d '"' 00:08:59.617 22:37:44 version -- app/version.sh@14 -- # cut -f2 00:08:59.617 22:37:44 version -- app/version.sh@17 -- # major=24 00:08:59.617 22:37:44 version -- app/version.sh@18 -- # get_header_version minor 00:08:59.617 22:37:44 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:59.617 22:37:44 version -- app/version.sh@14 -- # cut -f2 00:08:59.617 22:37:44 version -- app/version.sh@14 -- # tr -d '"' 00:08:59.617 22:37:44 version -- app/version.sh@18 -- # minor=9 00:08:59.617 22:37:44 version -- app/version.sh@19 -- # get_header_version patch 00:08:59.617 22:37:44 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:59.617 22:37:44 version -- app/version.sh@14 -- # cut -f2 00:08:59.617 22:37:44 version -- app/version.sh@14 -- # tr -d '"' 00:08:59.617 
22:37:44 version -- app/version.sh@19 -- # patch=0 00:08:59.617 22:37:44 version -- app/version.sh@20 -- # get_header_version suffix 00:08:59.617 22:37:44 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:08:59.617 22:37:44 version -- app/version.sh@14 -- # cut -f2 00:08:59.617 22:37:44 version -- app/version.sh@14 -- # tr -d '"' 00:08:59.617 22:37:44 version -- app/version.sh@20 -- # suffix=-pre 00:08:59.617 22:37:44 version -- app/version.sh@22 -- # version=24.9 00:08:59.617 22:37:44 version -- app/version.sh@25 -- # (( patch != 0 )) 00:08:59.617 22:37:44 version -- app/version.sh@28 -- # version=24.9rc0 00:08:59.617 22:37:44 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:08:59.617 22:37:44 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:08:59.617 22:37:44 version -- app/version.sh@30 -- # py_version=24.9rc0 00:08:59.617 22:37:44 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:08:59.617 00:08:59.617 real 0m0.189s 00:08:59.617 user 0m0.099s 00:08:59.617 sys 0m0.137s 00:08:59.617 22:37:44 version -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:59.617 22:37:44 version -- common/autotest_common.sh@10 -- # set +x 00:08:59.617 ************************************ 00:08:59.617 END TEST version 00:08:59.617 ************************************ 00:08:59.617 22:37:44 -- common/autotest_common.sh@1142 -- # return 0 00:08:59.617 22:37:44 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']' 00:08:59.617 22:37:44 -- spdk/autotest.sh@189 -- # run_test blockdev_general 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:08:59.617 22:37:44 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:59.617 22:37:44 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:59.617 22:37:44 -- common/autotest_common.sh@10 -- # set +x 00:08:59.617 ************************************ 00:08:59.617 START TEST blockdev_general 00:08:59.617 ************************************ 00:08:59.617 22:37:44 blockdev_general -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:08:59.876 * Looking for test storage... 00:08:59.876 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:08:59.876 22:37:44 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:08:59.876 22:37:44 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:08:59.876 22:37:44 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:08:59.876 22:37:44 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:08:59.876 22:37:44 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:08:59.877 22:37:44 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:08:59.877 22:37:44 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:08:59.877 22:37:44 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:08:59.877 22:37:44 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:08:59.877 22:37:44 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:08:59.877 22:37:44 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:08:59.877 22:37:44 blockdev_general -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 
00:08:59.877 22:37:44 blockdev_general -- bdev/blockdev.sh@674 -- # uname -s 00:08:59.877 22:37:44 blockdev_general -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:08:59.877 22:37:44 blockdev_general -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:08:59.877 22:37:44 blockdev_general -- bdev/blockdev.sh@682 -- # test_type=bdev 00:08:59.877 22:37:44 blockdev_general -- bdev/blockdev.sh@683 -- # crypto_device= 00:08:59.877 22:37:44 blockdev_general -- bdev/blockdev.sh@684 -- # dek= 00:08:59.877 22:37:44 blockdev_general -- bdev/blockdev.sh@685 -- # env_ctx= 00:08:59.877 22:37:44 blockdev_general -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:08:59.877 22:37:44 blockdev_general -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:08:59.877 22:37:44 blockdev_general -- bdev/blockdev.sh@690 -- # [[ bdev == bdev ]] 00:08:59.877 22:37:44 blockdev_general -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:08:59.877 22:37:44 blockdev_general -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:08:59.877 22:37:44 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2677888 00:08:59.877 22:37:44 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:08:59.877 22:37:44 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 2677888 00:08:59.877 22:37:44 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:08:59.877 22:37:44 blockdev_general -- common/autotest_common.sh@829 -- # '[' -z 2677888 ']' 00:08:59.877 22:37:44 blockdev_general -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:59.877 22:37:44 blockdev_general -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:59.877 22:37:44 blockdev_general -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:08:59.877 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:59.877 22:37:44 blockdev_general -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:59.877 22:37:44 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:59.877 [2024-07-15 22:37:44.638598] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:08:59.877 [2024-07-15 22:37:44.638680] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2677888 ] 00:08:59.877 [2024-07-15 22:37:44.769414] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:00.136 [2024-07-15 22:37:44.874582] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:00.704 22:37:45 blockdev_general -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:00.704 22:37:45 blockdev_general -- common/autotest_common.sh@862 -- # return 0 00:09:00.704 22:37:45 blockdev_general -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:09:00.704 22:37:45 blockdev_general -- bdev/blockdev.sh@696 -- # setup_bdev_conf 00:09:00.704 22:37:45 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:09:00.704 22:37:45 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:00.704 22:37:45 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:00.963 [2024-07-15 22:37:45.768998] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:00.963 [2024-07-15 22:37:45.769055] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:00.963 00:09:00.963 [2024-07-15 22:37:45.776975] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:00.963 [2024-07-15 22:37:45.777000] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: Malloc2 00:09:00.963 00:09:00.963 Malloc0 00:09:00.963 Malloc1 00:09:00.963 Malloc2 00:09:00.963 Malloc3 00:09:00.963 Malloc4 00:09:00.963 Malloc5 00:09:01.221 Malloc6 00:09:01.221 Malloc7 00:09:01.221 Malloc8 00:09:01.221 Malloc9 00:09:01.221 [2024-07-15 22:37:45.925696] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:01.221 [2024-07-15 22:37:45.925748] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:01.221 [2024-07-15 22:37:45.925769] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10d4350 00:09:01.221 [2024-07-15 22:37:45.925781] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:01.221 [2024-07-15 22:37:45.927146] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:01.221 [2024-07-15 22:37:45.927183] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:01.221 TestPT 00:09:01.221 22:37:45 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:01.221 22:37:45 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:09:01.221 5000+0 records in 00:09:01.221 5000+0 records out 00:09:01.221 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0377845 s, 271 MB/s 00:09:01.221 22:37:46 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:09:01.221 22:37:46 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:01.221 22:37:46 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:01.221 AIO0 00:09:01.221 22:37:46 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:01.221 22:37:46 blockdev_general -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:09:01.221 22:37:46 blockdev_general -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:09:01.221 22:37:46 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:01.221 22:37:46 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:01.221 22:37:46 blockdev_general -- bdev/blockdev.sh@740 -- # cat 00:09:01.221 22:37:46 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:09:01.221 22:37:46 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:01.221 22:37:46 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:01.221 22:37:46 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:01.221 22:37:46 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:09:01.221 22:37:46 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:01.221 22:37:46 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:01.221 22:37:46 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:01.221 22:37:46 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:09:01.221 22:37:46 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:01.221 22:37:46 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:01.481 22:37:46 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:01.481 22:37:46 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:09:01.481 22:37:46 blockdev_general -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:09:01.481 22:37:46 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:09:01.481 22:37:46 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:01.481 22:37:46 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:01.481 22:37:46 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:01.481 22:37:46 blockdev_general 
-- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:09:01.481 22:37:46 blockdev_general -- bdev/blockdev.sh@749 -- # jq -r .name 00:09:01.482 22:37:46 blockdev_general -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "543f3082-5cd5-4b98-8b6c-5d095d9e3758"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "543f3082-5cd5-4b98-8b6c-5d095d9e3758",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "70764dc3-7b4f-5bb1-89ea-a634ff063d61"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "70764dc3-7b4f-5bb1-89ea-a634ff063d61",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' 
"compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "6927536a-d231-5a65-92a8-5204d93cabcb"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "6927536a-d231-5a65-92a8-5204d93cabcb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "7e1fe307-5b41-5406-b965-29cfe4d48a2b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "7e1fe307-5b41-5406-b965-29cfe4d48a2b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' 
"abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "b4fd3a36-da61-57e1-b2a8-1243a4218e3b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b4fd3a36-da61-57e1-b2a8-1243a4218e3b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "afabbe89-bc87-59ab-8201-ef979863ab26"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "afabbe89-bc87-59ab-8201-ef979863ab26",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": 
false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "ea0c27f7-c2be-5d23-8ad3-87ec4ccf3f53"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ea0c27f7-c2be-5d23-8ad3-87ec4ccf3f53",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "351128e7-b0e4-59a2-89b5-ea7cd0d20cfe"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "351128e7-b0e4-59a2-89b5-ea7cd0d20cfe",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' 
' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "fb9edc61-11c1-5e8d-901e-379d01920b03"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "fb9edc61-11c1-5e8d-901e-379d01920b03",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "a107930e-ef5b-5861-b75a-0fe27d1226c8"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a107930e-ef5b-5861-b75a-0fe27d1226c8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' 
"base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "85ebfcb3-e4d9-594d-929f-da5fa32367f8"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "85ebfcb3-e4d9-594d-929f-da5fa32367f8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "09d72fc6-6518-5e3b-bd48-b72dc8ade0e3"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "09d72fc6-6518-5e3b-bd48-b72dc8ade0e3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' 
' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "880aa874-df25-4892-98d1-76ab48417e59"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "880aa874-df25-4892-98d1-76ab48417e59",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "880aa874-df25-4892-98d1-76ab48417e59",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "4b331d25-7e68-4fcf-bdc4-16f072e5ef88",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "d5affa4b-0bdf-46bb-b830-9bf57f54a9af",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' 
'}' '{' ' "name": "concat0",' ' "aliases": [' ' "95b0c80b-1380-490d-a65a-d7a8f4841174"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "95b0c80b-1380-490d-a65a-d7a8f4841174",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "95b0c80b-1380-490d-a65a-d7a8f4841174",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "e536e026-72f2-4005-8950-c4a1a3ae840f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "6bad086f-72e1-4a3d-9e0a-986c6b8670c1",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "205d68f5-e544-49ed-8771-fdc304e600ba"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' 
"uuid": "205d68f5-e544-49ed-8771-fdc304e600ba",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "205d68f5-e544-49ed-8771-fdc304e600ba",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "172ff4d5-43ab-4167-be90-5f71c758713b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "b7cdca5c-c5e6-4077-9283-a24ff05f66e5",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "1e2758cf-cb5c-4c54-98f4-2720fcb49b24"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "1e2758cf-cb5c-4c54-98f4-2720fcb49b24",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' 
' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:01.741 22:37:46 blockdev_general -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:09:01.741 22:37:46 blockdev_general -- bdev/blockdev.sh@752 -- # hello_world_bdev=Malloc0 00:09:01.741 22:37:46 blockdev_general -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:09:01.741 22:37:46 blockdev_general -- bdev/blockdev.sh@754 -- # killprocess 2677888 00:09:01.741 22:37:46 blockdev_general -- common/autotest_common.sh@948 -- # '[' -z 2677888 ']' 00:09:01.741 22:37:46 blockdev_general -- common/autotest_common.sh@952 -- # kill -0 2677888 00:09:01.741 22:37:46 blockdev_general -- common/autotest_common.sh@953 -- # uname 00:09:01.741 22:37:46 blockdev_general -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:01.741 22:37:46 blockdev_general -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2677888 00:09:01.741 22:37:46 blockdev_general -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:01.741 22:37:46 blockdev_general -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:01.741 22:37:46 blockdev_general -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2677888' 00:09:01.741 killing process with pid 2677888 00:09:01.741 22:37:46 blockdev_general -- 
common/autotest_common.sh@967 -- # kill 2677888 00:09:01.741 22:37:46 blockdev_general -- common/autotest_common.sh@972 -- # wait 2677888 00:09:02.308 22:37:46 blockdev_general -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:09:02.308 22:37:46 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:09:02.308 22:37:46 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:09:02.308 22:37:46 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:02.308 22:37:46 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:02.308 ************************************ 00:09:02.308 START TEST bdev_hello_world 00:09:02.308 ************************************ 00:09:02.308 22:37:47 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:09:02.308 [2024-07-15 22:37:47.075623] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:09:02.308 [2024-07-15 22:37:47.075694] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2678223 ] 00:09:02.308 [2024-07-15 22:37:47.208108] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:02.567 [2024-07-15 22:37:47.313333] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:02.567 [2024-07-15 22:37:47.463484] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:02.567 [2024-07-15 22:37:47.463541] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:02.567 [2024-07-15 22:37:47.463556] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:02.567 [2024-07-15 22:37:47.471486] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:02.567 [2024-07-15 22:37:47.471513] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:02.825 [2024-07-15 22:37:47.479497] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:02.825 [2024-07-15 22:37:47.479520] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:02.825 [2024-07-15 22:37:47.551888] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:02.825 [2024-07-15 22:37:47.551950] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:02.825 [2024-07-15 22:37:47.551970] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1eda3c0 00:09:02.825 [2024-07-15 22:37:47.551982] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:02.825 [2024-07-15 22:37:47.553391] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:09:02.825 [2024-07-15 22:37:47.553420] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:02.825 [2024-07-15 22:37:47.693385] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:09:02.825 [2024-07-15 22:37:47.693455] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:09:02.825 [2024-07-15 22:37:47.693510] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:09:02.825 [2024-07-15 22:37:47.693588] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:09:02.825 [2024-07-15 22:37:47.693664] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:09:02.825 [2024-07-15 22:37:47.693694] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:09:02.825 [2024-07-15 22:37:47.693759] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:09:02.826 00:09:02.826 [2024-07-15 22:37:47.693811] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:09:03.392 00:09:03.392 real 0m1.024s 00:09:03.392 user 0m0.690s 00:09:03.392 sys 0m0.301s 00:09:03.392 22:37:48 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:03.392 22:37:48 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:09:03.392 ************************************ 00:09:03.392 END TEST bdev_hello_world 00:09:03.392 ************************************ 00:09:03.392 22:37:48 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:03.392 22:37:48 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:09:03.392 22:37:48 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:03.392 22:37:48 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:03.392 22:37:48 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:03.392 ************************************ 00:09:03.392 START 
TEST bdev_bounds 00:09:03.392 ************************************ 00:09:03.392 22:37:48 blockdev_general.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:09:03.392 22:37:48 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2678301 00:09:03.392 22:37:48 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:09:03.392 22:37:48 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:09:03.392 22:37:48 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2678301' 00:09:03.392 Process bdevio pid: 2678301 00:09:03.392 22:37:48 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 2678301 00:09:03.392 22:37:48 blockdev_general.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2678301 ']' 00:09:03.392 22:37:48 blockdev_general.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:03.392 22:37:48 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:03.392 22:37:48 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:03.392 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:03.392 22:37:48 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:03.392 22:37:48 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:09:03.392 [2024-07-15 22:37:48.188322] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:09:03.392 [2024-07-15 22:37:48.188393] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2678301 ] 00:09:03.650 [2024-07-15 22:37:48.321352] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:09:03.650 [2024-07-15 22:37:48.436549] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:03.650 [2024-07-15 22:37:48.437977] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:09:03.650 [2024-07-15 22:37:48.437979] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:03.909 [2024-07-15 22:37:48.588201] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:03.909 [2024-07-15 22:37:48.588267] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:03.909 [2024-07-15 22:37:48.588282] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:03.909 [2024-07-15 22:37:48.596207] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:03.909 [2024-07-15 22:37:48.596234] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:03.909 [2024-07-15 22:37:48.604219] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:03.909 [2024-07-15 22:37:48.604243] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:03.909 [2024-07-15 22:37:48.679208] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:03.909 [2024-07-15 22:37:48.679268] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:03.909 [2024-07-15 22:37:48.679287] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x101b0c0 
00:09:03.909 [2024-07-15 22:37:48.679300] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:03.909 [2024-07-15 22:37:48.680760] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:03.909 [2024-07-15 22:37:48.680788] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:04.168 22:37:48 blockdev_general.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:04.168 22:37:48 blockdev_general.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:09:04.168 22:37:48 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:09:04.168 I/O targets: 00:09:04.168 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:09:04.168 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:09:04.168 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:09:04.168 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:09:04.168 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:09:04.168 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:09:04.168 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:09:04.168 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:09:04.168 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:09:04.168 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:09:04.168 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:09:04.168 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:09:04.168 raid0: 131072 blocks of 512 bytes (64 MiB) 00:09:04.168 concat0: 131072 blocks of 512 bytes (64 MiB) 00:09:04.168 raid1: 65536 blocks of 512 bytes (32 MiB) 00:09:04.168 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:09:04.168 00:09:04.168 00:09:04.168 CUnit - A unit testing framework for C - Version 2.1-3 00:09:04.168 http://cunit.sourceforge.net/ 00:09:04.168 00:09:04.168 00:09:04.168 Suite: bdevio tests on: AIO0 00:09:04.168 Test: blockdev write read block ...passed 00:09:04.168 Test: blockdev write zeroes read block ...passed 00:09:04.168 
Test: blockdev write zeroes read no split ...passed 00:09:04.168 Test: blockdev write zeroes read split ...passed 00:09:04.168 Test: blockdev write zeroes read split partial ...passed 00:09:04.168 Test: blockdev reset ...passed 00:09:04.168 Test: blockdev write read 8 blocks ...passed 00:09:04.168 Test: blockdev write read size > 128k ...passed 00:09:04.168 Test: blockdev write read invalid size ...passed 00:09:04.168 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:04.168 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.168 Test: blockdev write read max offset ...passed 00:09:04.168 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.168 Test: blockdev writev readv 8 blocks ...passed 00:09:04.168 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.168 Test: blockdev writev readv block ...passed 00:09:04.168 Test: blockdev writev readv size > 128k ...passed 00:09:04.168 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.168 Test: blockdev comparev and writev ...passed 00:09:04.168 Test: blockdev nvme passthru rw ...passed 00:09:04.168 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.168 Test: blockdev nvme admin passthru ...passed 00:09:04.168 Test: blockdev copy ...passed 00:09:04.168 Suite: bdevio tests on: raid1 00:09:04.168 Test: blockdev write read block ...passed 00:09:04.168 Test: blockdev write zeroes read block ...passed 00:09:04.168 Test: blockdev write zeroes read no split ...passed 00:09:04.168 Test: blockdev write zeroes read split ...passed 00:09:04.168 Test: blockdev write zeroes read split partial ...passed 00:09:04.168 Test: blockdev reset ...passed 00:09:04.168 Test: blockdev write read 8 blocks ...passed 00:09:04.168 Test: blockdev write read size > 128k ...passed 00:09:04.168 Test: blockdev write read invalid size ...passed 00:09:04.168 Test: blockdev write read offset + nbytes == size of blockdev ...passed 
00:09:04.168 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.168 Test: blockdev write read max offset ...passed 00:09:04.168 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.168 Test: blockdev writev readv 8 blocks ...passed 00:09:04.168 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.168 Test: blockdev writev readv block ...passed 00:09:04.168 Test: blockdev writev readv size > 128k ...passed 00:09:04.168 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.168 Test: blockdev comparev and writev ...passed 00:09:04.168 Test: blockdev nvme passthru rw ...passed 00:09:04.168 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.168 Test: blockdev nvme admin passthru ...passed 00:09:04.168 Test: blockdev copy ...passed 00:09:04.168 Suite: bdevio tests on: concat0 00:09:04.168 Test: blockdev write read block ...passed 00:09:04.168 Test: blockdev write zeroes read block ...passed 00:09:04.168 Test: blockdev write zeroes read no split ...passed 00:09:04.168 Test: blockdev write zeroes read split ...passed 00:09:04.168 Test: blockdev write zeroes read split partial ...passed 00:09:04.168 Test: blockdev reset ...passed 00:09:04.168 Test: blockdev write read 8 blocks ...passed 00:09:04.168 Test: blockdev write read size > 128k ...passed 00:09:04.168 Test: blockdev write read invalid size ...passed 00:09:04.168 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:04.168 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.168 Test: blockdev write read max offset ...passed 00:09:04.168 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.168 Test: blockdev writev readv 8 blocks ...passed 00:09:04.168 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.168 Test: blockdev writev readv block ...passed 00:09:04.168 Test: blockdev writev readv size > 128k ...passed 00:09:04.168 Test: 
blockdev writev readv size > 128k in two iovs ...passed 00:09:04.168 Test: blockdev comparev and writev ...passed 00:09:04.168 Test: blockdev nvme passthru rw ...passed 00:09:04.168 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.168 Test: blockdev nvme admin passthru ...passed 00:09:04.168 Test: blockdev copy ...passed 00:09:04.168 Suite: bdevio tests on: raid0 00:09:04.168 Test: blockdev write read block ...passed 00:09:04.168 Test: blockdev write zeroes read block ...passed 00:09:04.168 Test: blockdev write zeroes read no split ...passed 00:09:04.168 Test: blockdev write zeroes read split ...passed 00:09:04.168 Test: blockdev write zeroes read split partial ...passed 00:09:04.168 Test: blockdev reset ...passed 00:09:04.168 Test: blockdev write read 8 blocks ...passed 00:09:04.168 Test: blockdev write read size > 128k ...passed 00:09:04.168 Test: blockdev write read invalid size ...passed 00:09:04.168 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:04.168 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.168 Test: blockdev write read max offset ...passed 00:09:04.168 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.168 Test: blockdev writev readv 8 blocks ...passed 00:09:04.168 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.168 Test: blockdev writev readv block ...passed 00:09:04.168 Test: blockdev writev readv size > 128k ...passed 00:09:04.168 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.168 Test: blockdev comparev and writev ...passed 00:09:04.168 Test: blockdev nvme passthru rw ...passed 00:09:04.168 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.168 Test: blockdev nvme admin passthru ...passed 00:09:04.168 Test: blockdev copy ...passed 00:09:04.168 Suite: bdevio tests on: TestPT 00:09:04.168 Test: blockdev write read block ...passed 00:09:04.168 Test: blockdev write zeroes read block ...passed 
00:09:04.168 Test: blockdev write zeroes read no split ...passed 00:09:04.168 Test: blockdev write zeroes read split ...passed 00:09:04.428 Test: blockdev write zeroes read split partial ...passed 00:09:04.428 Test: blockdev reset ...passed 00:09:04.428 Test: blockdev write read 8 blocks ...passed 00:09:04.428 Test: blockdev write read size > 128k ...passed 00:09:04.428 Test: blockdev write read invalid size ...passed 00:09:04.428 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:04.428 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.428 Test: blockdev write read max offset ...passed 00:09:04.428 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.428 Test: blockdev writev readv 8 blocks ...passed 00:09:04.428 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.428 Test: blockdev writev readv block ...passed 00:09:04.428 Test: blockdev writev readv size > 128k ...passed 00:09:04.428 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.428 Test: blockdev comparev and writev ...passed 00:09:04.428 Test: blockdev nvme passthru rw ...passed 00:09:04.428 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.428 Test: blockdev nvme admin passthru ...passed 00:09:04.428 Test: blockdev copy ...passed 00:09:04.428 Suite: bdevio tests on: Malloc2p7 00:09:04.428 Test: blockdev write read block ...passed 00:09:04.428 Test: blockdev write zeroes read block ...passed 00:09:04.428 Test: blockdev write zeroes read no split ...passed 00:09:04.428 Test: blockdev write zeroes read split ...passed 00:09:04.428 Test: blockdev write zeroes read split partial ...passed 00:09:04.428 Test: blockdev reset ...passed 00:09:04.428 Test: blockdev write read 8 blocks ...passed 00:09:04.428 Test: blockdev write read size > 128k ...passed 00:09:04.428 Test: blockdev write read invalid size ...passed 00:09:04.428 Test: blockdev write read offset + nbytes == size of blockdev 
...passed 00:09:04.428 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.428 Test: blockdev write read max offset ...passed 00:09:04.428 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.428 Test: blockdev writev readv 8 blocks ...passed 00:09:04.428 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.428 Test: blockdev writev readv block ...passed 00:09:04.428 Test: blockdev writev readv size > 128k ...passed 00:09:04.428 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.428 Test: blockdev comparev and writev ...passed 00:09:04.428 Test: blockdev nvme passthru rw ...passed 00:09:04.428 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.428 Test: blockdev nvme admin passthru ...passed 00:09:04.428 Test: blockdev copy ...passed 00:09:04.428 Suite: bdevio tests on: Malloc2p6 00:09:04.428 Test: blockdev write read block ...passed 00:09:04.428 Test: blockdev write zeroes read block ...passed 00:09:04.428 Test: blockdev write zeroes read no split ...passed 00:09:04.428 Test: blockdev write zeroes read split ...passed 00:09:04.428 Test: blockdev write zeroes read split partial ...passed 00:09:04.428 Test: blockdev reset ...passed 00:09:04.428 Test: blockdev write read 8 blocks ...passed 00:09:04.428 Test: blockdev write read size > 128k ...passed 00:09:04.428 Test: blockdev write read invalid size ...passed 00:09:04.428 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:04.428 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.428 Test: blockdev write read max offset ...passed 00:09:04.428 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.428 Test: blockdev writev readv 8 blocks ...passed 00:09:04.428 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.428 Test: blockdev writev readv block ...passed 00:09:04.428 Test: blockdev writev readv size > 128k ...passed 00:09:04.428 
Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.428 Test: blockdev comparev and writev ...passed 00:09:04.428 Test: blockdev nvme passthru rw ...passed 00:09:04.428 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.428 Test: blockdev nvme admin passthru ...passed 00:09:04.428 Test: blockdev copy ...passed 00:09:04.428 Suite: bdevio tests on: Malloc2p5 00:09:04.428 Test: blockdev write read block ...passed 00:09:04.428 Test: blockdev write zeroes read block ...passed 00:09:04.428 Test: blockdev write zeroes read no split ...passed 00:09:04.428 Test: blockdev write zeroes read split ...passed 00:09:04.428 Test: blockdev write zeroes read split partial ...passed 00:09:04.428 Test: blockdev reset ...passed 00:09:04.428 Test: blockdev write read 8 blocks ...passed 00:09:04.428 Test: blockdev write read size > 128k ...passed 00:09:04.428 Test: blockdev write read invalid size ...passed 00:09:04.428 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:04.428 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.428 Test: blockdev write read max offset ...passed 00:09:04.428 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.428 Test: blockdev writev readv 8 blocks ...passed 00:09:04.428 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.428 Test: blockdev writev readv block ...passed 00:09:04.428 Test: blockdev writev readv size > 128k ...passed 00:09:04.428 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.428 Test: blockdev comparev and writev ...passed 00:09:04.428 Test: blockdev nvme passthru rw ...passed 00:09:04.428 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.428 Test: blockdev nvme admin passthru ...passed 00:09:04.428 Test: blockdev copy ...passed 00:09:04.428 Suite: bdevio tests on: Malloc2p4 00:09:04.428 Test: blockdev write read block ...passed 00:09:04.428 Test: blockdev write zeroes read block 
...passed 00:09:04.428 Test: blockdev write zeroes read no split ...passed 00:09:04.428 Test: blockdev write zeroes read split ...passed 00:09:04.428 Test: blockdev write zeroes read split partial ...passed 00:09:04.428 Test: blockdev reset ...passed 00:09:04.428 Test: blockdev write read 8 blocks ...passed 00:09:04.428 Test: blockdev write read size > 128k ...passed 00:09:04.428 Test: blockdev write read invalid size ...passed 00:09:04.428 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:04.428 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.428 Test: blockdev write read max offset ...passed 00:09:04.428 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.428 Test: blockdev writev readv 8 blocks ...passed 00:09:04.428 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.428 Test: blockdev writev readv block ...passed 00:09:04.428 Test: blockdev writev readv size > 128k ...passed 00:09:04.428 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.428 Test: blockdev comparev and writev ...passed 00:09:04.428 Test: blockdev nvme passthru rw ...passed 00:09:04.428 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.428 Test: blockdev nvme admin passthru ...passed 00:09:04.428 Test: blockdev copy ...passed 00:09:04.428 Suite: bdevio tests on: Malloc2p3 00:09:04.428 Test: blockdev write read block ...passed 00:09:04.428 Test: blockdev write zeroes read block ...passed 00:09:04.428 Test: blockdev write zeroes read no split ...passed 00:09:04.428 Test: blockdev write zeroes read split ...passed 00:09:04.428 Test: blockdev write zeroes read split partial ...passed 00:09:04.428 Test: blockdev reset ...passed 00:09:04.428 Test: blockdev write read 8 blocks ...passed 00:09:04.428 Test: blockdev write read size > 128k ...passed 00:09:04.428 Test: blockdev write read invalid size ...passed 00:09:04.428 Test: blockdev write read offset + nbytes == size of 
blockdev ...passed 00:09:04.428 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.428 Test: blockdev write read max offset ...passed 00:09:04.428 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.428 Test: blockdev writev readv 8 blocks ...passed 00:09:04.428 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.428 Test: blockdev writev readv block ...passed 00:09:04.428 Test: blockdev writev readv size > 128k ...passed 00:09:04.428 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.428 Test: blockdev comparev and writev ...passed 00:09:04.428 Test: blockdev nvme passthru rw ...passed 00:09:04.428 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.428 Test: blockdev nvme admin passthru ...passed 00:09:04.428 Test: blockdev copy ...passed 00:09:04.428 Suite: bdevio tests on: Malloc2p2 00:09:04.428 Test: blockdev write read block ...passed 00:09:04.428 Test: blockdev write zeroes read block ...passed 00:09:04.428 Test: blockdev write zeroes read no split ...passed 00:09:04.428 Test: blockdev write zeroes read split ...passed 00:09:04.428 Test: blockdev write zeroes read split partial ...passed 00:09:04.428 Test: blockdev reset ...passed 00:09:04.428 Test: blockdev write read 8 blocks ...passed 00:09:04.428 Test: blockdev write read size > 128k ...passed 00:09:04.428 Test: blockdev write read invalid size ...passed 00:09:04.428 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:04.428 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.428 Test: blockdev write read max offset ...passed 00:09:04.428 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.428 Test: blockdev writev readv 8 blocks ...passed 00:09:04.428 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.429 Test: blockdev writev readv block ...passed 00:09:04.429 Test: blockdev writev readv size > 128k ...passed 
00:09:04.429 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.429 Test: blockdev comparev and writev ...passed 00:09:04.429 Test: blockdev nvme passthru rw ...passed 00:09:04.429 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.429 Test: blockdev nvme admin passthru ...passed 00:09:04.429 Test: blockdev copy ...passed 00:09:04.429 Suite: bdevio tests on: Malloc2p1 00:09:04.429 Test: blockdev write read block ...passed 00:09:04.429 Test: blockdev write zeroes read block ...passed 00:09:04.429 Test: blockdev write zeroes read no split ...passed 00:09:04.429 Test: blockdev write zeroes read split ...passed 00:09:04.429 Test: blockdev write zeroes read split partial ...passed 00:09:04.429 Test: blockdev reset ...passed 00:09:04.429 Test: blockdev write read 8 blocks ...passed 00:09:04.429 Test: blockdev write read size > 128k ...passed 00:09:04.429 Test: blockdev write read invalid size ...passed 00:09:04.429 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:04.429 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.429 Test: blockdev write read max offset ...passed 00:09:04.429 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.429 Test: blockdev writev readv 8 blocks ...passed 00:09:04.429 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.429 Test: blockdev writev readv block ...passed 00:09:04.429 Test: blockdev writev readv size > 128k ...passed 00:09:04.429 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.429 Test: blockdev comparev and writev ...passed 00:09:04.429 Test: blockdev nvme passthru rw ...passed 00:09:04.429 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.429 Test: blockdev nvme admin passthru ...passed 00:09:04.429 Test: blockdev copy ...passed 00:09:04.429 Suite: bdevio tests on: Malloc2p0 00:09:04.429 Test: blockdev write read block ...passed 00:09:04.429 Test: blockdev write 
zeroes read block ...passed 00:09:04.429 Test: blockdev write zeroes read no split ...passed 00:09:04.429 Test: blockdev write zeroes read split ...passed 00:09:04.429 Test: blockdev write zeroes read split partial ...passed 00:09:04.429 Test: blockdev reset ...passed 00:09:04.429 Test: blockdev write read 8 blocks ...passed 00:09:04.429 Test: blockdev write read size > 128k ...passed 00:09:04.429 Test: blockdev write read invalid size ...passed 00:09:04.429 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:04.429 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.429 Test: blockdev write read max offset ...passed 00:09:04.429 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.429 Test: blockdev writev readv 8 blocks ...passed 00:09:04.429 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.429 Test: blockdev writev readv block ...passed 00:09:04.429 Test: blockdev writev readv size > 128k ...passed 00:09:04.429 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.429 Test: blockdev comparev and writev ...passed 00:09:04.429 Test: blockdev nvme passthru rw ...passed 00:09:04.429 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.429 Test: blockdev nvme admin passthru ...passed 00:09:04.429 Test: blockdev copy ...passed 00:09:04.429 Suite: bdevio tests on: Malloc1p1 00:09:04.429 Test: blockdev write read block ...passed 00:09:04.429 Test: blockdev write zeroes read block ...passed 00:09:04.429 Test: blockdev write zeroes read no split ...passed 00:09:04.429 Test: blockdev write zeroes read split ...passed 00:09:04.429 Test: blockdev write zeroes read split partial ...passed 00:09:04.429 Test: blockdev reset ...passed 00:09:04.429 Test: blockdev write read 8 blocks ...passed 00:09:04.429 Test: blockdev write read size > 128k ...passed 00:09:04.429 Test: blockdev write read invalid size ...passed 00:09:04.429 Test: blockdev write read offset + 
nbytes == size of blockdev ...passed 00:09:04.429 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.429 Test: blockdev write read max offset ...passed 00:09:04.429 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.429 Test: blockdev writev readv 8 blocks ...passed 00:09:04.429 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.429 Test: blockdev writev readv block ...passed 00:09:04.429 Test: blockdev writev readv size > 128k ...passed 00:09:04.429 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.429 Test: blockdev comparev and writev ...passed 00:09:04.429 Test: blockdev nvme passthru rw ...passed 00:09:04.429 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.429 Test: blockdev nvme admin passthru ...passed 00:09:04.429 Test: blockdev copy ...passed 00:09:04.429 Suite: bdevio tests on: Malloc1p0 00:09:04.429 Test: blockdev write read block ...passed 00:09:04.429 Test: blockdev write zeroes read block ...passed 00:09:04.429 Test: blockdev write zeroes read no split ...passed 00:09:04.429 Test: blockdev write zeroes read split ...passed 00:09:04.429 Test: blockdev write zeroes read split partial ...passed 00:09:04.429 Test: blockdev reset ...passed 00:09:04.429 Test: blockdev write read 8 blocks ...passed 00:09:04.429 Test: blockdev write read size > 128k ...passed 00:09:04.429 Test: blockdev write read invalid size ...passed 00:09:04.429 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:04.429 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.429 Test: blockdev write read max offset ...passed 00:09:04.429 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.429 Test: blockdev writev readv 8 blocks ...passed 00:09:04.429 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.429 Test: blockdev writev readv block ...passed 00:09:04.429 Test: blockdev writev readv size > 
128k ...passed 00:09:04.429 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.429 Test: blockdev comparev and writev ...passed 00:09:04.429 Test: blockdev nvme passthru rw ...passed 00:09:04.429 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.429 Test: blockdev nvme admin passthru ...passed 00:09:04.429 Test: blockdev copy ...passed 00:09:04.429 Suite: bdevio tests on: Malloc0 00:09:04.429 Test: blockdev write read block ...passed 00:09:04.429 Test: blockdev write zeroes read block ...passed 00:09:04.429 Test: blockdev write zeroes read no split ...passed 00:09:04.429 Test: blockdev write zeroes read split ...passed 00:09:04.429 Test: blockdev write zeroes read split partial ...passed 00:09:04.429 Test: blockdev reset ...passed 00:09:04.429 Test: blockdev write read 8 blocks ...passed 00:09:04.429 Test: blockdev write read size > 128k ...passed 00:09:04.429 Test: blockdev write read invalid size ...passed 00:09:04.429 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:04.429 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:04.429 Test: blockdev write read max offset ...passed 00:09:04.429 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:04.429 Test: blockdev writev readv 8 blocks ...passed 00:09:04.429 Test: blockdev writev readv 30 x 1block ...passed 00:09:04.429 Test: blockdev writev readv block ...passed 00:09:04.429 Test: blockdev writev readv size > 128k ...passed 00:09:04.429 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:04.429 Test: blockdev comparev and writev ...passed 00:09:04.429 Test: blockdev nvme passthru rw ...passed 00:09:04.429 Test: blockdev nvme passthru vendor specific ...passed 00:09:04.429 Test: blockdev nvme admin passthru ...passed 00:09:04.429 Test: blockdev copy ...passed 00:09:04.429 00:09:04.429 Run Summary: Type Total Ran Passed Failed Inactive 00:09:04.429 suites 16 16 n/a 0 0 00:09:04.429 
tests 368 368 368 0 0 00:09:04.429 asserts 2224 2224 2224 0 n/a 00:09:04.429 00:09:04.429 Elapsed time = 0.663 seconds 00:09:04.429 0 00:09:04.429 22:37:49 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2678301 00:09:04.429 22:37:49 blockdev_general.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2678301 ']' 00:09:04.429 22:37:49 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2678301 00:09:04.429 22:37:49 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:09:04.429 22:37:49 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:04.429 22:37:49 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2678301 00:09:04.687 22:37:49 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:04.687 22:37:49 blockdev_general.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:04.687 22:37:49 blockdev_general.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2678301' 00:09:04.687 killing process with pid 2678301 00:09:04.687 22:37:49 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2678301 00:09:04.687 22:37:49 blockdev_general.bdev_bounds -- common/autotest_common.sh@972 -- # wait 2678301 00:09:04.947 22:37:49 blockdev_general.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:09:04.947 00:09:04.947 real 0m1.566s 00:09:04.947 user 0m3.678s 00:09:04.947 sys 0m0.486s 00:09:04.947 22:37:49 blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:04.947 22:37:49 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:09:04.947 ************************************ 00:09:04.947 END TEST bdev_bounds 00:09:04.947 ************************************ 00:09:04.947 22:37:49 blockdev_general -- common/autotest_common.sh@1142 -- # return 
0 00:09:04.947 22:37:49 blockdev_general -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:09:04.947 22:37:49 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:04.947 22:37:49 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:04.947 22:37:49 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:04.947 ************************************ 00:09:04.947 START TEST bdev_nbd 00:09:04.947 ************************************ 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=16 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- 
bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=16 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2678649 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 2678649 /var/tmp/spdk-nbd.sock 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2678649 ']' 00:09:04.947 22:37:49 
blockdev_general.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:09:04.947 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:04.947 22:37:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:09:04.947 [2024-07-15 22:37:49.855090] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:09:04.947 [2024-07-15 22:37:49.855166] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:05.207 [2024-07-15 22:37:49.987153] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:05.207 [2024-07-15 22:37:50.102573] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:05.466 [2024-07-15 22:37:50.255441] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:05.466 [2024-07-15 22:37:50.255503] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:05.466 [2024-07-15 22:37:50.255517] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:05.466 [2024-07-15 22:37:50.263446] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:05.466 [2024-07-15 22:37:50.263477] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:05.466 [2024-07-15 22:37:50.271456] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:05.466 [2024-07-15 22:37:50.271482] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:05.466 [2024-07-15 22:37:50.348843] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:05.466 [2024-07-15 22:37:50.348894] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:05.466 [2024-07-15 22:37:50.348911] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ebea40 00:09:05.466 [2024-07-15 22:37:50.348924] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:05.466 [2024-07-15 22:37:50.350375] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:05.466 [2024-07-15 22:37:50.350406] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:06.035 22:37:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:06.035 22:37:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:09:06.035 22:37:50 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:09:06.035 22:37:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:06.035 22:37:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:06.035 22:37:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:09:06.035 22:37:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:09:06.035 22:37:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:06.035 22:37:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:06.035 22:37:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:09:06.035 22:37:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:09:06.035 22:37:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:09:06.035 22:37:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:09:06.035 22:37:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:06.035 22:37:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:09:06.294 22:37:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:09:06.294 22:37:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:09:06.294 22:37:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:09:06.294 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:09:06.294 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:06.294 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:06.294 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:06.294 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 
00:09:06.294 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:06.294 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:06.294 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:06.294 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:06.294 1+0 records in 00:09:06.294 1+0 records out 00:09:06.294 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000278756 s, 14.7 MB/s 00:09:06.294 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:06.294 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:06.294 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:06.294 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:06.294 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:06.294 22:37:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:06.294 22:37:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:06.294 22:37:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:09:06.553 22:37:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:09:06.553 22:37:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:09:06.553 22:37:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:09:06.553 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd1 00:09:06.553 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:06.553 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:06.553 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:06.553 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:09:06.553 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:06.553 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:06.553 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:06.553 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:06.553 1+0 records in 00:09:06.553 1+0 records out 00:09:06.553 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00031099 s, 13.2 MB/s 00:09:06.553 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:06.553 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:06.553 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:06.553 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:06.553 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:06.553 22:37:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:06.553 22:37:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:06.553 22:37:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:09:06.812 22:37:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:09:06.812 22:37:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:09:06.812 22:37:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:09:06.812 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:09:06.812 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:06.812 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:06.812 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:06.812 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:09:06.812 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:06.812 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:06.812 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:06.812 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:06.812 1+0 records in 00:09:06.812 1+0 records out 00:09:06.812 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000292966 s, 14.0 MB/s 00:09:06.812 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:06.812 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:06.812 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:06.812 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 
00:09:06.812 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:06.812 22:37:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:06.812 22:37:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:06.812 22:37:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:09:07.071 22:37:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:09:07.071 22:37:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:09:07.071 22:37:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:09:07.071 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:09:07.071 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:07.071 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:07.071 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:07.071 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:09:07.071 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:07.071 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:07.071 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:07.071 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:07.071 1+0 records in 00:09:07.071 1+0 records out 00:09:07.071 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000289095 s, 14.2 MB/s 00:09:07.071 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:07.071 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:07.071 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:07.071 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:07.071 22:37:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:07.071 22:37:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:07.071 22:37:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:07.071 22:37:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:09:07.330 22:37:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:09:07.330 22:37:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:09:07.330 22:37:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:09:07.330 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:09:07.330 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:07.330 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:07.330 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:07.330 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:09:07.330 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:07.330 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:07.330 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 
00:09:07.330 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:07.330 1+0 records in 00:09:07.330 1+0 records out 00:09:07.330 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000353678 s, 11.6 MB/s 00:09:07.330 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:07.330 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:07.330 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:07.330 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:07.330 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:07.330 22:37:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:07.330 22:37:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:07.330 22:37:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:09:07.590 22:37:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:09:07.590 22:37:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:09:07.590 22:37:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:09:07.590 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:09:07.590 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:07.590 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:07.590 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i 
<= 20 )) 00:09:07.590 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:09:07.590 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:07.590 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:07.590 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:07.590 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:07.590 1+0 records in 00:09:07.590 1+0 records out 00:09:07.590 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000344778 s, 11.9 MB/s 00:09:07.590 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:07.590 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:07.590 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:07.590 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:07.590 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:07.590 22:37:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:07.590 22:37:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:07.590 22:37:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:09:07.850 22:37:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:09:07.850 22:37:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:09:07.850 22:37:52 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:09:07.850 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:09:07.850 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:07.850 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:07.850 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:07.850 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:09:07.850 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:07.850 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:07.850 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:07.850 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:07.850 1+0 records in 00:09:07.850 1+0 records out 00:09:07.850 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000455468 s, 9.0 MB/s 00:09:07.850 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:07.850 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:07.850 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:07.850 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:07.850 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:07.850 22:37:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:07.850 22:37:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:07.850 22:37:52 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 00:09:08.110 22:37:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:09:08.110 22:37:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:09:08.110 22:37:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:09:08.110 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:09:08.110 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:08.110 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:08.110 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:08.110 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:09:08.110 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:08.110 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:08.110 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:08.110 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:08.110 1+0 records in 00:09:08.110 1+0 records out 00:09:08.110 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000506876 s, 8.1 MB/s 00:09:08.110 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.110 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:08.110 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.110 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:08.110 22:37:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:08.110 22:37:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:08.110 22:37:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:08.110 22:37:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:09:08.369 22:37:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:09:08.369 22:37:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:09:08.369 22:37:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:09:08.369 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:09:08.369 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:08.369 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:08.369 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:08.369 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:09:08.369 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:08.369 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:08.369 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:08.369 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:08.369 1+0 records in 00:09:08.369 1+0 records out 
00:09:08.369 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000523355 s, 7.8 MB/s 00:09:08.369 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.369 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:08.369 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.369 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:08.369 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:08.369 22:37:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:08.369 22:37:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:08.369 22:37:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:09:08.629 22:37:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:09:08.629 22:37:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:09:08.629 22:37:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:09:08.629 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:09:08.629 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:08.629 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:08.629 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:08.629 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:09:08.629 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:08.629 22:37:53 
blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:08.629 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:08.629 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:08.629 1+0 records in 00:09:08.629 1+0 records out 00:09:08.629 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000480317 s, 8.5 MB/s 00:09:08.629 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.629 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:08.629 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.629 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:08.629 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:08.629 22:37:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:08.629 22:37:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:08.629 22:37:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:09:08.886 22:37:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:09:08.886 22:37:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:09:08.886 22:37:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:09:08.886 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:09:08.886 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 
00:09:08.886 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:08.886 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:08.886 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:09:08.886 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:08.886 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:08.886 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:08.886 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:09.161 1+0 records in 00:09:09.161 1+0 records out 00:09:09.161 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00046679 s, 8.8 MB/s 00:09:09.161 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:09.161 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:09.161 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:09.161 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:09.161 22:37:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:09.161 22:37:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:09.161 22:37:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:09.161 22:37:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:09:09.161 22:37:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 
-- # nbd_device=/dev/nbd11 00:09:09.161 22:37:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:09:09.420 22:37:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:09:09.420 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:09:09.420 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:09.420 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:09.420 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:09.420 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:09:09.420 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:09.420 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:09.420 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:09.420 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:09.420 1+0 records in 00:09:09.420 1+0 records out 00:09:09.420 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000657199 s, 6.2 MB/s 00:09:09.420 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:09.420 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:09.420 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:09.420 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:09.420 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:09.420 
22:37:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:09.420 22:37:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:09.420 22:37:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:09:09.680 22:37:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:09:09.680 22:37:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:09:09.680 22:37:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:09:09.680 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:09:09.680 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:09.680 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:09.680 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:09.680 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:09:09.680 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:09.680 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:09.680 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:09.680 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:09.680 1+0 records in 00:09:09.680 1+0 records out 00:09:09.680 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000492351 s, 8.3 MB/s 00:09:09.680 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:09.680 22:37:54 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@884 -- # size=4096 00:09:09.680 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:09.680 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:09.680 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:09.680 22:37:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:09.680 22:37:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:09.680 22:37:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:09:09.939 22:37:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:09:09.939 22:37:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:09:09.939 22:37:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:09:09.939 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:09:09.939 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:09.939 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:09.939 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:09.939 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:09:09.939 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:09.939 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:09.939 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:09.939 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:09.939 1+0 records in 00:09:09.939 1+0 records out 00:09:09.939 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000668959 s, 6.1 MB/s 00:09:09.939 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:09.939 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:09.939 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:09.939 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:09.939 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:09.939 22:37:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:09.939 22:37:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:09.939 22:37:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:09:10.236 22:37:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:09:10.236 22:37:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:09:10.236 22:37:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:09:10.236 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:09:10.236 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:10.236 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:10.236 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:10.236 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q 
-w nbd14 /proc/partitions 00:09:10.236 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:10.236 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:10.236 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:10.236 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:10.236 1+0 records in 00:09:10.236 1+0 records out 00:09:10.236 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000836607 s, 4.9 MB/s 00:09:10.236 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:10.236 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:10.236 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:10.236 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:10.236 22:37:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:10.236 22:37:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:10.236 22:37:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:10.237 22:37:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:09:10.497 22:37:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:09:10.497 22:37:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:09:10.497 22:37:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:09:10.497 22:37:55 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:09:10.497 22:37:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:10.497 22:37:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:10.497 22:37:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:10.497 22:37:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:09:10.497 22:37:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:10.497 22:37:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:10.497 22:37:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:10.497 22:37:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:10.497 1+0 records in 00:09:10.497 1+0 records out 00:09:10.497 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000571399 s, 7.2 MB/s 00:09:10.497 22:37:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:10.497 22:37:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:10.497 22:37:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:10.497 22:37:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:10.497 22:37:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:10.497 22:37:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:10.497 22:37:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:10.497 22:37:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:10.756 22:37:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd0", 00:09:10.756 "bdev_name": "Malloc0" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd1", 00:09:10.756 "bdev_name": "Malloc1p0" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd2", 00:09:10.756 "bdev_name": "Malloc1p1" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd3", 00:09:10.756 "bdev_name": "Malloc2p0" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd4", 00:09:10.756 "bdev_name": "Malloc2p1" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd5", 00:09:10.756 "bdev_name": "Malloc2p2" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd6", 00:09:10.756 "bdev_name": "Malloc2p3" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd7", 00:09:10.756 "bdev_name": "Malloc2p4" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd8", 00:09:10.756 "bdev_name": "Malloc2p5" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd9", 00:09:10.756 "bdev_name": "Malloc2p6" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd10", 00:09:10.756 "bdev_name": "Malloc2p7" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd11", 00:09:10.756 "bdev_name": "TestPT" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd12", 00:09:10.756 "bdev_name": "raid0" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd13", 00:09:10.756 "bdev_name": "concat0" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd14", 00:09:10.756 "bdev_name": "raid1" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd15", 00:09:10.756 "bdev_name": "AIO0" 00:09:10.756 } 00:09:10.756 ]' 00:09:10.756 22:37:55 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:09:10.756 22:37:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd0", 00:09:10.756 "bdev_name": "Malloc0" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd1", 00:09:10.756 "bdev_name": "Malloc1p0" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd2", 00:09:10.756 "bdev_name": "Malloc1p1" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd3", 00:09:10.756 "bdev_name": "Malloc2p0" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd4", 00:09:10.756 "bdev_name": "Malloc2p1" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd5", 00:09:10.756 "bdev_name": "Malloc2p2" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd6", 00:09:10.756 "bdev_name": "Malloc2p3" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd7", 00:09:10.756 "bdev_name": "Malloc2p4" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd8", 00:09:10.756 "bdev_name": "Malloc2p5" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd9", 00:09:10.756 "bdev_name": "Malloc2p6" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd10", 00:09:10.756 "bdev_name": "Malloc2p7" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd11", 00:09:10.756 "bdev_name": "TestPT" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd12", 00:09:10.756 "bdev_name": "raid0" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd13", 00:09:10.756 "bdev_name": "concat0" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd14", 00:09:10.756 "bdev_name": "raid1" 00:09:10.756 }, 00:09:10.756 { 00:09:10.756 "nbd_device": "/dev/nbd15", 00:09:10.756 "bdev_name": "AIO0" 00:09:10.756 } 00:09:10.756 ]' 00:09:10.756 22:37:55 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:09:10.756 22:37:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:09:10.756 22:37:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:10.756 22:37:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:09:10.756 22:37:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:10.757 22:37:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:10.757 22:37:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:10.757 22:37:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:11.014 22:37:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:11.014 22:37:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:11.014 22:37:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:11.014 22:37:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:11.014 22:37:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:11.014 22:37:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:11.014 22:37:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:11.014 22:37:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:11.014 22:37:55 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:11.014 22:37:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:09:11.272 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:09:11.272 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:09:11.272 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:09:11.272 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:11.272 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:11.272 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:09:11.272 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:11.272 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:11.272 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:11.272 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:09:11.530 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:09:11.530 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:09:11.530 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:09:11.530 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:11.530 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:11.530 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:09:11.530 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 
00:09:11.530 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:11.530 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:11.530 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:09:11.788 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:09:11.788 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:09:11.788 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:09:11.788 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:11.788 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:11.788 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:09:11.788 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:11.788 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:11.788 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:11.788 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:09:12.047 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:09:12.047 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:09:12.047 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:09:12.047 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:12.047 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:12.047 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w 
nbd4 /proc/partitions 00:09:12.047 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:12.047 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:12.047 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:12.047 22:37:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:09:12.306 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:09:12.306 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:09:12.306 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:09:12.306 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:12.306 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:12.306 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:09:12.306 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:12.306 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:12.306 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:12.306 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:09:12.874 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:09:12.874 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:09:12.874 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:09:12.874 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:12.874 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 
-- # (( i <= 20 )) 00:09:12.874 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:09:12.874 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:12.874 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:12.874 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:12.874 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:09:12.874 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:09:12.874 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:09:12.874 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:09:12.874 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:12.874 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:12.874 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:09:12.874 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:12.874 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:12.874 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:12.874 22:37:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:09:13.133 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:09:13.391 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:09:13.391 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:09:13.391 22:37:58 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:13.391 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:13.391 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:09:13.392 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:13.392 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:13.392 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:13.392 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:09:13.650 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:09:13.650 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:09:13.650 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:09:13.650 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:13.650 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:13.650 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:09:13.650 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:13.650 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:13.650 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:13.650 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:09:13.909 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:09:13.909 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:09:13.909 22:37:58 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:09:13.909 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:13.909 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:13.909 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:09:13.909 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:13.909 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:13.909 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:13.909 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:09:14.168 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:09:14.168 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:09:14.168 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:09:14.168 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:14.168 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:14.168 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:09:14.168 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:14.168 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:14.168 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:14.168 22:37:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:09:14.427 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 
00:09:14.427 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:09:14.427 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:09:14.427 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:14.427 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:14.427 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:09:14.427 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:14.427 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:14.427 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:14.427 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:09:14.686 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:09:14.686 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:09:14.686 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:09:14.686 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:14.686 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:14.686 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:09:14.686 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:14.686 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:14.686 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:14.686 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_stop_disk /dev/nbd14 00:09:14.945 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:09:14.945 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:09:14.945 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:09:14.945 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:14.945 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:14.945 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:09:14.945 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:14.945 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:14.945 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:14.945 22:37:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:09:15.204 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:09:15.204 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:09:15.204 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:09:15.204 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:15.204 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:15.204 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:09:15.204 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:15.204 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:15.204 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:15.204 22:38:00 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:15.204 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # 
bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@12 -- # local i 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:15.464 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:09:15.723 /dev/nbd0 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:15.982 1+0 records in 00:09:15.982 1+0 records out 00:09:15.982 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000281196 s, 14.6 MB/s 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@884 -- # size=4096 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:09:15.982 /dev/nbd1 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:09:15.982 1+0 records in 00:09:15.982 1+0 records out 00:09:15.982 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000304775 s, 13.4 MB/s 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:15.982 22:38:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:09:16.241 /dev/nbd10 00:09:16.241 22:38:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:09:16.241 22:38:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:09:16.241 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:09:16.241 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:16.241 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:16.241 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:16.241 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:09:16.241 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:16.241 22:38:01 
blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:16.241 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:16.241 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:16.241 1+0 records in 00:09:16.241 1+0 records out 00:09:16.241 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000336493 s, 12.2 MB/s 00:09:16.241 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:16.241 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:16.241 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:16.241 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:16.241 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:16.241 22:38:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:16.241 22:38:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:16.241 22:38:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:09:16.500 /dev/nbd11 00:09:16.500 22:38:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:09:16.500 22:38:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:09:16.500 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:09:16.500 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:16.500 22:38:01 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:16.500 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:16.500 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:09:16.500 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:16.500 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:16.500 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:16.500 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:16.500 1+0 records in 00:09:16.500 1+0 records out 00:09:16.500 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000318415 s, 12.9 MB/s 00:09:16.500 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:16.500 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:16.500 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:16.500 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:16.500 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:16.500 22:38:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:16.500 22:38:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:16.500 22:38:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:09:16.759 /dev/nbd12 00:09:16.759 22:38:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
basename /dev/nbd12 00:09:16.759 22:38:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:09:16.759 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:09:16.759 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:16.759 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:16.759 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:16.759 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:09:16.759 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:16.759 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:16.759 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:16.759 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:16.759 1+0 records in 00:09:16.759 1+0 records out 00:09:16.759 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000407969 s, 10.0 MB/s 00:09:16.759 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:16.759 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:16.759 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:16.759 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:16.759 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:16.759 22:38:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:16.759 22:38:01 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:16.759 22:38:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:09:17.017 /dev/nbd13 00:09:17.276 22:38:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:09:17.276 22:38:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:09:17.276 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:09:17.276 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:17.276 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:17.276 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:17.276 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:09:17.276 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:17.276 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:17.276 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:17.276 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:17.276 1+0 records in 00:09:17.276 1+0 records out 00:09:17.276 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000300955 s, 13.6 MB/s 00:09:17.276 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.276 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:17.276 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.276 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:17.276 22:38:01 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:17.276 22:38:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:17.276 22:38:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:17.276 22:38:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:09:17.276 /dev/nbd14 00:09:17.276 22:38:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:09:17.276 22:38:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:09:17.276 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:09:17.276 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:17.276 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:17.276 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:17.276 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd14 /proc/partitions 00:09:17.276 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:17.276 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:17.276 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:17.276 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:17.276 1+0 records in 00:09:17.276 1+0 records out 00:09:17.276 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000394045 s, 
10.4 MB/s 00:09:17.276 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.276 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:17.276 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.276 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:17.276 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:17.276 22:38:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:17.276 22:38:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:17.276 22:38:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:09:17.535 /dev/nbd15 00:09:17.535 22:38:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:09:17.535 22:38:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:09:17.535 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:09:17.535 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:17.535 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:17.535 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:17.535 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:09:17.535 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:17.535 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:17.535 22:38:02 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:17.535 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:17.535 1+0 records in 00:09:17.535 1+0 records out 00:09:17.535 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000549126 s, 7.5 MB/s 00:09:17.535 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.535 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:17.536 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.536 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:17.536 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:17.536 22:38:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:17.536 22:38:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:17.536 22:38:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:09:17.796 /dev/nbd2 00:09:17.796 22:38:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:09:17.796 22:38:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:09:17.796 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:09:17.796 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:17.796 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:17.796 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 
00:09:17.796 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:09:17.796 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:17.796 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:17.796 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:17.796 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:17.796 1+0 records in 00:09:17.796 1+0 records out 00:09:17.796 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000519515 s, 7.9 MB/s 00:09:17.796 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.796 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:17.796 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.796 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:17.796 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:17.796 22:38:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:17.796 22:38:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:17.796 22:38:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:09:18.054 /dev/nbd3 00:09:18.313 22:38:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:09:18.313 22:38:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:09:18.313 22:38:02 
blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:09:18.313 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:18.313 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:18.313 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:18.313 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:09:18.313 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:18.313 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:18.313 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:18.313 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:18.313 1+0 records in 00:09:18.313 1+0 records out 00:09:18.313 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000485809 s, 8.4 MB/s 00:09:18.313 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.313 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:18.313 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.313 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:18.313 22:38:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:18.313 22:38:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:18.313 22:38:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:18.313 22:38:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:09:18.572 /dev/nbd4 00:09:18.572 22:38:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:09:18.572 22:38:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:09:18.572 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:09:18.572 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:18.572 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:18.572 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:18.572 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:09:18.572 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:18.572 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:18.572 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:18.572 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:18.572 1+0 records in 00:09:18.572 1+0 records out 00:09:18.572 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000524149 s, 7.8 MB/s 00:09:18.572 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.572 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:18.572 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.572 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 
00:09:18.572 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:18.572 22:38:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:18.572 22:38:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:18.572 22:38:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:09:18.830 /dev/nbd5 00:09:18.830 22:38:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:09:18.830 22:38:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:09:18.830 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:09:18.830 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:18.830 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:18.830 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:18.830 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:09:18.830 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:18.830 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:18.830 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:18.830 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:18.830 1+0 records in 00:09:18.830 1+0 records out 00:09:18.830 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000548877 s, 7.5 MB/s 00:09:18.830 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:09:18.830 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:18.830 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.830 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:18.830 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:18.830 22:38:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:18.830 22:38:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:18.830 22:38:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:09:19.089 /dev/nbd6 00:09:19.089 22:38:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:09:19.089 22:38:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:09:19.089 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:09:19.089 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:19.089 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:19.089 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:19.089 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:09:19.089 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:19.089 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:19.089 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:19.089 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:19.089 1+0 records in 00:09:19.089 1+0 records out 00:09:19.089 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000708558 s, 5.8 MB/s 00:09:19.089 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:19.089 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:19.089 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:19.089 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:19.089 22:38:03 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:19.089 22:38:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:19.089 22:38:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:19.089 22:38:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:09:19.349 /dev/nbd7 00:09:19.349 22:38:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:09:19.349 22:38:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:09:19.349 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:09:19.349 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:19.349 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:19.349 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:19.349 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:09:19.349 22:38:04 
blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:19.349 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:19.349 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:19.349 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:19.349 1+0 records in 00:09:19.349 1+0 records out 00:09:19.349 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000579642 s, 7.1 MB/s 00:09:19.349 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:19.349 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:19.349 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:19.349 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:19.349 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:19.349 22:38:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:19.349 22:38:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:19.349 22:38:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:09:19.609 /dev/nbd8 00:09:19.609 22:38:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:09:19.609 22:38:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:09:19.609 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:09:19.609 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # 
local i 00:09:19.609 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:19.609 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:19.609 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:09:19.609 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:19.609 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:19.609 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:19.609 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:19.609 1+0 records in 00:09:19.609 1+0 records out 00:09:19.609 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000793728 s, 5.2 MB/s 00:09:19.609 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:19.609 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:19.609 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:19.609 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:19.609 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:19.609 22:38:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:19.609 22:38:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:19.609 22:38:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:09:19.868 /dev/nbd9 00:09:19.868 22:38:04 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:09:19.868 22:38:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:09:19.868 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:09:19.868 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:19.868 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:19.868 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:19.868 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:09:19.868 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:19.868 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:19.868 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:19.868 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:19.868 1+0 records in 00:09:19.868 1+0 records out 00:09:19.868 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000607067 s, 6.7 MB/s 00:09:19.868 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:19.868 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:19.868 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:19.868 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:19.868 22:38:04 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:19.868 22:38:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 
-- # (( i++ )) 00:09:19.868 22:38:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:19.868 22:38:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:19.868 22:38:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:19.868 22:38:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:20.127 22:38:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:09:20.127 { 00:09:20.127 "nbd_device": "/dev/nbd0", 00:09:20.127 "bdev_name": "Malloc0" 00:09:20.127 }, 00:09:20.127 { 00:09:20.127 "nbd_device": "/dev/nbd1", 00:09:20.127 "bdev_name": "Malloc1p0" 00:09:20.127 }, 00:09:20.127 { 00:09:20.127 "nbd_device": "/dev/nbd10", 00:09:20.127 "bdev_name": "Malloc1p1" 00:09:20.127 }, 00:09:20.127 { 00:09:20.127 "nbd_device": "/dev/nbd11", 00:09:20.127 "bdev_name": "Malloc2p0" 00:09:20.127 }, 00:09:20.127 { 00:09:20.127 "nbd_device": "/dev/nbd12", 00:09:20.127 "bdev_name": "Malloc2p1" 00:09:20.127 }, 00:09:20.127 { 00:09:20.127 "nbd_device": "/dev/nbd13", 00:09:20.127 "bdev_name": "Malloc2p2" 00:09:20.127 }, 00:09:20.127 { 00:09:20.127 "nbd_device": "/dev/nbd14", 00:09:20.127 "bdev_name": "Malloc2p3" 00:09:20.127 }, 00:09:20.127 { 00:09:20.127 "nbd_device": "/dev/nbd15", 00:09:20.127 "bdev_name": "Malloc2p4" 00:09:20.127 }, 00:09:20.127 { 00:09:20.127 "nbd_device": "/dev/nbd2", 00:09:20.127 "bdev_name": "Malloc2p5" 00:09:20.127 }, 00:09:20.127 { 00:09:20.127 "nbd_device": "/dev/nbd3", 00:09:20.127 "bdev_name": "Malloc2p6" 00:09:20.127 }, 00:09:20.127 { 00:09:20.127 "nbd_device": "/dev/nbd4", 00:09:20.127 "bdev_name": "Malloc2p7" 00:09:20.127 }, 00:09:20.127 { 00:09:20.127 "nbd_device": "/dev/nbd5", 00:09:20.127 "bdev_name": "TestPT" 00:09:20.127 }, 00:09:20.127 { 00:09:20.127 "nbd_device": "/dev/nbd6", 00:09:20.127 
"bdev_name": "raid0" 00:09:20.127 }, 00:09:20.127 { 00:09:20.127 "nbd_device": "/dev/nbd7", 00:09:20.127 "bdev_name": "concat0" 00:09:20.127 }, 00:09:20.127 { 00:09:20.127 "nbd_device": "/dev/nbd8", 00:09:20.127 "bdev_name": "raid1" 00:09:20.128 }, 00:09:20.128 { 00:09:20.128 "nbd_device": "/dev/nbd9", 00:09:20.128 "bdev_name": "AIO0" 00:09:20.128 } 00:09:20.128 ]' 00:09:20.128 22:38:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:09:20.128 { 00:09:20.128 "nbd_device": "/dev/nbd0", 00:09:20.128 "bdev_name": "Malloc0" 00:09:20.128 }, 00:09:20.128 { 00:09:20.128 "nbd_device": "/dev/nbd1", 00:09:20.128 "bdev_name": "Malloc1p0" 00:09:20.128 }, 00:09:20.128 { 00:09:20.128 "nbd_device": "/dev/nbd10", 00:09:20.128 "bdev_name": "Malloc1p1" 00:09:20.128 }, 00:09:20.128 { 00:09:20.128 "nbd_device": "/dev/nbd11", 00:09:20.128 "bdev_name": "Malloc2p0" 00:09:20.128 }, 00:09:20.128 { 00:09:20.128 "nbd_device": "/dev/nbd12", 00:09:20.128 "bdev_name": "Malloc2p1" 00:09:20.128 }, 00:09:20.128 { 00:09:20.128 "nbd_device": "/dev/nbd13", 00:09:20.128 "bdev_name": "Malloc2p2" 00:09:20.128 }, 00:09:20.128 { 00:09:20.128 "nbd_device": "/dev/nbd14", 00:09:20.128 "bdev_name": "Malloc2p3" 00:09:20.128 }, 00:09:20.128 { 00:09:20.128 "nbd_device": "/dev/nbd15", 00:09:20.128 "bdev_name": "Malloc2p4" 00:09:20.128 }, 00:09:20.128 { 00:09:20.128 "nbd_device": "/dev/nbd2", 00:09:20.128 "bdev_name": "Malloc2p5" 00:09:20.128 }, 00:09:20.128 { 00:09:20.128 "nbd_device": "/dev/nbd3", 00:09:20.128 "bdev_name": "Malloc2p6" 00:09:20.128 }, 00:09:20.128 { 00:09:20.128 "nbd_device": "/dev/nbd4", 00:09:20.128 "bdev_name": "Malloc2p7" 00:09:20.128 }, 00:09:20.128 { 00:09:20.128 "nbd_device": "/dev/nbd5", 00:09:20.128 "bdev_name": "TestPT" 00:09:20.128 }, 00:09:20.128 { 00:09:20.128 "nbd_device": "/dev/nbd6", 00:09:20.128 "bdev_name": "raid0" 00:09:20.128 }, 00:09:20.128 { 00:09:20.128 "nbd_device": "/dev/nbd7", 00:09:20.128 "bdev_name": "concat0" 00:09:20.128 }, 00:09:20.128 { 
00:09:20.128 "nbd_device": "/dev/nbd8", 00:09:20.128 "bdev_name": "raid1" 00:09:20.128 }, 00:09:20.128 { 00:09:20.128 "nbd_device": "/dev/nbd9", 00:09:20.128 "bdev_name": "AIO0" 00:09:20.128 } 00:09:20.128 ]' 00:09:20.128 22:38:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:20.128 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:09:20.128 /dev/nbd1 00:09:20.128 /dev/nbd10 00:09:20.128 /dev/nbd11 00:09:20.128 /dev/nbd12 00:09:20.128 /dev/nbd13 00:09:20.128 /dev/nbd14 00:09:20.128 /dev/nbd15 00:09:20.128 /dev/nbd2 00:09:20.128 /dev/nbd3 00:09:20.128 /dev/nbd4 00:09:20.128 /dev/nbd5 00:09:20.128 /dev/nbd6 00:09:20.128 /dev/nbd7 00:09:20.128 /dev/nbd8 00:09:20.128 /dev/nbd9' 00:09:20.128 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:09:20.128 /dev/nbd1 00:09:20.128 /dev/nbd10 00:09:20.128 /dev/nbd11 00:09:20.128 /dev/nbd12 00:09:20.128 /dev/nbd13 00:09:20.128 /dev/nbd14 00:09:20.128 /dev/nbd15 00:09:20.128 /dev/nbd2 00:09:20.128 /dev/nbd3 00:09:20.128 /dev/nbd4 00:09:20.128 /dev/nbd5 00:09:20.128 /dev/nbd6 00:09:20.128 /dev/nbd7 00:09:20.128 /dev/nbd8 00:09:20.128 /dev/nbd9' 00:09:20.128 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:20.128 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:09:20.128 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:09:20.128 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:09:20.128 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:09:20.128 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:09:20.128 22:38:05 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:20.387 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:20.387 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:09:20.387 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:20.387 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:09:20.387 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:09:20.387 256+0 records in 00:09:20.387 256+0 records out 00:09:20.387 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.010277 s, 102 MB/s 00:09:20.387 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:20.387 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:09:20.387 256+0 records in 00:09:20.387 256+0 records out 00:09:20.387 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.181886 s, 5.8 MB/s 00:09:20.387 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:20.387 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:09:20.646 256+0 records in 00:09:20.646 256+0 records out 00:09:20.647 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.18438 s, 5.7 MB/s 00:09:20.647 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 
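The write half of `nbd_dd_data_verify` traced here generates one 1 MiB random pattern file and then copies it to every attached nbd device with `oflag=direct`. A simplified sketch, with the targets parameterized as plain file paths so it can run without nbd devices (`oflag=direct` is omitted for that reason; the real test uses it to bypass the page cache):

```shell
# Simplified sketch of the write phase: one random 1 MiB pattern
# (256 x 4096-byte blocks), copied to each target in the list.
nbd_write_pattern() {
    local tmp_file=$1; shift
    local dev
    # generate the reference pattern once
    dd if=/dev/urandom of="$tmp_file" bs=4096 count=256 2>/dev/null
    # replay it onto every target
    for dev in "$@"; do
        dd if="$tmp_file" of="$dev" bs=4096 count=256 2>/dev/null
    done
}
```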
00:09:20.647 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:09:20.905 256+0 records in 00:09:20.905 256+0 records out 00:09:20.905 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184204 s, 5.7 MB/s 00:09:20.905 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:20.905 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:09:20.905 256+0 records in 00:09:20.905 256+0 records out 00:09:20.905 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184267 s, 5.7 MB/s 00:09:20.905 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:20.905 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:09:21.164 256+0 records in 00:09:21.164 256+0 records out 00:09:21.164 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184063 s, 5.7 MB/s 00:09:21.164 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:21.164 22:38:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:09:21.421 256+0 records in 00:09:21.421 256+0 records out 00:09:21.421 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.18398 s, 5.7 MB/s 00:09:21.421 22:38:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:21.421 22:38:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:09:21.679 256+0 records in 00:09:21.679 256+0 
records out 00:09:21.679 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184078 s, 5.7 MB/s 00:09:21.679 22:38:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:21.679 22:38:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:09:21.679 256+0 records in 00:09:21.679 256+0 records out 00:09:21.679 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.18414 s, 5.7 MB/s 00:09:21.679 22:38:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:21.679 22:38:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:09:21.938 256+0 records in 00:09:21.938 256+0 records out 00:09:21.938 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183822 s, 5.7 MB/s 00:09:21.938 22:38:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:21.938 22:38:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:09:22.197 256+0 records in 00:09:22.197 256+0 records out 00:09:22.197 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.18378 s, 5.7 MB/s 00:09:22.197 22:38:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:22.197 22:38:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:09:22.455 256+0 records in 00:09:22.455 256+0 records out 00:09:22.455 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183699 s, 5.7 MB/s 00:09:22.455 22:38:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:22.455 22:38:07 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:09:22.455 256+0 records in 00:09:22.455 256+0 records out 00:09:22.455 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184134 s, 5.7 MB/s 00:09:22.455 22:38:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:22.455 22:38:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:09:22.713 256+0 records in 00:09:22.713 256+0 records out 00:09:22.713 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.185086 s, 5.7 MB/s 00:09:22.713 22:38:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:22.713 22:38:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:09:22.971 256+0 records in 00:09:22.971 256+0 records out 00:09:22.971 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.185419 s, 5.7 MB/s 00:09:22.971 22:38:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:22.971 22:38:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:09:23.229 256+0 records in 00:09:23.229 256+0 records out 00:09:23.229 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.188082 s, 5.6 MB/s 00:09:23.229 22:38:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:23.229 22:38:07 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:09:23.229 256+0 records in 00:09:23.229 256+0 records out 00:09:23.229 1048576 bytes (1.0 MB, 1.0 
MiB) copied, 0.182532 s, 5.7 MB/s 00:09:23.229 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:09:23.229 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:23.229 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:23.229 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:09:23.229 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:23.229 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:09:23.229 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:09:23.229 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:23.229 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:09:23.229 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:23.229 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:09:23.229 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:23.229 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:09:23.229 
22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:23.229 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:09:23.229 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:23.229 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:09:23.229 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:23.229 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:09:23.229 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:23.229 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:09:23.487 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:23.487 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:09:23.487 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:23.487 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:09:23.487 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:23.487 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:09:23.487 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for 
i in "${nbd_list[@]}" 00:09:23.487 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:09:23.487 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:23.487 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:09:23.487 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:23.487 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:09:23.488 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:23.488 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:09:23.488 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:23.488 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:09:23.488 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:23.488 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:09:23.488 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:23.488 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 
/dev/nbd9' 00:09:23.488 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:23.488 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:23.488 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:23.488 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:23.488 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:23.488 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:23.746 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:23.746 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:23.746 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:23.746 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:23.746 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:23.746 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:23.746 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:23.746 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:23.746 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:23.746 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:09:24.003 22:38:08 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:09:24.004 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:09:24.004 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:09:24.004 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:24.004 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:24.004 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:09:24.004 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:24.004 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:24.004 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:24.004 22:38:08 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:09:24.261 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:09:24.261 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:09:24.261 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:09:24.261 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:24.261 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:24.261 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:09:24.261 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:24.261 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:24.262 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:24.262 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:09:24.549 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:09:24.549 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:09:24.549 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:09:24.549 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:24.549 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:24.549 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:09:24.549 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:24.549 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:24.549 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:24.549 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:09:24.808 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:09:24.808 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:09:24.808 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:09:24.808 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:24.808 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:24.808 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:09:24.808 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:24.808 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:24.808 22:38:09 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:24.808 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:09:25.067 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:09:25.067 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:09:25.067 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:09:25.067 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:25.067 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:25.067 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:09:25.067 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:25.067 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:25.067 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:25.067 22:38:09 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:09:25.326 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:09:25.326 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:09:25.326 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:09:25.326 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:25.326 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:25.326 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:09:25.326 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:25.326 22:38:10 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:25.326 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:25.326 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:09:25.584 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:09:25.584 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:09:25.584 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:09:25.584 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:25.584 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:25.584 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:09:25.584 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:25.584 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:25.584 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:25.584 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:09:25.866 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:09:25.866 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:09:25.866 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:09:25.866 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:25.866 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:25.866 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 
/proc/partitions 00:09:25.866 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:25.866 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:25.866 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:25.866 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:09:26.125 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:09:26.125 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:09:26.125 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:09:26.125 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:26.125 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:26.125 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:09:26.125 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:26.125 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:26.125 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:26.125 22:38:10 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:09:26.384 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:09:26.384 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:09:26.384 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:09:26.384 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:26.384 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # 
(( i <= 20 )) 00:09:26.384 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:09:26.384 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:26.384 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:26.384 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:26.384 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:09:26.643 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:09:26.643 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:09:26.643 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:09:26.643 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:26.643 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:26.643 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:09:26.643 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:26.643 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:26.643 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:26.643 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:09:26.900 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:09:26.900 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:09:26.900 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:09:26.900 22:38:11 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:26.900 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:26.900 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:09:26.900 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:26.900 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:26.900 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:26.900 22:38:11 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:09:27.158 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:09:27.158 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:09:27.158 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:09:27.158 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:27.158 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:27.158 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:09:27.158 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:27.158 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:27.158 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:27.158 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:09:27.724 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:09:27.724 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:09:27.724 22:38:12 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:09:27.724 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:27.724 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:27.724 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:09:27.724 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:27.724 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:27.724 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:27.724 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:09:27.724 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:09:27.724 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:09:27.724 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:09:27.724 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:27.724 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:27.724 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:09:27.724 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:27.724 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:27.724 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:27.724 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:27.724 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:09:27.981 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:27.981 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:27.981 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:28.239 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:28.239 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:28.239 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:09:28.239 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:09:28.239 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:09:28.239 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:09:28.239 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:09:28.239 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:09:28.239 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:09:28.239 22:38:12 blockdev_general.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:28.239 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:28.239 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:28.239 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:09:28.239 22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:09:28.239 
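The `nbd_get_count` trace above (nbd_common.sh@61-66) pipes the `nbd_get_disks` RPC output through `jq` and `grep -c` to count exported `/dev/nbd*` devices; the `-- # true` line in the log is the fallback that keeps the pipeline alive when `grep -c` finds nothing. A minimal sketch of that counting step, applied to canned JSON instead of a live RPC socket (assumes `jq` is installed; the two-device payload is illustrative, the run above actually returned `[]` and counted 0):

```shell
# Count /dev/nbd* entries in an nbd_get_disks-style JSON payload.
# grep -c prints 0 but exits non-zero on no match, hence the '|| true'
# fallback seen in the trace.
count_nbd() {
    local json=$1
    echo "$json" | jq -r '.[] | .nbd_device' | grep -c /dev/nbd || true
}

count_nbd '[]'                                                       # prints 0
count_nbd '[{"nbd_device":"/dev/nbd0"},{"nbd_device":"/dev/nbd1"}]'  # prints 2
```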
22:38:12 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:09:28.497 malloc_lvol_verify 00:09:28.497 22:38:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:09:28.755 744fd138-0451-4742-8040-6e96aa582644 00:09:28.755 22:38:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:09:28.755 c77a3c9b-9f65-457f-87b0-c1856d049b33 00:09:29.013 22:38:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:09:29.013 /dev/nbd0 00:09:29.272 22:38:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:09:29.272 mke2fs 1.46.5 (30-Dec-2021) 00:09:29.272 Discarding device blocks: 0/4096 done 00:09:29.272 Creating filesystem with 4096 1k blocks and 1024 inodes 00:09:29.272 00:09:29.272 Allocating group tables: 0/1 done 00:09:29.272 Writing inode tables: 0/1 done 00:09:29.272 Creating journal (1024 blocks): done 00:09:29.272 Writing superblocks and filesystem accounting information: 0/1 done 00:09:29.272 00:09:29.272 22:38:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:09:29.272 22:38:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:09:29.272 22:38:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:29.272 22:38:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:09:29.272 22:38:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # 
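The `nbd_with_lvol_verify` sequence traced above (nbd_common.sh@131-143) condenses to the RPC calls below: create a small malloc bdev, layer a logical-volume store and a 4 MiB lvol on it, export the lvol over NBD, and `mkfs.ext4` it to verify the data path. This is a non-runnable sketch of the traced commands, not a standalone script — it requires a running SPDK target listening on the traced socket, and the `/dev/nbd0` node is illustrative:

```shell
rpc="scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
$rpc bdev_malloc_create -b malloc_lvol_verify 16 512   # 16 MiB bdev, 512 B blocks
$rpc bdev_lvol_create_lvstore malloc_lvol_verify lvs   # lvstore on top of it
$rpc bdev_lvol_create lvol 4 -l lvs                    # 4 MiB lvol inside lvs
$rpc nbd_start_disk lvs/lvol /dev/nbd0                 # export lvol as /dev/nbd0
mkfs.ext4 /dev/nbd0                                    # the mke2fs output above
$rpc nbd_stop_disk /dev/nbd0
```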
local nbd_list 00:09:29.272 22:38:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:29.272 22:38:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:29.272 22:38:13 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:29.530 22:38:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:29.530 22:38:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:29.530 22:38:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:29.530 22:38:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:29.530 22:38:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:29.530 22:38:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:29.530 22:38:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:29.530 22:38:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:29.530 22:38:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:09:29.530 22:38:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:09:29.530 22:38:14 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2678649 00:09:29.530 22:38:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2678649 ']' 00:09:29.530 22:38:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2678649 00:09:29.530 22:38:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:09:29.530 22:38:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:29.530 22:38:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2678649 00:09:29.530 22:38:14 
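Every `nbd_stop_disk` in the trace is followed by the same `waitfornbd_exit` xtrace (nbd_common.sh@35-45): poll `/proc/partitions` up to 20 times until the device name disappears, then `break` and `return 0`. A reconstruction of that helper, with the loop logic inferred from the trace (the 1-second sleep and the `PARTITIONS_FILE` override for testability are assumptions, not shown in the log):

```shell
# Wait for an nbd device to vanish from the partitions table after
# nbd_stop_disk, polling up to 20 times as in the traced helper.
waitfornbd_exit() {
    local nbd_name=$1
    local partitions=${PARTITIONS_FILE:-/proc/partitions}
    local i
    for ((i = 1; i <= 20; i++)); do
        # Device is gone once its name no longer appears as a whole word
        if ! grep -q -w "$nbd_name" "$partitions"; then
            break
        fi
        sleep 1
    done
    return 0
}
```

Note the helper returns 0 even if the device lingers past 20 polls; the caller (`nbd_stop_disks`) relies on later checks rather than this loop's exit status.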
blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:29.530 22:38:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:29.530 22:38:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2678649' 00:09:29.530 killing process with pid 2678649 00:09:29.530 22:38:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2678649 00:09:29.530 22:38:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@972 -- # wait 2678649 00:09:29.790 22:38:14 blockdev_general.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:09:29.790 00:09:29.790 real 0m24.837s 00:09:29.790 user 0m30.448s 00:09:29.790 sys 0m14.692s 00:09:29.790 22:38:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:29.790 22:38:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:09:29.790 ************************************ 00:09:29.790 END TEST bdev_nbd 00:09:29.790 ************************************ 00:09:29.790 22:38:14 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:29.790 22:38:14 blockdev_general -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:09:29.790 22:38:14 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = nvme ']' 00:09:29.790 22:38:14 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = gpt ']' 00:09:29.790 22:38:14 blockdev_general -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:09:29.790 22:38:14 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:29.790 22:38:14 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:29.790 22:38:14 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:30.049 ************************************ 00:09:30.049 START TEST bdev_fio 00:09:30.049 ************************************ 00:09:30.049 22:38:14 blockdev_general.bdev_fio -- 
common/autotest_common.sh@1123 -- # fio_test_suite '' 00:09:30.049 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:09:30.049 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:09:30.049 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:30.049 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:09:30.049 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:09:30.049 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:09:30.049 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:09:30.049 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:09:30.049 22:38:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:30.049 22:38:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:09:30.049 22:38:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:09:30.049 22:38:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:09:30.049 22:38:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:09:30.050 22:38:14 
blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc0]' 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc0 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p0]' 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p0 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p1]' 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p1 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo 
'[job_Malloc2p0]' 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p0 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p1]' 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p1 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p2]' 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p2 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p3]' 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p3 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p4]' 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p4 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p5]' 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p5 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p6]' 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p6 00:09:30.050 22:38:14 
blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p7]' 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p7 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_TestPT]' 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=TestPT 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid0]' 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid0 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_concat0]' 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=concat0 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid1]' 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid1 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_AIO0]' 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=AIO0 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:30.050 22:38:14 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:30.050 ************************************ 00:09:30.050 START TEST bdev_fio_rw_verify 00:09:30.050 ************************************ 00:09:30.050 22:38:14 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:30.050 22:38:14 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:30.050 22:38:14 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # 
local fio_dir=/usr/src/fio 00:09:30.050 22:38:14 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:09:30.050 22:38:14 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:09:30.050 22:38:14 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:30.050 22:38:14 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:09:30.050 22:38:14 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:09:30.050 22:38:14 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:30.050 22:38:14 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:30.050 22:38:14 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:09:30.050 22:38:14 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:30.050 22:38:14 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:30.050 22:38:14 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:30.050 22:38:14 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:30.050 22:38:14 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:30.050 22:38:14 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:09:30.050 22:38:14 
blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:30.050 22:38:14 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:30.050 22:38:14 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:30.050 22:38:14 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:09:30.050 22:38:14 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:30.616 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:30.616 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:30.616 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:30.616 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:30.616 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:30.616 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:30.616 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:30.616 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, 
ioengine=spdk_bdev, iodepth=8 00:09:30.616 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:30.616 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:30.616 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:30.616 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:30.616 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:30.616 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:30.616 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:30.616 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:30.616 fio-3.35 00:09:30.616 Starting 16 threads 00:09:42.825 00:09:42.825 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=2682664: Mon Jul 15 22:38:26 2024 00:09:42.825 read: IOPS=80.5k, BW=315MiB/s (330MB/s)(3145MiB/10001msec) 00:09:42.825 slat (usec): min=2, max=56820, avg=39.76, stdev=65.66 00:09:42.825 clat (usec): min=11, max=57219, avg=323.42, stdev=221.32 00:09:42.825 lat (usec): min=24, max=57268, avg=363.18, stdev=237.70 00:09:42.825 clat percentiles (usec): 00:09:42.825 | 50.000th=[ 310], 99.000th=[ 693], 99.900th=[ 807], 99.990th=[ 1139], 00:09:42.825 | 99.999th=[ 2769] 00:09:42.825 write: IOPS=126k, BW=491MiB/s (515MB/s)(4837MiB/9849msec); 0 zone resets 00:09:42.825 slat (usec): min=7, max=790, avg=55.08, stdev=20.36 00:09:42.825 clat (usec): min=11, max=3790, avg=385.30, stdev=191.83 00:09:42.825 lat (usec): min=37, max=3825, avg=440.38, stdev=204.35 00:09:42.825 clat percentiles (usec): 
00:09:42.825 | 50.000th=[ 359], 99.000th=[ 922], 99.900th=[ 1287], 99.990th=[ 1418], 00:09:42.825 | 99.999th=[ 1549] 00:09:42.825 bw ( KiB/s): min=375128, max=724915, per=98.75%, avg=496594.26, stdev=5923.51, samples=304 00:09:42.825 iops : min=93782, max=181224, avg=124148.32, stdev=1480.84, samples=304 00:09:42.825 lat (usec) : 20=0.01%, 50=0.43%, 100=3.81%, 250=26.80%, 500=46.86% 00:09:42.825 lat (usec) : 750=19.66%, 1000=2.11% 00:09:42.825 lat (msec) : 2=0.32%, 4=0.01%, 100=0.01% 00:09:42.825 cpu : usr=99.15%, sys=0.37%, ctx=665, majf=0, minf=3419 00:09:42.825 IO depths : 1=12.4%, 2=24.8%, 4=50.2%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:42.825 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:42.825 complete : 0=0.0%, 4=89.0%, 8=11.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:42.825 issued rwts: total=805236,1238229,0,0 short=0,0,0,0 dropped=0,0,0,0 00:09:42.825 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:42.825 00:09:42.825 Run status group 0 (all jobs): 00:09:42.825 READ: bw=315MiB/s (330MB/s), 315MiB/s-315MiB/s (330MB/s-330MB/s), io=3145MiB (3298MB), run=10001-10001msec 00:09:42.825 WRITE: bw=491MiB/s (515MB/s), 491MiB/s-491MiB/s (515MB/s-515MB/s), io=4837MiB (5072MB), run=9849-9849msec 00:09:42.825 00:09:42.825 real 0m11.722s 00:09:42.825 user 2m45.065s 00:09:42.825 sys 0m1.615s 00:09:42.825 22:38:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:42.825 22:38:26 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:09:42.825 ************************************ 00:09:42.825 END TEST bdev_fio_rw_verify 00:09:42.825 ************************************ 00:09:42.825 22:38:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:09:42.825 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:09:42.825 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:42.825 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:09:42.825 22:38:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:42.825 22:38:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:09:42.825 22:38:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:09:42.825 22:38:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:09:42.825 22:38:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:09:42.825 22:38:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:09:42.825 22:38:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:09:42.825 22:38:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:09:42.825 22:38:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:42.825 22:38:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:09:42.825 22:38:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:09:42.825 22:38:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:09:42.825 22:38:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:09:42.825 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:42.826 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # 
printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "543f3082-5cd5-4b98-8b6c-5d095d9e3758"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "543f3082-5cd5-4b98-8b6c-5d095d9e3758",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "70764dc3-7b4f-5bb1-89ea-a634ff063d61"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "70764dc3-7b4f-5bb1-89ea-a634ff063d61",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' 
' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "6927536a-d231-5a65-92a8-5204d93cabcb"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "6927536a-d231-5a65-92a8-5204d93cabcb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "7e1fe307-5b41-5406-b965-29cfe4d48a2b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "7e1fe307-5b41-5406-b965-29cfe4d48a2b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' 
}' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "b4fd3a36-da61-57e1-b2a8-1243a4218e3b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b4fd3a36-da61-57e1-b2a8-1243a4218e3b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "afabbe89-bc87-59ab-8201-ef979863ab26"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "afabbe89-bc87-59ab-8201-ef979863ab26",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' 
"aliases": [' ' "ea0c27f7-c2be-5d23-8ad3-87ec4ccf3f53"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ea0c27f7-c2be-5d23-8ad3-87ec4ccf3f53",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "351128e7-b0e4-59a2-89b5-ea7cd0d20cfe"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "351128e7-b0e4-59a2-89b5-ea7cd0d20cfe",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' 
"fb9edc61-11c1-5e8d-901e-379d01920b03"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "fb9edc61-11c1-5e8d-901e-379d01920b03",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "a107930e-ef5b-5861-b75a-0fe27d1226c8"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a107930e-ef5b-5861-b75a-0fe27d1226c8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "85ebfcb3-e4d9-594d-929f-da5fa32367f8"' ' ],' ' 
"product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "85ebfcb3-e4d9-594d-929f-da5fa32367f8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "09d72fc6-6518-5e3b-bd48-b72dc8ade0e3"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "09d72fc6-6518-5e3b-bd48-b72dc8ade0e3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' 
"base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "880aa874-df25-4892-98d1-76ab48417e59"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "880aa874-df25-4892-98d1-76ab48417e59",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "880aa874-df25-4892-98d1-76ab48417e59",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "4b331d25-7e68-4fcf-bdc4-16f072e5ef88",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "d5affa4b-0bdf-46bb-b830-9bf57f54a9af",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "95b0c80b-1380-490d-a65a-d7a8f4841174"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 
512,' ' "num_blocks": 131072,' ' "uuid": "95b0c80b-1380-490d-a65a-d7a8f4841174",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "95b0c80b-1380-490d-a65a-d7a8f4841174",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "e536e026-72f2-4005-8950-c4a1a3ae840f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "6bad086f-72e1-4a3d-9e0a-986c6b8670c1",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "205d68f5-e544-49ed-8771-fdc304e600ba"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "205d68f5-e544-49ed-8771-fdc304e600ba",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "205d68f5-e544-49ed-8771-fdc304e600ba",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "172ff4d5-43ab-4167-be90-5f71c758713b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "b7cdca5c-c5e6-4077-9283-a24ff05f66e5",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "1e2758cf-cb5c-4c54-98f4-2720fcb49b24"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "1e2758cf-cb5c-4c54-98f4-2720fcb49b24",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": 
true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:42.826 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n Malloc0 00:09:42.826 Malloc1p0 00:09:42.826 Malloc1p1 00:09:42.826 Malloc2p0 00:09:42.826 Malloc2p1 00:09:42.826 Malloc2p2 00:09:42.826 Malloc2p3 00:09:42.826 Malloc2p4 00:09:42.826 Malloc2p5 00:09:42.826 Malloc2p6 00:09:42.826 Malloc2p7 00:09:42.826 TestPT 00:09:42.826 raid0 00:09:42.826 concat0 ]] 00:09:42.826 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "543f3082-5cd5-4b98-8b6c-5d095d9e3758"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "543f3082-5cd5-4b98-8b6c-5d095d9e3758",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": 
false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "70764dc3-7b4f-5bb1-89ea-a634ff063d61"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "70764dc3-7b4f-5bb1-89ea-a634ff063d61",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "6927536a-d231-5a65-92a8-5204d93cabcb"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "6927536a-d231-5a65-92a8-5204d93cabcb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "7e1fe307-5b41-5406-b965-29cfe4d48a2b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "7e1fe307-5b41-5406-b965-29cfe4d48a2b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "b4fd3a36-da61-57e1-b2a8-1243a4218e3b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b4fd3a36-da61-57e1-b2a8-1243a4218e3b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' 
"seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "afabbe89-bc87-59ab-8201-ef979863ab26"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "afabbe89-bc87-59ab-8201-ef979863ab26",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "ea0c27f7-c2be-5d23-8ad3-87ec4ccf3f53"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ea0c27f7-c2be-5d23-8ad3-87ec4ccf3f53",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' 
"copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "351128e7-b0e4-59a2-89b5-ea7cd0d20cfe"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "351128e7-b0e4-59a2-89b5-ea7cd0d20cfe",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "fb9edc61-11c1-5e8d-901e-379d01920b03"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "fb9edc61-11c1-5e8d-901e-379d01920b03",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' 
"driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "a107930e-ef5b-5861-b75a-0fe27d1226c8"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "a107930e-ef5b-5861-b75a-0fe27d1226c8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "85ebfcb3-e4d9-594d-929f-da5fa32367f8"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "85ebfcb3-e4d9-594d-929f-da5fa32367f8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' 
"base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "09d72fc6-6518-5e3b-bd48-b72dc8ade0e3"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "09d72fc6-6518-5e3b-bd48-b72dc8ade0e3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "880aa874-df25-4892-98d1-76ab48417e59"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "880aa874-df25-4892-98d1-76ab48417e59",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' 
"seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "880aa874-df25-4892-98d1-76ab48417e59",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "4b331d25-7e68-4fcf-bdc4-16f072e5ef88",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "d5affa4b-0bdf-46bb-b830-9bf57f54a9af",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "95b0c80b-1380-490d-a65a-d7a8f4841174"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "95b0c80b-1380-490d-a65a-d7a8f4841174",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' 
' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "95b0c80b-1380-490d-a65a-d7a8f4841174",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "e536e026-72f2-4005-8950-c4a1a3ae840f",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "6bad086f-72e1-4a3d-9e0a-986c6b8670c1",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "205d68f5-e544-49ed-8771-fdc304e600ba"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "205d68f5-e544-49ed-8771-fdc304e600ba",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "205d68f5-e544-49ed-8771-fdc304e600ba",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "172ff4d5-43ab-4167-be90-5f71c758713b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "b7cdca5c-c5e6-4077-9283-a24ff05f66e5",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "1e2758cf-cb5c-4c54-98f4-2720fcb49b24"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "1e2758cf-cb5c-4c54-98f4-2720fcb49b24",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@357 -- # echo '[job_Malloc0]' 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc0 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p0]' 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p0 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p1]' 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p1 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p0]' 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p0 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p1]' 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p1 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p2]' 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@358 -- # echo filename=Malloc2p2 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p3]' 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p3 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p4]' 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p4 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p5]' 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p5 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p6]' 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p6 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p7]' 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p7 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_TestPT]' 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=TestPT 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_raid0]' 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=raid0 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_concat0]' 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=concat0 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:42.828 22:38:26 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:42.828 ************************************ 00:09:42.828 START TEST bdev_fio_trim 00:09:42.828 ************************************ 00:09:42.828 22:38:26 blockdev_general.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:42.828 22:38:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:42.828 22:38:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:09:42.828 22:38:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:09:42.828 22:38:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:09:42.828 22:38:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:42.828 22:38:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:09:42.828 22:38:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:09:42.828 22:38:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:42.828 22:38:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:42.828 22:38:26 
blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:09:42.828 22:38:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:42.828 22:38:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:42.828 22:38:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:42.828 22:38:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:42.828 22:38:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:42.828 22:38:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:09:42.828 22:38:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:42.828 22:38:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:42.828 22:38:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:42.828 22:38:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:09:42.828 22:38:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:42.828 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:42.828 job_Malloc1p0: (g=0): 
rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:42.828 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:42.828 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:42.828 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:42.828 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:42.828 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:42.828 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:42.828 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:42.828 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:42.828 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:42.828 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:42.828 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:42.828 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:42.828 fio-3.35 00:09:42.828 Starting 14 threads 00:09:55.029 00:09:55.029 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=2684368: Mon Jul 15 22:38:38 2024 00:09:55.029 write: IOPS=122k, BW=476MiB/s (499MB/s)(4764MiB/10001msec); 0 zone resets 00:09:55.029 slat (usec): min=2, max=3437, avg=40.98, stdev=10.87 
00:09:55.029 clat (usec): min=31, max=1476, avg=287.69, stdev=94.05 00:09:55.029 lat (usec): min=44, max=3759, avg=328.67, stdev=97.45 00:09:55.029 clat percentiles (usec): 00:09:55.029 | 50.000th=[ 281], 99.000th=[ 494], 99.900th=[ 553], 99.990th=[ 635], 00:09:55.029 | 99.999th=[ 996] 00:09:55.029 bw ( KiB/s): min=446894, max=580460, per=100.00%, avg=488101.95, stdev=2321.98, samples=266 00:09:55.029 iops : min=111722, max=145113, avg=122025.26, stdev=580.49, samples=266 00:09:55.029 trim: IOPS=122k, BW=476MiB/s (499MB/s)(4764MiB/10001msec); 0 zone resets 00:09:55.029 slat (usec): min=4, max=148, avg=27.24, stdev= 6.86 00:09:55.029 clat (usec): min=4, max=3759, avg=324.12, stdev=102.61 00:09:55.029 lat (usec): min=16, max=3790, avg=351.37, stdev=105.27 00:09:55.029 clat percentiles (usec): 00:09:55.029 | 50.000th=[ 318], 99.000th=[ 545], 99.900th=[ 603], 99.990th=[ 693], 00:09:55.029 | 99.999th=[ 938] 00:09:55.029 bw ( KiB/s): min=446894, max=580468, per=100.00%, avg=488102.37, stdev=2322.10, samples=266 00:09:55.029 iops : min=111722, max=145115, avg=122025.37, stdev=580.52, samples=266 00:09:55.029 lat (usec) : 10=0.01%, 20=0.01%, 50=0.03%, 100=0.65%, 250=32.23% 00:09:55.029 lat (usec) : 500=64.64%, 750=2.44%, 1000=0.01% 00:09:55.029 lat (msec) : 2=0.01%, 4=0.01% 00:09:55.029 cpu : usr=99.59%, sys=0.01%, ctx=543, majf=0, minf=958 00:09:55.029 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:55.029 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:55.029 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:55.029 issued rwts: total=0,1219498,1219502,0 short=0,0,0,0 dropped=0,0,0,0 00:09:55.029 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:55.029 00:09:55.029 Run status group 0 (all jobs): 00:09:55.029 WRITE: bw=476MiB/s (499MB/s), 476MiB/s-476MiB/s (499MB/s-499MB/s), io=4764MiB (4995MB), run=10001-10001msec 00:09:55.029 TRIM: bw=476MiB/s (499MB/s), 
476MiB/s-476MiB/s (499MB/s-499MB/s), io=4764MiB (4995MB), run=10001-10001msec 00:09:55.029 00:09:55.029 real 0m11.524s 00:09:55.029 user 2m25.866s 00:09:55.029 sys 0m0.840s 00:09:55.029 22:38:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:55.029 22:38:38 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:09:55.029 ************************************ 00:09:55.029 END TEST bdev_fio_trim 00:09:55.029 ************************************ 00:09:55.029 22:38:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:09:55.029 22:38:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:09:55.029 22:38:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:55.029 22:38:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:09:55.029 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:55.029 22:38:38 blockdev_general.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:09:55.029 00:09:55.029 real 0m23.660s 00:09:55.029 user 5m11.169s 00:09:55.029 sys 0m2.661s 00:09:55.029 22:38:38 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:55.029 22:38:38 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:55.029 ************************************ 00:09:55.029 END TEST bdev_fio 00:09:55.029 ************************************ 00:09:55.029 22:38:38 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:55.029 22:38:38 blockdev_general -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:09:55.029 22:38:38 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 
00:09:55.029 22:38:38 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:09:55.029 22:38:38 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:55.029 22:38:38 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:55.029 ************************************ 00:09:55.029 START TEST bdev_verify 00:09:55.029 ************************************ 00:09:55.029 22:38:38 blockdev_general.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:09:55.029 [2024-07-15 22:38:38.515009] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:09:55.029 [2024-07-15 22:38:38.515093] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2685813 ] 00:09:55.029 [2024-07-15 22:38:38.661306] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:55.029 [2024-07-15 22:38:38.767825] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:55.029 [2024-07-15 22:38:38.767830] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:55.029 [2024-07-15 22:38:38.920406] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:55.030 [2024-07-15 22:38:38.920468] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:55.030 [2024-07-15 22:38:38.920483] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:55.030 [2024-07-15 22:38:38.928410] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:55.030 [2024-07-15 22:38:38.928438] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:55.030 [2024-07-15 22:38:38.936421] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:55.030 [2024-07-15 22:38:38.936445] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:55.030 [2024-07-15 22:38:39.013538] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:55.030 [2024-07-15 22:38:39.013592] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:55.030 [2024-07-15 22:38:39.013613] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17384d0 00:09:55.030 [2024-07-15 22:38:39.013625] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:55.030 [2024-07-15 22:38:39.015270] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:55.030 [2024-07-15 22:38:39.015300] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:55.030 Running I/O for 5 seconds... 
00:10:00.333
00:10:00.333 Latency(us)
00:10:00.333 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:10:00.333 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:00.333 Verification LBA range: start 0x0 length 0x1000
00:10:00.333 Malloc0 : 5.19 1183.15 4.62 0.00 0.00 107974.33 765.77 353780.42
00:10:00.333 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:00.333 Verification LBA range: start 0x1000 length 0x1000
00:10:00.333 Malloc0 : 5.17 989.72 3.87 0.00 0.00 129053.65 733.72 401194.30
00:10:00.333 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:00.333 Verification LBA range: start 0x0 length 0x800
00:10:00.333 Malloc1p0 : 5.20 615.74 2.41 0.00 0.00 206908.67 2678.43 177802.02
00:10:00.333 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:00.333 Verification LBA range: start 0x800 length 0x800
00:10:00.333 Malloc1p0 : 5.18 519.30 2.03 0.00 0.00 245193.21 3348.03 214274.23
00:10:00.333 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:00.333 Verification LBA range: start 0x0 length 0x800
00:10:00.333 Malloc1p1 : 5.20 615.40 2.40 0.00 0.00 206582.20 2493.22 173242.99
00:10:00.333 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:00.333 Verification LBA range: start 0x800 length 0x800
00:10:00.333 Malloc1p1 : 5.18 519.03 2.03 0.00 0.00 244688.26 3191.32 213362.42
00:10:00.334 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x0 length 0x200
00:10:00.334 Malloc2p0 : 5.20 615.15 2.40 0.00 0.00 206222.68 2478.97 171419.38
00:10:00.334 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x200 length 0x200
00:10:00.334 Malloc2p0 : 5.18 518.75 2.03 0.00 0.00 244209.42 3604.48 213362.42
00:10:00.334 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x0 length 0x200
00:10:00.334 Malloc2p1 : 5.20 614.80 2.40 0.00 0.00 205905.58 2991.86 171419.38
00:10:00.334 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x200 length 0x200
00:10:00.334 Malloc2p1 : 5.18 518.46 2.03 0.00 0.00 243669.65 4245.59 210627.01
00:10:00.334 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x0 length 0x200
00:10:00.334 Malloc2p2 : 5.21 614.55 2.40 0.00 0.00 205512.01 3761.20 168683.97
00:10:00.334 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x200 length 0x200
00:10:00.334 Malloc2p2 : 5.19 518.17 2.02 0.00 0.00 243004.99 3846.68 207891.59
00:10:00.334 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x0 length 0x200
00:10:00.334 Malloc2p3 : 5.21 614.30 2.40 0.00 0.00 205038.79 3191.32 166860.35
00:10:00.334 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x200 length 0x200
00:10:00.334 Malloc2p3 : 5.19 517.88 2.02 0.00 0.00 242427.24 3205.57 206067.98
00:10:00.334 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x0 length 0x200
00:10:00.334 Malloc2p4 : 5.21 614.03 2.40 0.00 0.00 204630.42 2478.97 166860.35
00:10:00.334 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x200 length 0x200
00:10:00.334 Malloc2p4 : 5.19 517.59 2.02 0.00 0.00 241963.51 3262.55 206067.98
00:10:00.334 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x0 length 0x200
00:10:00.334 Malloc2p5 : 5.21 613.77 2.40 0.00 0.00 204294.58 2507.46 168683.97
00:10:00.334 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x200 length 0x200
00:10:00.334 Malloc2p5 : 5.20 517.16 2.02 0.00 0.00 241558.23 3903.67 204244.37
00:10:00.334 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x0 length 0x200
00:10:00.334 Malloc2p6 : 5.22 613.52 2.40 0.00 0.00 203939.19 2550.21 170507.58
00:10:00.334 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x200 length 0x200
00:10:00.334 Malloc2p6 : 5.20 516.81 2.02 0.00 0.00 241007.37 2920.63 202420.76
00:10:00.334 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x0 length 0x200
00:10:00.334 Malloc2p7 : 5.22 613.27 2.40 0.00 0.00 203590.21 3348.03 169595.77
00:10:00.334 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x200 length 0x200
00:10:00.334 Malloc2p7 : 5.26 535.56 2.09 0.00 0.00 232008.06 4188.61 198773.54
00:10:00.334 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x0 length 0x1000
00:10:00.334 TestPT : 5.23 611.81 2.39 0.00 0.00 203492.31 9289.02 168683.97
00:10:00.334 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x1000 length 0x1000
00:10:00.334 TestPT : 5.24 512.50 2.00 0.00 0.00 241577.16 31457.28 199685.34
00:10:00.334 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x0 length 0x2000
00:10:00.334 raid0 : 5.22 612.74 2.39 0.00 0.00 202711.99 3761.20 158654.11
00:10:00.334 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x2000 length 0x2000
00:10:00.334 raid0 : 5.26 535.29 2.09 0.00 0.00 230708.55 3048.85 184184.65
00:10:00.334 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x0 length 0x2000
00:10:00.334 concat0 : 5.22 612.49 2.39 0.00 0.00 202232.09 2322.25 159565.91
00:10:00.334 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x2000 length 0x2000
00:10:00.334 concat0 : 5.26 535.03 2.09 0.00 0.00 230284.49 3433.52 178713.82
00:10:00.334 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x0 length 0x1000
00:10:00.334 raid1 : 5.23 612.23 2.39 0.00 0.00 201868.50 3105.84 168683.97
00:10:00.334 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x1000 length 0x1000
00:10:00.334 raid1 : 5.27 534.76 2.09 0.00 0.00 229712.17 4160.11 184184.65
00:10:00.334 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x0 length 0x4e2
00:10:00.334 AIO0 : 5.23 612.04 2.39 0.00 0.00 201524.35 1246.61 177802.02
00:10:00.334 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:00.334 Verification LBA range: start 0x4e2 length 0x4e2
00:10:00.334 AIO0 : 5.27 534.56 2.09 0.00 0.00 229091.74 1645.52 193302.71
00:10:00.334 ===================================================================================================================
00:10:00.334 Total : 19229.55 75.12 0.00 0.00 208571.19 733.72 401194.30
00:10:00.334
00:10:00.334 real 0m6.526s
00:10:00.334 user 0m12.062s
00:10:00.334 sys 0m0.421s
00:10:00.334 22:38:44 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:10:00.334 22:38:44 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:10:00.334 ************************************
00:10:00.334 END TEST bdev_verify
00:10:00.334 ************************************
00:10:00.334 22:38:45 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:10:00.334 22:38:45 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:10:00.334 22:38:45 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:10:00.334 22:38:45 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:10:00.334 22:38:45 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:10:00.334 ************************************
00:10:00.334 START TEST bdev_verify_big_io
00:10:00.334 ************************************
00:10:00.334 22:38:45 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:10:00.334 [2024-07-15 22:38:45.114883] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
00:10:00.334 [2024-07-15 22:38:45.114955] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2686631 ]
00:10:00.593 [2024-07-15 22:38:45.245453] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:10:00.593 [2024-07-15 22:38:45.348799] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:10:00.593 [2024-07-15 22:38:45.348803] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:10:00.851 [2024-07-15 22:38:45.509374] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:10:00.851 [2024-07-15 22:38:45.509438] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:10:00.851 [2024-07-15 22:38:45.509453] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:10:00.851 [2024-07-15 22:38:45.517384] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:10:00.851 [2024-07-15 22:38:45.517411] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:10:00.851 [2024-07-15 22:38:45.525399] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:10:00.851 [2024-07-15 22:38:45.525423] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:10:00.851 [2024-07-15 22:38:45.598403] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:10:00.851 [2024-07-15 22:38:45.598456] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:10:00.851 [2024-07-15 22:38:45.598476] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e514d0
00:10:00.851 [2024-07-15 22:38:45.598488] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:10:00.851 [2024-07-15 22:38:45.600115] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:10:00.851 [2024-07-15 22:38:45.600145] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:10:01.110 [2024-07-15 22:38:45.765197] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32
00:10:01.110 [2024-07-15 22:38:45.766650] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32
00:10:01.110 [2024-07-15 22:38:45.768729] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32
00:10:01.110 [2024-07-15 22:38:45.770162] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32
00:10:01.110 [2024-07-15 22:38:45.772216] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32
00:10:01.110 [2024-07-15 22:38:45.773621] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32
00:10:01.110 [2024-07-15 22:38:45.775355] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32
00:10:01.110 [2024-07-15 22:38:45.776914] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32
00:10:01.110 [2024-07-15 22:38:45.777959] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32
00:10:01.110 [2024-07-15 22:38:45.779527] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32
00:10:01.110 [2024-07-15 22:38:45.780552] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32
00:10:01.110 [2024-07-15 22:38:45.782144] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32
00:10:01.110 [2024-07-15 22:38:45.783228] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32
00:10:01.110 [2024-07-15 22:38:45.784724] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32
00:10:01.110 [2024-07-15 22:38:45.785581] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32
00:10:01.110 [2024-07-15 22:38:45.786966] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32
00:10:01.110 [2024-07-15 22:38:45.809746] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78
00:10:01.110 [2024-07-15 22:38:45.811634] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78
00:10:01.110 Running I/O for 5 seconds...
00:10:09.272
00:10:09.272 Latency(us)
00:10:09.272 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:10:09.272 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x0 length 0x100
00:10:09.272 Malloc0 : 5.96 128.82 8.05 0.00 0.00 973014.58 872.63 2874010.05
00:10:09.272 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x100 length 0x100
00:10:09.272 Malloc0 : 7.36 191.34 11.96 0.00 0.00 449176.72 1118.39 838860.80
00:10:09.272 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x0 length 0x80
00:10:09.272 Malloc1p0 : 6.78 35.42 2.21 0.00 0.00 3288627.63 1495.93 5543775.72
00:10:09.272 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x80 length 0x80
00:10:09.272 Malloc1p0 : 6.89 62.16 3.89 0.00 0.00 1969645.58 2949.12 3909820.77
00:10:09.272 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x0 length 0x80
00:10:09.272 Malloc1p1 : 6.78 35.41 2.21 0.00 0.00 3174947.48 1524.42 5339531.35
00:10:09.272 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x80 length 0x80
00:10:09.272 Malloc1p1 : 7.36 28.25 1.77 0.00 0.00 4139873.19 1894.85 6564997.57
00:10:09.272 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x0 length 0x20
00:10:09.272 Malloc2p0 : 6.42 24.91 1.56 0.00 0.00 1135873.46 637.55 1881965.97
00:10:09.272 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x20 length 0x20
00:10:09.272 Malloc2p0 : 6.89 16.26 1.02 0.00 0.00 1754295.51 783.58 2903187.81
00:10:09.272 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x0 length 0x20
00:10:09.272 Malloc2p1 : 6.43 24.90 1.56 0.00 0.00 1124580.23 648.24 1845493.76
00:10:09.272 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x20 length 0x20
00:10:09.272 Malloc2p1 : 6.89 16.25 1.02 0.00 0.00 1736027.57 780.02 2874010.05
00:10:09.272 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x0 length 0x20
00:10:09.272 Malloc2p2 : 6.43 24.89 1.56 0.00 0.00 1114463.71 630.43 1823610.43
00:10:09.272 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x20 length 0x20
00:10:09.272 Malloc2p2 : 6.89 16.25 1.02 0.00 0.00 1718273.13 787.14 2830243.39
00:10:09.272 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x0 length 0x20
00:10:09.272 Malloc2p3 : 6.43 24.89 1.56 0.00 0.00 1103451.10 858.38 1794432.67
00:10:09.272 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x20 length 0x20
00:10:09.272 Malloc2p3 : 6.89 16.25 1.02 0.00 0.00 1699628.47 790.71 2801065.63
00:10:09.272 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x0 length 0x20
00:10:09.272 Malloc2p4 : 6.43 24.88 1.56 0.00 0.00 1092303.53 655.36 1765254.90
00:10:09.272 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x20 length 0x20
00:10:09.272 Malloc2p4 : 6.90 16.24 1.02 0.00 0.00 1681749.70 975.92 2771887.86
00:10:09.272 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x0 length 0x20
00:10:09.272 Malloc2p5 : 6.43 24.88 1.55 0.00 0.00 1082190.12 651.80 1736077.13
00:10:09.272 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x20 length 0x20
00:10:09.272 Malloc2p5 : 6.90 16.24 1.01 0.00 0.00 1661996.03 790.71 2742710.09
00:10:09.272 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x0 length 0x20
00:10:09.272 Malloc2p6 : 6.43 24.87 1.55 0.00 0.00 1070703.96 651.80 1706899.37
00:10:09.272 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x20 length 0x20
00:10:09.272 Malloc2p6 : 6.90 16.24 1.01 0.00 0.00 1644166.95 787.14 2698943.44
00:10:09.272 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x0 length 0x20
00:10:09.272 Malloc2p7 : 6.43 24.87 1.55 0.00 0.00 1059788.10 633.99 1677721.60
00:10:09.272 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x20 length 0x20
00:10:09.272 Malloc2p7 : 6.90 16.23 1.01 0.00 0.00 1624178.91 794.27 2669765.68
00:10:09.272 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x0 length 0x100
00:10:09.272 TestPT : 6.89 34.85 2.18 0.00 0.00 2886140.85 112152.04 3938998.54
00:10:09.272 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x100 length 0x100
00:10:09.272 TestPT : 7.36 26.34 1.65 0.00 0.00 3917040.01 260776.29 4289131.74
00:10:09.272 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x0 length 0x200
00:10:09.272 raid0 : 6.72 42.87 2.68 0.00 0.00 2314025.22 1574.29 4697620.48
00:10:09.272 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x200 length 0x200
00:10:09.272 raid0 : 7.35 30.47 1.90 0.00 0.00 3257438.80 1994.57 5514597.95
00:10:09.272 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x0 length 0x200
00:10:09.272 concat0 : 7.06 43.07 2.69 0.00 0.00 2180217.16 1588.54 4522553.88
00:10:09.272 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x200 length 0x200
00:10:09.272 concat0 : 7.28 32.96 2.06 0.00 0.00 2914509.54 1994.57 5251998.05
00:10:09.272 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x0 length 0x100
00:10:09.272 raid1 : 6.93 55.41 3.46 0.00 0.00 1680433.96 2065.81 4318309.51
00:10:09.272 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x100 length 0x100
00:10:09.272 raid1 : 7.28 32.95 2.06 0.00 0.00 2776618.24 2592.95 4989398.15
00:10:09.272 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x0 length 0x4e
00:10:09.272 AIO0 : 7.06 64.86 4.05 0.00 0.00 852948.22 808.51 2611410.14
00:10:09.272 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536)
00:10:09.272 Verification LBA range: start 0x4e length 0x4e
00:10:09.272 AIO0 : 7.35 36.44 2.28 0.00 0.00 1489229.65 990.16 4055709.61
00:10:09.272 ===================================================================================================================
00:10:09.272 Total : 1210.70 75.67 0.00 0.00 1669123.78 630.43 6564997.57
00:10:09.272
00:10:09.272 real 0m8.645s
00:10:09.272 user 0m16.298s
00:10:09.272 sys 0m0.418s
00:10:09.272 22:38:53 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:10:09.272 22:38:53 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:10:09.272 ************************************
00:10:09.272 END TEST bdev_verify_big_io
00:10:09.273 ************************************
00:10:09.273 22:38:53 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:10:09.273 22:38:53 blockdev_general -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:10:09.273 22:38:53 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:10:09.273 22:38:53 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:10:09.273 22:38:53 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:10:09.273 ************************************
00:10:09.273 START TEST bdev_write_zeroes
00:10:09.273 ************************************
00:10:09.273 22:38:53 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:10:09.273 [2024-07-15 22:38:53.889456] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
00:10:09.273 [2024-07-15 22:38:53.889585] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2687779 ]
00:10:09.273 [2024-07-15 22:38:54.087579] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:10:09.532 [2024-07-15 22:38:54.193413] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:10:09.532 [2024-07-15 22:38:54.343968] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:10:09.532 [2024-07-15 22:38:54.344029] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:10:09.532 [2024-07-15 22:38:54.344063] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:10:09.532 [2024-07-15 22:38:54.351975] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:10:09.532 [2024-07-15 22:38:54.352002] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:10:09.532 [2024-07-15 22:38:54.359979] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:10:09.532 [2024-07-15 22:38:54.360003] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:10:09.532 [2024-07-15 22:38:54.432228] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:10:09.532 [2024-07-15 22:38:54.432277] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:10:09.532 [2024-07-15 22:38:54.432295] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13b7c10
00:10:09.532 [2024-07-15 22:38:54.432308] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:10:09.532 [2024-07-15 22:38:54.433810] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:10:09.532 [2024-07-15 22:38:54.433838] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:10:09.790 Running I/O for 1 seconds...
00:10:11.167
00:10:11.167 Latency(us)
00:10:11.167 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:10:11.167 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:11.167 Malloc0 : 1.04 4946.50 19.32 0.00 0.00 25865.06 662.48 43082.80
00:10:11.167 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:11.167 Malloc1p0 : 1.04 4939.36 19.29 0.00 0.00 25857.61 911.81 42398.94
00:10:11.167 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:11.167 Malloc1p1 : 1.04 4932.25 19.27 0.00 0.00 25839.35 901.12 41487.14
00:10:11.167 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:11.167 Malloc2p0 : 1.04 4925.12 19.24 0.00 0.00 25821.16 911.81 40575.33
00:10:11.167 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:11.167 Malloc2p1 : 1.04 4918.05 19.21 0.00 0.00 25799.43 911.81 39663.53
00:10:11.167 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:11.167 Malloc2p2 : 1.04 4911.03 19.18 0.00 0.00 25777.17 911.81 38751.72
00:10:11.167 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:11.167 Malloc2p3 : 1.04 4903.96 19.16 0.00 0.00 25755.26 904.68 37839.92
00:10:11.167 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:11.167 Malloc2p4 : 1.05 4896.99 19.13 0.00 0.00 25733.41 904.68 36928.11
00:10:11.167 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:11.167 Malloc2p5 : 1.05 4890.02 19.10 0.00 0.00 25711.88 908.24 36016.31
00:10:11.167 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:11.167 Malloc2p6 : 1.05 4883.03 19.07 0.00 0.00 25695.55 911.81 35332.45
00:10:11.167 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:11.167 Malloc2p7 : 1.06 4939.44 19.29 0.00 0.00 25355.12 897.56 34420.65
00:10:11.167 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:11.167 TestPT : 1.06 4932.47 19.27 0.00 0.00 25335.41 947.42 33508.84
00:10:11.167 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:11.167 raid0 : 1.07 4924.52 19.24 0.00 0.00 25305.46 1631.28 31913.18
00:10:11.167 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:11.167 concat0 : 1.07 4916.69 19.21 0.00 0.00 25245.15 1617.03 30089.57
00:10:11.167 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:11.167 raid1 : 1.07 4906.92 19.17 0.00 0.00 25190.22 2578.70 27582.11
00:10:11.167 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:11.167 AIO0 : 1.07 4900.97 19.14 0.00 0.00 25095.46 1054.27 27240.18
00:10:11.167 ===================================================================================================================
00:10:11.167 Total : 78667.32 307.29 0.00 0.00 25583.34 662.48 43082.80
00:10:11.427
00:10:11.427 real 0m2.302s
00:10:11.427 user 0m1.855s
00:10:11.427 sys 0m0.386s
00:10:11.427 22:38:56 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:10:11.427 22:38:56 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:10:11.427 ************************************
00:10:11.427 END TEST bdev_write_zeroes
00:10:11.427 ************************************
00:10:11.427 22:38:56 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:10:11.427 22:38:56 blockdev_general -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:10:11.427 22:38:56 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:10:11.427 22:38:56 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:10:11.427 22:38:56 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:10:11.427 ************************************
00:10:11.427 START TEST bdev_json_nonenclosed
00:10:11.427 ************************************
00:10:11.427 22:38:56 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:10:11.427 [2024-07-15 22:38:56.225013] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
00:10:11.427 [2024-07-15 22:38:56.225076] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2688147 ]
00:10:11.686 [2024-07-15 22:38:56.351867] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:10:11.687 [2024-07-15 22:38:56.452380] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:10:11.687 [2024-07-15 22:38:56.452453] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:10:11.687 [2024-07-15 22:38:56.452475] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:10:11.687 [2024-07-15 22:38:56.452488] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:11.687 00:10:11.687 real 0m0.389s 00:10:11.687 user 0m0.243s 00:10:11.687 sys 0m0.143s 00:10:11.687 22:38:56 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:10:11.687 22:38:56 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:11.687 22:38:56 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:10:11.687 ************************************ 00:10:11.687 END TEST bdev_json_nonenclosed 00:10:11.687 ************************************ 00:10:11.946 22:38:56 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:10:11.946 22:38:56 blockdev_general -- bdev/blockdev.sh@782 -- # true 00:10:11.946 22:38:56 blockdev_general -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:11.946 22:38:56 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:10:11.946 22:38:56 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:11.946 22:38:56 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:11.946 ************************************ 00:10:11.946 START TEST bdev_json_nonarray 00:10:11.946 ************************************ 00:10:11.946 22:38:56 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:11.946 [2024-07-15 22:38:56.700046] Starting 
SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:10:11.946 [2024-07-15 22:38:56.700106] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2688170 ] 00:10:11.946 [2024-07-15 22:38:56.827811] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:12.206 [2024-07-15 22:38:56.928336] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:12.206 [2024-07-15 22:38:56.928418] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:10:12.206 [2024-07-15 22:38:56.928440] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:10:12.206 [2024-07-15 22:38:56.928453] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:12.206 00:10:12.206 real 0m0.394s 00:10:12.206 user 0m0.239s 00:10:12.206 sys 0m0.152s 00:10:12.206 22:38:57 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:10:12.206 22:38:57 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:12.206 22:38:57 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:10:12.206 ************************************ 00:10:12.206 END TEST bdev_json_nonarray 00:10:12.206 ************************************ 00:10:12.206 22:38:57 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:10:12.206 22:38:57 blockdev_general -- bdev/blockdev.sh@785 -- # true 00:10:12.206 22:38:57 blockdev_general -- bdev/blockdev.sh@787 -- # [[ bdev == bdev ]] 00:10:12.206 22:38:57 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qos qos_test_suite '' 00:10:12.206 22:38:57 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:12.206 22:38:57 
blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:12.206 22:38:57 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:12.206 ************************************ 00:10:12.206 START TEST bdev_qos 00:10:12.206 ************************************ 00:10:12.206 22:38:57 blockdev_general.bdev_qos -- common/autotest_common.sh@1123 -- # qos_test_suite '' 00:10:12.206 22:38:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # QOS_PID=2688201 00:10:12.206 22:38:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # echo 'Process qos testing pid: 2688201' 00:10:12.206 Process qos testing pid: 2688201 00:10:12.474 22:38:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:10:12.474 22:38:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:10:12.474 22:38:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@449 -- # waitforlisten 2688201 00:10:12.474 22:38:57 blockdev_general.bdev_qos -- common/autotest_common.sh@829 -- # '[' -z 2688201 ']' 00:10:12.474 22:38:57 blockdev_general.bdev_qos -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:12.474 22:38:57 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:12.474 22:38:57 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:12.474 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
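The `waitforlisten 2688201` step above blocks until the freshly launched bdevperf process is reachable on `/var/tmp/spdk.sock`, bounded by `max_retries=100`. A simplified sketch of that polling pattern — reduced to a filesystem check on the socket path, whereas the real helper in `autotest_common.sh` probes the RPC server itself; the function name is illustrative:

```shell
# Poll until a UNIX domain socket appears at the given path, giving up
# after max_retries attempts. Path and retry budget default to the
# values visible in the trace above.
wait_for_rpc_socket() {
    local sock=${1:-/var/tmp/spdk.sock}
    local max_retries=${2:-100}
    local i=0
    while [ "$i" -lt "$max_retries" ]; do
        if [ -S "$sock" ]; then
            return 0            # path exists and is a UNIX socket
        fi
        i=$(( i + 1 ))
        sleep 0.1
    done
    return 1
}
```

Once this returns, the suite can start issuing `rpc_cmd` calls such as the `bdev_malloc_create` seen a few lines below.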
00:10:12.474 22:38:57 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:12.474 22:38:57 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:12.474 [2024-07-15 22:38:57.175845] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:10:12.474 [2024-07-15 22:38:57.175912] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2688201 ] 00:10:12.474 [2024-07-15 22:38:57.310666] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:12.733 [2024-07-15 22:38:57.428337] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@862 -- # return 0 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:13.304 Malloc_0 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # waitforbdev Malloc_0 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_0 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:13.304 22:38:58 
blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:13.304 [ 00:10:13.304 { 00:10:13.304 "name": "Malloc_0", 00:10:13.304 "aliases": [ 00:10:13.304 "30ba4719-a146-4b7c-bda6-26dfaf6b68eb" 00:10:13.304 ], 00:10:13.304 "product_name": "Malloc disk", 00:10:13.304 "block_size": 512, 00:10:13.304 "num_blocks": 262144, 00:10:13.304 "uuid": "30ba4719-a146-4b7c-bda6-26dfaf6b68eb", 00:10:13.304 "assigned_rate_limits": { 00:10:13.304 "rw_ios_per_sec": 0, 00:10:13.304 "rw_mbytes_per_sec": 0, 00:10:13.304 "r_mbytes_per_sec": 0, 00:10:13.304 "w_mbytes_per_sec": 0 00:10:13.304 }, 00:10:13.304 "claimed": false, 00:10:13.304 "zoned": false, 00:10:13.304 "supported_io_types": { 00:10:13.304 "read": true, 00:10:13.304 "write": true, 00:10:13.304 "unmap": true, 00:10:13.304 "flush": true, 00:10:13.304 "reset": true, 00:10:13.304 "nvme_admin": false, 00:10:13.304 "nvme_io": false, 00:10:13.304 "nvme_io_md": false, 00:10:13.304 "write_zeroes": true, 00:10:13.304 "zcopy": true, 00:10:13.304 "get_zone_info": false, 00:10:13.304 "zone_management": false, 00:10:13.304 "zone_append": false, 00:10:13.304 "compare": false, 00:10:13.304 "compare_and_write": false, 00:10:13.304 "abort": true, 00:10:13.304 "seek_hole": false, 00:10:13.304 
"seek_data": false, 00:10:13.304 "copy": true, 00:10:13.304 "nvme_iov_md": false 00:10:13.304 }, 00:10:13.304 "memory_domains": [ 00:10:13.304 { 00:10:13.304 "dma_device_id": "system", 00:10:13.304 "dma_device_type": 1 00:10:13.304 }, 00:10:13.304 { 00:10:13.304 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:13.304 "dma_device_type": 2 00:10:13.304 } 00:10:13.304 ], 00:10:13.304 "driver_specific": {} 00:10:13.304 } 00:10:13.304 ] 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # rpc_cmd bdev_null_create Null_1 128 512 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:13.304 Null_1 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@454 -- # waitforbdev Null_1 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Null_1 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:13.304 22:38:58 
blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:13.304 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:13.304 [ 00:10:13.304 { 00:10:13.304 "name": "Null_1", 00:10:13.304 "aliases": [ 00:10:13.564 "8e8c37bc-a2d4-4fce-be08-4702e675c8af" 00:10:13.564 ], 00:10:13.564 "product_name": "Null disk", 00:10:13.564 "block_size": 512, 00:10:13.564 "num_blocks": 262144, 00:10:13.564 "uuid": "8e8c37bc-a2d4-4fce-be08-4702e675c8af", 00:10:13.564 "assigned_rate_limits": { 00:10:13.564 "rw_ios_per_sec": 0, 00:10:13.564 "rw_mbytes_per_sec": 0, 00:10:13.564 "r_mbytes_per_sec": 0, 00:10:13.564 "w_mbytes_per_sec": 0 00:10:13.564 }, 00:10:13.564 "claimed": false, 00:10:13.564 "zoned": false, 00:10:13.564 "supported_io_types": { 00:10:13.564 "read": true, 00:10:13.564 "write": true, 00:10:13.564 "unmap": false, 00:10:13.564 "flush": false, 00:10:13.564 "reset": true, 00:10:13.564 "nvme_admin": false, 00:10:13.564 "nvme_io": false, 00:10:13.564 "nvme_io_md": false, 00:10:13.564 "write_zeroes": true, 00:10:13.564 "zcopy": false, 00:10:13.564 "get_zone_info": false, 00:10:13.564 "zone_management": false, 00:10:13.564 "zone_append": false, 00:10:13.564 "compare": false, 00:10:13.564 "compare_and_write": false, 00:10:13.564 "abort": true, 00:10:13.564 "seek_hole": false, 00:10:13.564 "seek_data": false, 00:10:13.564 "copy": false, 00:10:13.564 "nvme_iov_md": false 00:10:13.564 }, 00:10:13.564 "driver_specific": {} 00:10:13.564 } 00:10:13.564 ] 00:10:13.564 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:13.564 22:38:58 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:10:13.564 22:38:58 blockdev_general.bdev_qos -- 
bdev/blockdev.sh@457 -- # qos_function_test 00:10:13.564 22:38:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:13.564 22:38:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_iops_limit=1000 00:10:13.564 22:38:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local qos_lower_bw_limit=2 00:10:13.564 22:38:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local io_result=0 00:10:13.564 22:38:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local iops_limit=0 00:10:13.564 22:38:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@414 -- # local bw_limit=0 00:10:13.564 22:38:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # get_io_result IOPS Malloc_0 00:10:13.564 22:38:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:10:13.564 22:38:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:10:13.564 22:38:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:13.564 22:38:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:13.564 22:38:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:10:13.564 22:38:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:10:13.564 Running I/O for 60 seconds... 
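The QoS flow that follows measures an unthrottled baseline with `iostat.py` (48532 IOPS for `Malloc_0`), derives an `iops_limit` of 12000 from it, applies that via `bdev_set_qos_limit --rw_ios_per_sec`, then checks the re-measured rate against a ±10% window (`lower_limit=10800`, `upper_limit=13200`). The sketch below reproduces that arithmetic; the bounds math matches every limit/bound triple in this log (12000→10800/13200, 6144→5529/6758, 2048→1843/2252), while the limit derivation shown (one quarter of baseline, floored to the nearest thousand) is only one plausible formula consistent with 48532→12000 — the exact expression in `blockdev.sh` may differ. Function names are illustrative:

```shell
# One plausible derivation of the QoS limit from the measured baseline
# (assumption: quarter of baseline, floored to the nearest thousand).
derive_iops_limit() {
    local baseline=$1
    echo $(( baseline / 4 / 1000 * 1000 ))
}

# The +/-10% acceptance window applied by run_qos_test, using the same
# integer arithmetic implied by the bounds printed in the log.
within_qos_bounds() {
    local measured=$1 limit=$2
    local lower=$(( limit * 9 / 10 ))    # 10800 for a 12000 limit
    local upper=$(( limit * 11 / 10 ))   # 13200 for a 12000 limit
    [ "$measured" -ge "$lower" ] && [ "$measured" -le "$upper" ]
}

limit=$(derive_iops_limit 48532)         # yields 12000 for this run
within_qos_bounds 12003 "$limit" && echo "within bounds"
```

The measured 12003.15 IOPS reported a few lines below therefore passes: it sits comfortably inside the 10800..13200 window.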
00:10:18.845 22:39:03 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 48532.19 194128.74 0.00 0.00 195584.00 0.00 0.00 ' 00:10:18.845 22:39:03 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:10:18.845 22:39:03 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:10:18.845 22:39:03 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # iostat_result=48532.19 00:10:18.845 22:39:03 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 48532 00:10:18.845 22:39:03 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # io_result=48532 00:10:18.845 22:39:03 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # iops_limit=12000 00:10:18.845 22:39:03 blockdev_general.bdev_qos -- bdev/blockdev.sh@419 -- # '[' 12000 -gt 1000 ']' 00:10:18.845 22:39:03 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 12000 Malloc_0 00:10:18.845 22:39:03 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:18.845 22:39:03 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:18.845 22:39:03 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:18.845 22:39:03 blockdev_general.bdev_qos -- bdev/blockdev.sh@423 -- # run_test bdev_qos_iops run_qos_test 12000 IOPS Malloc_0 00:10:18.845 22:39:03 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:18.845 22:39:03 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:18.845 22:39:03 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:18.845 ************************************ 00:10:18.845 START TEST bdev_qos_iops 00:10:18.845 ************************************ 00:10:18.845 22:39:03 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1123 -- # run_qos_test 12000 IOPS Malloc_0 00:10:18.845 22:39:03 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_limit=12000 00:10:18.845 22:39:03 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@390 -- # local qos_result=0 00:10:18.845 22:39:03 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # get_io_result IOPS Malloc_0 00:10:18.845 22:39:03 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:10:18.845 22:39:03 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:10:18.845 22:39:03 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:18.845 22:39:03 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:18.845 22:39:03 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:10:18.845 22:39:03 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # tail -1 00:10:24.126 22:39:08 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 12003.15 48012.60 0.00 0.00 48816.00 0.00 0.00 ' 00:10:24.126 22:39:08 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:10:24.126 22:39:08 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:10:24.126 22:39:08 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # iostat_result=12003.15 00:10:24.126 22:39:08 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@385 -- # echo 12003 00:10:24.126 22:39:08 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # qos_result=12003 00:10:24.126 22:39:08 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@393 -- # '[' IOPS = BANDWIDTH ']' 00:10:24.126 22:39:08 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # lower_limit=10800 00:10:24.126 22:39:08 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@397 -- # upper_limit=13200 00:10:24.126 22:39:08 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 12003 -lt 10800 ']' 00:10:24.126 22:39:08 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 12003 -gt 13200 ']' 00:10:24.126 00:10:24.126 real 0m5.267s 00:10:24.126 user 0m0.123s 00:10:24.126 sys 0m0.043s 00:10:24.126 22:39:08 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:24.126 22:39:08 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:10:24.126 ************************************ 00:10:24.126 END TEST bdev_qos_iops 00:10:24.126 ************************************ 00:10:24.126 22:39:08 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:10:24.126 22:39:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # get_io_result BANDWIDTH Null_1 00:10:24.126 22:39:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:10:24.126 22:39:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:10:24.126 22:39:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:24.126 22:39:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:24.126 22:39:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Null_1 00:10:24.126 22:39:08 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:10:29.471 22:39:14 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 15344.61 61378.46 0.00 0.00 62464.00 0.00 0.00 ' 00:10:29.471 22:39:14 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:10:29.471 22:39:14 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:29.471 22:39:14 
blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:10:29.471 22:39:14 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # iostat_result=62464.00 00:10:29.471 22:39:14 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 62464 00:10:29.471 22:39:14 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=62464 00:10:29.471 22:39:14 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # bw_limit=6 00:10:29.471 22:39:14 blockdev_general.bdev_qos -- bdev/blockdev.sh@429 -- # '[' 6 -lt 2 ']' 00:10:29.471 22:39:14 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 6 Null_1 00:10:29.471 22:39:14 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:29.471 22:39:14 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:29.471 22:39:14 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:29.471 22:39:14 blockdev_general.bdev_qos -- bdev/blockdev.sh@433 -- # run_test bdev_qos_bw run_qos_test 6 BANDWIDTH Null_1 00:10:29.471 22:39:14 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:29.471 22:39:14 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:29.471 22:39:14 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:29.471 ************************************ 00:10:29.471 START TEST bdev_qos_bw 00:10:29.471 ************************************ 00:10:29.471 22:39:14 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1123 -- # run_qos_test 6 BANDWIDTH Null_1 00:10:29.471 22:39:14 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_limit=6 00:10:29.472 22:39:14 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:10:29.472 22:39:14 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Null_1 00:10:29.472 
22:39:14 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:10:29.472 22:39:14 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:10:29.472 22:39:14 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:29.472 22:39:14 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:29.472 22:39:14 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # grep Null_1 00:10:29.472 22:39:14 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # tail -1 00:10:34.747 22:39:19 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 1536.89 6147.55 0.00 0.00 6348.00 0.00 0.00 ' 00:10:34.747 22:39:19 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:10:34.747 22:39:19 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:34.747 22:39:19 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:10:34.747 22:39:19 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # iostat_result=6348.00 00:10:34.747 22:39:19 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@385 -- # echo 6348 00:10:34.747 22:39:19 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # qos_result=6348 00:10:34.747 22:39:19 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:34.747 22:39:19 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@394 -- # qos_limit=6144 00:10:34.747 22:39:19 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # lower_limit=5529 00:10:34.747 22:39:19 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@397 -- # upper_limit=6758 00:10:34.747 22:39:19 blockdev_general.bdev_qos.bdev_qos_bw -- 
bdev/blockdev.sh@400 -- # '[' 6348 -lt 5529 ']' 00:10:34.747 22:39:19 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 6348 -gt 6758 ']' 00:10:34.747 00:10:34.747 real 0m5.332s 00:10:34.747 user 0m0.125s 00:10:34.747 sys 0m0.043s 00:10:34.747 22:39:19 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:34.747 22:39:19 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:10:34.747 ************************************ 00:10:34.747 END TEST bdev_qos_bw 00:10:34.747 ************************************ 00:10:34.747 22:39:19 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:10:34.747 22:39:19 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:10:34.747 22:39:19 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:34.747 22:39:19 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:34.747 22:39:19 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:34.747 22:39:19 blockdev_general.bdev_qos -- bdev/blockdev.sh@437 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:10:34.747 22:39:19 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:34.747 22:39:19 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:34.747 22:39:19 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:34.747 ************************************ 00:10:34.747 START TEST bdev_qos_ro_bw 00:10:34.747 ************************************ 00:10:34.747 22:39:19 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1123 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:10:34.747 22:39:19 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_limit=2 00:10:34.747 22:39:19 
blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:10:34.747 22:39:19 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Malloc_0 00:10:34.747 22:39:19 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:10:34.747 22:39:19 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:10:34.747 22:39:19 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:34.747 22:39:19 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:34.747 22:39:19 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:10:34.747 22:39:19 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # tail -1 00:10:40.019 22:39:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 511.25 2045.00 0.00 0.00 2052.00 0.00 0.00 ' 00:10:40.019 22:39:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:10:40.019 22:39:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:40.019 22:39:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:10:40.019 22:39:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # iostat_result=2052.00 00:10:40.019 22:39:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@385 -- # echo 2052 00:10:40.019 22:39:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # qos_result=2052 00:10:40.019 22:39:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:40.019 22:39:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@394 -- # qos_limit=2048 
00:10:40.019 22:39:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # lower_limit=1843 00:10:40.019 22:39:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@397 -- # upper_limit=2252 00:10:40.019 22:39:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2052 -lt 1843 ']' 00:10:40.019 22:39:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2052 -gt 2252 ']' 00:10:40.019 00:10:40.019 real 0m5.182s 00:10:40.019 user 0m0.111s 00:10:40.019 sys 0m0.054s 00:10:40.019 22:39:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:40.019 22:39:24 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:10:40.019 ************************************ 00:10:40.019 END TEST bdev_qos_ro_bw 00:10:40.019 ************************************ 00:10:40.019 22:39:24 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:10:40.019 22:39:24 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:10:40.019 22:39:24 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:40.019 22:39:24 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:40.587 22:39:25 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:40.587 22:39:25 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # rpc_cmd bdev_null_delete Null_1 00:10:40.587 22:39:25 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:40.588 22:39:25 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:40.846 00:10:40.846 Latency(us) 00:10:40.846 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:40.846 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:40.846 Malloc_0 : 26.96 16398.56 64.06 0.00 0.00 15465.01 2550.21 503316.48 
00:10:40.846 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:40.846 Null_1 : 27.16 15717.42 61.40 0.00 0.00 16228.18 1040.03 198773.54 00:10:40.846 =================================================================================================================== 00:10:40.846 Total : 32115.98 125.45 0.00 0.00 15839.91 1040.03 503316.48 00:10:40.846 0 00:10:40.846 22:39:25 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:40.846 22:39:25 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # killprocess 2688201 00:10:40.846 22:39:25 blockdev_general.bdev_qos -- common/autotest_common.sh@948 -- # '[' -z 2688201 ']' 00:10:40.846 22:39:25 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # kill -0 2688201 00:10:40.846 22:39:25 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # uname 00:10:40.846 22:39:25 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:40.846 22:39:25 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2688201 00:10:40.846 22:39:25 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:10:40.846 22:39:25 blockdev_general.bdev_qos -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:10:40.846 22:39:25 blockdev_general.bdev_qos -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2688201' 00:10:40.846 killing process with pid 2688201 00:10:40.846 22:39:25 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # kill 2688201 00:10:40.846 Received shutdown signal, test time was about 27.227725 seconds 00:10:40.846 00:10:40.847 Latency(us) 00:10:40.847 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:40.847 =================================================================================================================== 00:10:40.847 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:40.847 
22:39:25 blockdev_general.bdev_qos -- common/autotest_common.sh@972 -- # wait 2688201 00:10:41.105 22:39:25 blockdev_general.bdev_qos -- bdev/blockdev.sh@462 -- # trap - SIGINT SIGTERM EXIT 00:10:41.105 00:10:41.105 real 0m28.821s 00:10:41.105 user 0m29.552s 00:10:41.105 sys 0m0.902s 00:10:41.105 22:39:25 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:41.106 22:39:25 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:41.106 ************************************ 00:10:41.106 END TEST bdev_qos 00:10:41.106 ************************************ 00:10:41.106 22:39:25 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:41.106 22:39:25 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:10:41.106 22:39:25 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:41.106 22:39:25 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:41.106 22:39:25 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:41.365 ************************************ 00:10:41.365 START TEST bdev_qd_sampling 00:10:41.365 ************************************ 00:10:41.365 22:39:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1123 -- # qd_sampling_test_suite '' 00:10:41.365 22:39:26 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@538 -- # QD_DEV=Malloc_QD 00:10:41.365 22:39:26 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # QD_PID=2692583 00:10:41.365 22:39:26 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # echo 'Process bdev QD sampling period testing pid: 2692583' 00:10:41.365 Process bdev QD sampling period testing pid: 2692583 00:10:41.365 22:39:26 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:10:41.365 22:39:26 
blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:10:41.365 22:39:26 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@544 -- # waitforlisten 2692583 00:10:41.365 22:39:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@829 -- # '[' -z 2692583 ']' 00:10:41.365 22:39:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:41.365 22:39:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:41.365 22:39:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:41.365 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:41.365 22:39:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:41.365 22:39:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:41.365 [2024-07-15 22:39:26.091196] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
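The bdev_qd_sampling suite above drives bdevperf through the autotest wrappers (`rpc_cmd`, `waitforbdev`). Outside the harness, the same flow reduces roughly to the command sequence below — a sketch assuming an SPDK checkout and the default `/var/tmp/spdk.sock` RPC socket; all flags and arguments are copied from the log:

```shell
# Rough manual equivalent of the bdev_qd_sampling steps in this log.
./build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' &

./scripts/rpc.py bdev_malloc_create -b Malloc_QD 128 512   # 128 MiB bdev, 512 B blocks
./scripts/rpc.py bdev_set_qd_sampling_period Malloc_QD 10  # enable QD sampling, period 10
./examples/bdev/bdevperf/bdevperf.py perform_tests         # run the 5 s randread workload
./scripts/rpc.py bdev_get_iostat -b Malloc_QD              # read back queue_depth stats
```

The `-z` flag starts bdevperf waiting for RPCs instead of running immediately, which is what lets the test create `Malloc_QD` and arm the sampling period before `perform_tests` kicks off the workload.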
00:10:41.365 [2024-07-15 22:39:26.091274] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2692583 ] 00:10:41.365 [2024-07-15 22:39:26.223080] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:41.624 [2024-07-15 22:39:26.326484] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:41.624 [2024-07-15 22:39:26.326489] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:42.191 22:39:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:42.191 22:39:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@862 -- # return 0 00:10:42.191 22:39:26 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:10:42.191 22:39:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:42.191 22:39:26 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:42.191 Malloc_QD 00:10:42.191 22:39:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:42.191 22:39:27 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@547 -- # waitforbdev Malloc_QD 00:10:42.191 22:39:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_QD 00:10:42.191 22:39:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:42.191 22:39:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local i 00:10:42.191 22:39:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:42.191 22:39:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:42.191 22:39:27 
blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:42.191 22:39:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:42.191 22:39:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:42.191 22:39:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:42.191 22:39:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:10:42.191 22:39:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:42.191 22:39:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:42.191 [ 00:10:42.191 { 00:10:42.191 "name": "Malloc_QD", 00:10:42.191 "aliases": [ 00:10:42.191 "933ea51c-926d-42b3-a5c7-fa2947f32168" 00:10:42.191 ], 00:10:42.191 "product_name": "Malloc disk", 00:10:42.191 "block_size": 512, 00:10:42.191 "num_blocks": 262144, 00:10:42.191 "uuid": "933ea51c-926d-42b3-a5c7-fa2947f32168", 00:10:42.191 "assigned_rate_limits": { 00:10:42.191 "rw_ios_per_sec": 0, 00:10:42.191 "rw_mbytes_per_sec": 0, 00:10:42.191 "r_mbytes_per_sec": 0, 00:10:42.191 "w_mbytes_per_sec": 0 00:10:42.191 }, 00:10:42.191 "claimed": false, 00:10:42.191 "zoned": false, 00:10:42.191 "supported_io_types": { 00:10:42.191 "read": true, 00:10:42.191 "write": true, 00:10:42.191 "unmap": true, 00:10:42.191 "flush": true, 00:10:42.191 "reset": true, 00:10:42.191 "nvme_admin": false, 00:10:42.191 "nvme_io": false, 00:10:42.191 "nvme_io_md": false, 00:10:42.191 "write_zeroes": true, 00:10:42.191 "zcopy": true, 00:10:42.191 "get_zone_info": false, 00:10:42.191 "zone_management": false, 00:10:42.192 "zone_append": false, 00:10:42.192 "compare": false, 00:10:42.192 "compare_and_write": false, 00:10:42.192 "abort": true, 00:10:42.192 "seek_hole": false, 00:10:42.192 "seek_data": false, 00:10:42.192 "copy": true, 
00:10:42.192 "nvme_iov_md": false 00:10:42.192 }, 00:10:42.192 "memory_domains": [ 00:10:42.192 { 00:10:42.192 "dma_device_id": "system", 00:10:42.192 "dma_device_type": 1 00:10:42.192 }, 00:10:42.192 { 00:10:42.192 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:42.192 "dma_device_type": 2 00:10:42.192 } 00:10:42.192 ], 00:10:42.192 "driver_specific": {} 00:10:42.192 } 00:10:42.192 ] 00:10:42.192 22:39:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:42.192 22:39:27 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # return 0 00:10:42.192 22:39:27 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # sleep 2 00:10:42.192 22:39:27 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:42.450 Running I/O for 5 seconds... 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@551 -- # qd_sampling_function_test Malloc_QD 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local bdev_name=Malloc_QD 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local sampling_period=10 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@521 -- # local iostats 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@523 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # iostats='{ 00:10:44.352 "tick_rate": 2300000000, 00:10:44.352 "ticks": 5435430716957774, 00:10:44.352 "bdevs": [ 00:10:44.352 { 00:10:44.352 "name": "Malloc_QD", 00:10:44.352 "bytes_read": 703640064, 00:10:44.352 "num_read_ops": 171780, 00:10:44.352 "bytes_written": 0, 00:10:44.352 "num_write_ops": 0, 00:10:44.352 "bytes_unmapped": 0, 00:10:44.352 "num_unmap_ops": 0, 00:10:44.352 "bytes_copied": 0, 00:10:44.352 "num_copy_ops": 0, 00:10:44.352 "read_latency_ticks": 2236653993582, 00:10:44.352 "max_read_latency_ticks": 18196294, 00:10:44.352 "min_read_latency_ticks": 276970, 00:10:44.352 "write_latency_ticks": 0, 00:10:44.352 "max_write_latency_ticks": 0, 00:10:44.352 "min_write_latency_ticks": 0, 00:10:44.352 "unmap_latency_ticks": 0, 00:10:44.352 "max_unmap_latency_ticks": 0, 00:10:44.352 "min_unmap_latency_ticks": 0, 00:10:44.352 "copy_latency_ticks": 0, 00:10:44.352 "max_copy_latency_ticks": 0, 00:10:44.352 "min_copy_latency_ticks": 0, 00:10:44.352 "io_error": {}, 00:10:44.352 "queue_depth_polling_period": 10, 00:10:44.352 "queue_depth": 768, 00:10:44.352 "io_time": 30, 00:10:44.352 "weighted_io_time": 17920 00:10:44.352 } 00:10:44.352 ] 00:10:44.352 }' 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # qd_sampling_period=10 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 == null ']' 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 -ne 10 ']' 00:10:44.352 22:39:29 
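The `bdev_get_bdevs` and `bdev_get_iostat` JSON dumps above can be sanity-checked by hand. A sketch recomputing the derived figures, with every input copied verbatim from the JSON (the interpretation of `weighted_io_time / io_time` as a rough mean queue depth follows the SPDK QD-sampling accounting and is an assumption here):

```python
# Recompute derived figures from the Malloc_QD JSON dumps above.
# All inputs are copied from the log; nothing is measured here.

# bdev_get_bdevs: 262144 blocks of 512 B == the 128 (MiB) passed to
# bdev_malloc_create.
assert 262144 * 512 // 2**20 == 128

# bdev_get_iostat: cumulative read latency is reported in ticks, so the
# per-IO average is read_latency_ticks / num_read_ops / tick_rate.
tick_rate = 2_300_000_000            # ticks per second, from "tick_rate"
read_latency_ticks = 2_236_653_993_582
num_read_ops = 171_780
avg_latency_us = read_latency_ticks / num_read_ops / tick_rate * 1e6
print(f"{avg_latency_us:.1f} us")    # ~5661 us, in line with the ~5654 us
                                     # average in the final result table

# weighted_io_time / io_time approximates the mean queue depth while the
# bdev was busy (both counters are in sampling-period units).
mean_qd = 17920 / 30
print(f"{mean_qd:.1f}")              # ~597 across the sampling windows
```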
blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:44.352 00:10:44.352 Latency(us) 00:10:44.352 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:44.352 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:10:44.352 Malloc_QD : 1.98 49840.79 194.69 0.00 0.00 5123.70 1417.57 5442.34 00:10:44.352 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:44.352 Malloc_QD : 1.99 40492.21 158.17 0.00 0.00 6305.55 1239.49 7921.31 00:10:44.352 =================================================================================================================== 00:10:44.352 Total : 90333.00 352.86 0.00 0.00 5653.84 1239.49 7921.31 00:10:44.352 0 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # killprocess 2692583 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@948 -- # '[' -z 2692583 ']' 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # kill -0 2692583 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # uname 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2692583 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo 
']' 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2692583' 00:10:44.352 killing process with pid 2692583 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # kill 2692583 00:10:44.352 Received shutdown signal, test time was about 2.061895 seconds 00:10:44.352 00:10:44.352 Latency(us) 00:10:44.352 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:44.352 =================================================================================================================== 00:10:44.352 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:44.352 22:39:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@972 -- # wait 2692583 00:10:44.611 22:39:29 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@555 -- # trap - SIGINT SIGTERM EXIT 00:10:44.611 00:10:44.611 real 0m3.412s 00:10:44.611 user 0m6.630s 00:10:44.611 sys 0m0.439s 00:10:44.611 22:39:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:44.611 22:39:29 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:44.611 ************************************ 00:10:44.611 END TEST bdev_qd_sampling 00:10:44.611 ************************************ 00:10:44.611 22:39:29 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:44.611 22:39:29 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_error error_test_suite '' 00:10:44.611 22:39:29 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:44.611 22:39:29 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:44.611 22:39:29 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:44.870 ************************************ 00:10:44.870 START TEST bdev_error 00:10:44.870 ************************************ 00:10:44.870 22:39:29 blockdev_general.bdev_error -- 
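The per-core rows of the qd_sampling result table above agree with its Total row: the total IOPS is the sum of the per-core IOPS, and the total average latency is their latency-weighted mean (small drift is expected because the inputs are already rounded). A sketch with the figures copied from the table:

```python
# Consistency check of the bdev_qd_sampling result table above.
core0_iops, core0_avg_us = 49840.79, 5123.70  # Core Mask 0x1 row
core1_iops, core1_avg_us = 40492.21, 6305.55  # Core Mask 0x2 row

total_iops = core0_iops + core1_iops
weighted_avg_us = (core0_iops * core0_avg_us +
                   core1_iops * core1_avg_us) / total_iops

print(round(total_iops, 2))       # 90333.0, matching the Total row
print(round(weighted_avg_us, 1))  # ~5653.5, vs the 5653.84 reported
```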
common/autotest_common.sh@1123 -- # error_test_suite '' 00:10:44.870 22:39:29 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_1=Dev_1 00:10:44.870 22:39:29 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # DEV_2=Dev_2 00:10:44.870 22:39:29 blockdev_general.bdev_error -- bdev/blockdev.sh@468 -- # ERR_DEV=EE_Dev_1 00:10:44.870 22:39:29 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # ERR_PID=2693042 00:10:44.870 22:39:29 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # echo 'Process error testing pid: 2693042' 00:10:44.870 Process error testing pid: 2693042 00:10:44.870 22:39:29 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:10:44.870 22:39:29 blockdev_general.bdev_error -- bdev/blockdev.sh@474 -- # waitforlisten 2693042 00:10:44.870 22:39:29 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 2693042 ']' 00:10:44.870 22:39:29 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:44.870 22:39:29 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:44.870 22:39:29 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:44.870 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:44.870 22:39:29 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:44.870 22:39:29 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:44.870 [2024-07-15 22:39:29.588284] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
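The bdev_error suite above layers an error-injection bdev on top of a malloc bdev and then tells it to fail IOs. Outside the harness, the setup reduces roughly to the sequence below — again a sketch assuming an SPDK checkout and the default RPC socket, with all flags copied from the log:

```shell
# Rough manual equivalent of the bdev_error setup in this log: an error
# bdev is layered on Dev_1 (exposed as EE_Dev_1), then told to fail IOs.
./build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' &

./scripts/rpc.py bdev_malloc_create -b Dev_1 128 512
./scripts/rpc.py bdev_error_create Dev_1                   # exposes EE_Dev_1
./scripts/rpc.py bdev_malloc_create -b Dev_2 128 512
./scripts/rpc.py bdev_error_inject_error EE_Dev_1 all failure -n 5
./examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests
```

Because bdevperf runs with `-f` (continue on error), the five injected failures on `EE_Dev_1` do not abort the run, which is exactly what the "Process is existed as continue on error is set" message later in the log verifies.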
00:10:44.870 [2024-07-15 22:39:29.588354] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2693042 ] 00:10:44.870 [2024-07-15 22:39:29.725019] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:45.129 [2024-07-15 22:39:29.858924] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:45.697 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:45.697 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:10:45.697 22:39:30 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:10:45.697 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:45.697 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:45.697 Dev_1 00:10:45.697 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:45.697 22:39:30 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # waitforbdev Dev_1 00:10:45.697 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:10:45.697 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:45.697 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:45.697 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:45.697 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:45.697 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:45.697 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:45.697 
22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:45.697 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:45.697 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:10:45.697 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:45.697 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:45.697 [ 00:10:45.697 { 00:10:45.697 "name": "Dev_1", 00:10:45.697 "aliases": [ 00:10:45.697 "40d942f1-8b3b-49a1-9477-3d8c0bf712a9" 00:10:45.697 ], 00:10:45.697 "product_name": "Malloc disk", 00:10:45.697 "block_size": 512, 00:10:45.697 "num_blocks": 262144, 00:10:45.697 "uuid": "40d942f1-8b3b-49a1-9477-3d8c0bf712a9", 00:10:45.697 "assigned_rate_limits": { 00:10:45.697 "rw_ios_per_sec": 0, 00:10:45.697 "rw_mbytes_per_sec": 0, 00:10:45.697 "r_mbytes_per_sec": 0, 00:10:45.697 "w_mbytes_per_sec": 0 00:10:45.697 }, 00:10:45.697 "claimed": false, 00:10:45.697 "zoned": false, 00:10:45.697 "supported_io_types": { 00:10:45.697 "read": true, 00:10:45.697 "write": true, 00:10:45.697 "unmap": true, 00:10:45.697 "flush": true, 00:10:45.697 "reset": true, 00:10:45.956 "nvme_admin": false, 00:10:45.956 "nvme_io": false, 00:10:45.956 "nvme_io_md": false, 00:10:45.956 "write_zeroes": true, 00:10:45.956 "zcopy": true, 00:10:45.956 "get_zone_info": false, 00:10:45.956 "zone_management": false, 00:10:45.956 "zone_append": false, 00:10:45.956 "compare": false, 00:10:45.956 "compare_and_write": false, 00:10:45.956 "abort": true, 00:10:45.956 "seek_hole": false, 00:10:45.956 "seek_data": false, 00:10:45.956 "copy": true, 00:10:45.956 "nvme_iov_md": false 00:10:45.956 }, 00:10:45.956 "memory_domains": [ 00:10:45.956 { 00:10:45.956 "dma_device_id": "system", 00:10:45.956 "dma_device_type": 1 00:10:45.956 }, 00:10:45.956 { 00:10:45.956 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:10:45.956 "dma_device_type": 2 00:10:45.956 } 00:10:45.957 ], 00:10:45.957 "driver_specific": {} 00:10:45.957 } 00:10:45.957 ] 00:10:45.957 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:45.957 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:45.957 22:39:30 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_error_create Dev_1 00:10:45.957 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:45.957 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:45.957 true 00:10:45.957 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:45.957 22:39:30 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:10:45.957 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:45.957 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:45.957 Dev_2 00:10:45.957 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:45.957 22:39:30 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # waitforbdev Dev_2 00:10:45.957 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:10:45.957 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:45.957 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:45.957 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:45.957 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:45.957 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:45.957 22:39:30 blockdev_general.bdev_error -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:10:45.957 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:45.957 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:45.957 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:45.957 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:45.957 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:45.957 [ 00:10:45.957 { 00:10:45.957 "name": "Dev_2", 00:10:45.957 "aliases": [ 00:10:45.957 "6477aa77-70b9-459a-8038-3ea3f5035bfd" 00:10:45.957 ], 00:10:45.957 "product_name": "Malloc disk", 00:10:45.957 "block_size": 512, 00:10:45.957 "num_blocks": 262144, 00:10:45.957 "uuid": "6477aa77-70b9-459a-8038-3ea3f5035bfd", 00:10:45.957 "assigned_rate_limits": { 00:10:45.957 "rw_ios_per_sec": 0, 00:10:45.957 "rw_mbytes_per_sec": 0, 00:10:45.957 "r_mbytes_per_sec": 0, 00:10:45.957 "w_mbytes_per_sec": 0 00:10:45.957 }, 00:10:45.957 "claimed": false, 00:10:45.957 "zoned": false, 00:10:45.957 "supported_io_types": { 00:10:45.957 "read": true, 00:10:45.957 "write": true, 00:10:45.957 "unmap": true, 00:10:45.957 "flush": true, 00:10:45.957 "reset": true, 00:10:45.957 "nvme_admin": false, 00:10:45.957 "nvme_io": false, 00:10:45.957 "nvme_io_md": false, 00:10:45.957 "write_zeroes": true, 00:10:45.957 "zcopy": true, 00:10:45.957 "get_zone_info": false, 00:10:45.957 "zone_management": false, 00:10:45.957 "zone_append": false, 00:10:45.957 "compare": false, 00:10:45.957 "compare_and_write": false, 00:10:45.957 "abort": true, 00:10:45.957 "seek_hole": false, 00:10:45.957 "seek_data": false, 00:10:45.957 "copy": true, 00:10:45.957 "nvme_iov_md": false 00:10:45.957 }, 00:10:45.957 "memory_domains": [ 00:10:45.957 { 00:10:45.957 "dma_device_id": "system", 00:10:45.957 "dma_device_type": 1 00:10:45.957 }, 00:10:45.957 { 
00:10:45.957 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:45.957 "dma_device_type": 2 00:10:45.957 } 00:10:45.957 ], 00:10:45.957 "driver_specific": {} 00:10:45.957 } 00:10:45.957 ] 00:10:45.957 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:45.957 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:45.957 22:39:30 blockdev_general.bdev_error -- bdev/blockdev.sh@481 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:45.957 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:45.957 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:45.957 22:39:30 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:45.957 22:39:30 blockdev_general.bdev_error -- bdev/blockdev.sh@484 -- # sleep 1 00:10:45.957 22:39:30 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:10:45.957 Running I/O for 5 seconds... 00:10:46.893 22:39:31 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # kill -0 2693042 00:10:46.893 22:39:31 blockdev_general.bdev_error -- bdev/blockdev.sh@488 -- # echo 'Process is existed as continue on error is set. Pid: 2693042' 00:10:46.893 Process is existed as continue on error is set. 
Pid: 2693042 00:10:46.893 22:39:31 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:10:46.893 22:39:31 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:46.893 22:39:31 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:46.893 22:39:31 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:46.893 22:39:31 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # rpc_cmd bdev_malloc_delete Dev_1 00:10:46.893 22:39:31 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:46.893 22:39:31 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:46.893 22:39:31 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:46.893 22:39:31 blockdev_general.bdev_error -- bdev/blockdev.sh@497 -- # sleep 5 00:10:46.893 Timeout while waiting for response: 00:10:46.893 00:10:46.893 00:10:51.127 00:10:51.127 Latency(us) 00:10:51.127 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:51.127 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:51.127 EE_Dev_1 : 0.92 29007.65 113.31 5.42 0.00 546.96 165.62 876.19 00:10:51.127 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:51.127 Dev_2 : 5.00 62468.41 244.02 0.00 0.00 251.61 85.93 31229.33 00:10:51.127 =================================================================================================================== 00:10:51.127 Total : 91476.06 357.33 5.42 0.00 274.91 85.93 31229.33 00:10:52.064 22:39:36 blockdev_general.bdev_error -- bdev/blockdev.sh@499 -- # killprocess 2693042 00:10:52.064 22:39:36 blockdev_general.bdev_error -- common/autotest_common.sh@948 -- # '[' -z 2693042 ']' 00:10:52.064 22:39:36 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # kill -0 2693042 00:10:52.064 22:39:36 
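The bdev_error result table above reflects the injected errors: multiplying the EE_Dev_1 row's Fail/s by its runtime recovers the five failures requested with `-n 5`, and the Total row's IOPS is the sum of the two device rows. A sketch with the figures copied from the table:

```python
# The EE_Dev_1 row above reports 5.42 failures/s over a 0.92 s runtime;
# multiplying back recovers the five failures injected with "-n 5".
fails_per_s, runtime_s = 5.42, 0.92
print(round(fails_per_s * runtime_s))  # 5

# The Total row's IOPS is the sum of the EE_Dev_1 and Dev_2 rows.
print(round(29007.65 + 62468.41, 2))   # 91476.06, matching the table
```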
blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # uname 00:10:52.064 22:39:36 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:52.064 22:39:36 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2693042 00:10:52.064 22:39:36 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:10:52.064 22:39:36 blockdev_general.bdev_error -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:10:52.064 22:39:36 blockdev_general.bdev_error -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2693042' 00:10:52.064 killing process with pid 2693042 00:10:52.064 22:39:36 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # kill 2693042 00:10:52.064 Received shutdown signal, test time was about 5.000000 seconds 00:10:52.064 00:10:52.064 Latency(us) 00:10:52.065 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:52.065 =================================================================================================================== 00:10:52.065 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:52.065 22:39:36 blockdev_general.bdev_error -- common/autotest_common.sh@972 -- # wait 2693042 00:10:52.324 22:39:37 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # ERR_PID=2694083 00:10:52.324 22:39:37 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # echo 'Process error testing pid: 2694083' 00:10:52.324 22:39:37 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:10:52.324 Process error testing pid: 2694083 00:10:52.324 22:39:37 blockdev_general.bdev_error -- bdev/blockdev.sh@505 -- # waitforlisten 2694083 00:10:52.324 22:39:37 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 2694083 ']' 00:10:52.324 22:39:37 
blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:52.324 22:39:37 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:52.324 22:39:37 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:52.324 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:52.324 22:39:37 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:52.324 22:39:37 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:52.324 [2024-07-15 22:39:37.171791] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:10:52.324 [2024-07-15 22:39:37.171862] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2694083 ] 00:10:52.584 [2024-07-15 22:39:37.306281] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:52.584 [2024-07-15 22:39:37.425493] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:10:53.523 22:39:38 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:53.523 Dev_1 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:53.523 22:39:38 
blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # waitforbdev Dev_1 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:53.523 [ 00:10:53.523 { 00:10:53.523 "name": "Dev_1", 00:10:53.523 "aliases": [ 00:10:53.523 "7cf1acd8-5ca1-4d67-bde6-24f79e2e9561" 00:10:53.523 ], 00:10:53.523 "product_name": "Malloc disk", 00:10:53.523 "block_size": 512, 00:10:53.523 "num_blocks": 262144, 00:10:53.523 "uuid": "7cf1acd8-5ca1-4d67-bde6-24f79e2e9561", 00:10:53.523 "assigned_rate_limits": { 00:10:53.523 "rw_ios_per_sec": 0, 00:10:53.523 "rw_mbytes_per_sec": 0, 00:10:53.523 "r_mbytes_per_sec": 0, 00:10:53.523 "w_mbytes_per_sec": 0 00:10:53.523 }, 00:10:53.523 "claimed": false, 00:10:53.523 "zoned": false, 00:10:53.523 "supported_io_types": { 00:10:53.523 "read": true, 00:10:53.523 
"write": true, 00:10:53.523 "unmap": true, 00:10:53.523 "flush": true, 00:10:53.523 "reset": true, 00:10:53.523 "nvme_admin": false, 00:10:53.523 "nvme_io": false, 00:10:53.523 "nvme_io_md": false, 00:10:53.523 "write_zeroes": true, 00:10:53.523 "zcopy": true, 00:10:53.523 "get_zone_info": false, 00:10:53.523 "zone_management": false, 00:10:53.523 "zone_append": false, 00:10:53.523 "compare": false, 00:10:53.523 "compare_and_write": false, 00:10:53.523 "abort": true, 00:10:53.523 "seek_hole": false, 00:10:53.523 "seek_data": false, 00:10:53.523 "copy": true, 00:10:53.523 "nvme_iov_md": false 00:10:53.523 }, 00:10:53.523 "memory_domains": [ 00:10:53.523 { 00:10:53.523 "dma_device_id": "system", 00:10:53.523 "dma_device_type": 1 00:10:53.523 }, 00:10:53.523 { 00:10:53.523 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:53.523 "dma_device_type": 2 00:10:53.523 } 00:10:53.523 ], 00:10:53.523 "driver_specific": {} 00:10:53.523 } 00:10:53.523 ] 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:53.523 22:39:38 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_error_create Dev_1 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:53.523 true 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:53.523 22:39:38 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:53.523 Dev_2 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@587 
-- # [[ 0 == 0 ]] 00:10:53.523 22:39:38 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # waitforbdev Dev_2 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:53.523 [ 00:10:53.523 { 00:10:53.523 "name": "Dev_2", 00:10:53.523 "aliases": [ 00:10:53.523 "4249e466-8744-4c0f-861d-b5c3c1d86486" 00:10:53.523 ], 00:10:53.523 "product_name": "Malloc disk", 00:10:53.523 "block_size": 512, 00:10:53.523 "num_blocks": 262144, 00:10:53.523 "uuid": "4249e466-8744-4c0f-861d-b5c3c1d86486", 00:10:53.523 "assigned_rate_limits": { 00:10:53.523 "rw_ios_per_sec": 0, 00:10:53.523 "rw_mbytes_per_sec": 0, 00:10:53.523 "r_mbytes_per_sec": 0, 00:10:53.523 "w_mbytes_per_sec": 0 00:10:53.523 }, 00:10:53.523 "claimed": false, 00:10:53.523 "zoned": false, 00:10:53.523 "supported_io_types": { 
00:10:53.523 "read": true, 00:10:53.523 "write": true, 00:10:53.523 "unmap": true, 00:10:53.523 "flush": true, 00:10:53.523 "reset": true, 00:10:53.523 "nvme_admin": false, 00:10:53.523 "nvme_io": false, 00:10:53.523 "nvme_io_md": false, 00:10:53.523 "write_zeroes": true, 00:10:53.523 "zcopy": true, 00:10:53.523 "get_zone_info": false, 00:10:53.523 "zone_management": false, 00:10:53.523 "zone_append": false, 00:10:53.523 "compare": false, 00:10:53.523 "compare_and_write": false, 00:10:53.523 "abort": true, 00:10:53.523 "seek_hole": false, 00:10:53.523 "seek_data": false, 00:10:53.523 "copy": true, 00:10:53.523 "nvme_iov_md": false 00:10:53.523 }, 00:10:53.523 "memory_domains": [ 00:10:53.523 { 00:10:53.523 "dma_device_id": "system", 00:10:53.523 "dma_device_type": 1 00:10:53.523 }, 00:10:53.523 { 00:10:53.523 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:53.523 "dma_device_type": 2 00:10:53.523 } 00:10:53.523 ], 00:10:53.523 "driver_specific": {} 00:10:53.523 } 00:10:53.523 ] 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:10:53.523 22:39:38 blockdev_general.bdev_error -- bdev/blockdev.sh@512 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:53.523 22:39:38 blockdev_general.bdev_error -- bdev/blockdev.sh@515 -- # NOT wait 2694083 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@648 -- # local es=0 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # valid_exec_arg wait 2694083 00:10:53.523 22:39:38 blockdev_general.bdev_error -- 
bdev/blockdev.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:10:53.523 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@636 -- # local arg=wait 00:10:53.524 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:53.524 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # type -t wait 00:10:53.524 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:53.524 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # wait 2694083 00:10:53.524 Running I/O for 5 seconds... 00:10:53.524 task offset: 156280 on job bdev=EE_Dev_1 fails 00:10:53.524 00:10:53.524 Latency(us) 00:10:53.524 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:53.524 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:53.524 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:10:53.524 EE_Dev_1 : 0.00 23579.85 92.11 5359.06 0.00 461.00 164.73 819.20 00:10:53.524 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:53.524 Dev_2 : 0.00 14375.56 56.15 0.00 0.00 831.14 170.96 1538.67 00:10:53.524 =================================================================================================================== 00:10:53.524 Total : 37955.41 148.26 5359.06 0.00 661.75 164.73 1538.67 00:10:53.524 [2024-07-15 22:39:38.406025] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:53.524 request: 00:10:53.524 { 00:10:53.524 "method": "perform_tests", 00:10:53.524 "req_id": 1 00:10:53.524 } 00:10:53.524 Got JSON-RPC error response 00:10:53.524 response: 00:10:53.524 { 00:10:53.524 "code": -32603, 00:10:53.524 "message": "bdevperf failed with error Operation not permitted" 00:10:53.524 } 00:10:54.092 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # 
es=255 00:10:54.092 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:54.092 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # es=127 00:10:54.092 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # case "$es" in 00:10:54.092 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@668 -- # es=1 00:10:54.092 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:54.092 00:10:54.092 real 0m9.232s 00:10:54.092 user 0m9.505s 00:10:54.092 sys 0m1.007s 00:10:54.092 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:54.092 22:39:38 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:54.093 ************************************ 00:10:54.093 END TEST bdev_error 00:10:54.093 ************************************ 00:10:54.093 22:39:38 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:54.093 22:39:38 blockdev_general -- bdev/blockdev.sh@791 -- # run_test bdev_stat stat_test_suite '' 00:10:54.093 22:39:38 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:54.093 22:39:38 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:54.093 22:39:38 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:54.093 ************************************ 00:10:54.093 START TEST bdev_stat 00:10:54.093 ************************************ 00:10:54.093 22:39:38 blockdev_general.bdev_stat -- common/autotest_common.sh@1123 -- # stat_test_suite '' 00:10:54.093 22:39:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@592 -- # STAT_DEV=Malloc_STAT 00:10:54.093 22:39:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # STAT_PID=2694298 00:10:54.093 22:39:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # echo 'Process Bdev IO statistics testing pid: 2694298' 00:10:54.093 Process Bdev IO statistics 
testing pid: 2694298 00:10:54.093 22:39:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:10:54.093 22:39:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:10:54.093 22:39:38 blockdev_general.bdev_stat -- bdev/blockdev.sh@599 -- # waitforlisten 2694298 00:10:54.093 22:39:38 blockdev_general.bdev_stat -- common/autotest_common.sh@829 -- # '[' -z 2694298 ']' 00:10:54.093 22:39:38 blockdev_general.bdev_stat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:54.093 22:39:38 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:54.093 22:39:38 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:54.093 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:54.093 22:39:38 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:54.093 22:39:38 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:54.093 [2024-07-15 22:39:38.897007] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:10:54.093 [2024-07-15 22:39:38.897076] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2694298 ] 00:10:54.351 [2024-07-15 22:39:39.027838] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:54.351 [2024-07-15 22:39:39.134843] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:54.351 [2024-07-15 22:39:39.134848] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:55.284 22:39:39 blockdev_general.bdev_stat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:55.284 22:39:39 blockdev_general.bdev_stat -- common/autotest_common.sh@862 -- # return 0 00:10:55.284 22:39:39 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:10:55.284 22:39:39 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:55.284 22:39:39 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:55.284 Malloc_STAT 00:10:55.284 22:39:39 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:55.284 22:39:39 blockdev_general.bdev_stat -- bdev/blockdev.sh@602 -- # waitforbdev Malloc_STAT 00:10:55.284 22:39:39 blockdev_general.bdev_stat -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_STAT 00:10:55.284 22:39:39 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:55.284 22:39:39 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local i 00:10:55.284 22:39:39 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:55.284 22:39:39 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:55.284 22:39:39 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 
00:10:55.284 22:39:39 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:55.284 22:39:39 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:55.284 22:39:39 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:55.284 22:39:39 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:10:55.284 22:39:39 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:55.284 22:39:39 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:55.284 [ 00:10:55.284 { 00:10:55.284 "name": "Malloc_STAT", 00:10:55.284 "aliases": [ 00:10:55.284 "9f413e65-d1cf-4eff-af70-9d883c219568" 00:10:55.284 ], 00:10:55.284 "product_name": "Malloc disk", 00:10:55.284 "block_size": 512, 00:10:55.284 "num_blocks": 262144, 00:10:55.284 "uuid": "9f413e65-d1cf-4eff-af70-9d883c219568", 00:10:55.284 "assigned_rate_limits": { 00:10:55.284 "rw_ios_per_sec": 0, 00:10:55.284 "rw_mbytes_per_sec": 0, 00:10:55.284 "r_mbytes_per_sec": 0, 00:10:55.284 "w_mbytes_per_sec": 0 00:10:55.284 }, 00:10:55.284 "claimed": false, 00:10:55.284 "zoned": false, 00:10:55.284 "supported_io_types": { 00:10:55.284 "read": true, 00:10:55.284 "write": true, 00:10:55.284 "unmap": true, 00:10:55.284 "flush": true, 00:10:55.284 "reset": true, 00:10:55.284 "nvme_admin": false, 00:10:55.284 "nvme_io": false, 00:10:55.284 "nvme_io_md": false, 00:10:55.284 "write_zeroes": true, 00:10:55.284 "zcopy": true, 00:10:55.284 "get_zone_info": false, 00:10:55.284 "zone_management": false, 00:10:55.284 "zone_append": false, 00:10:55.284 "compare": false, 00:10:55.284 "compare_and_write": false, 00:10:55.284 "abort": true, 00:10:55.284 "seek_hole": false, 00:10:55.284 "seek_data": false, 00:10:55.284 "copy": true, 00:10:55.284 "nvme_iov_md": false 00:10:55.284 }, 00:10:55.284 "memory_domains": [ 00:10:55.284 { 00:10:55.284 "dma_device_id": "system", 
00:10:55.284 "dma_device_type": 1 00:10:55.284 }, 00:10:55.284 { 00:10:55.285 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:55.285 "dma_device_type": 2 00:10:55.285 } 00:10:55.285 ], 00:10:55.285 "driver_specific": {} 00:10:55.285 } 00:10:55.285 ] 00:10:55.285 22:39:39 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:55.285 22:39:39 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # return 0 00:10:55.285 22:39:39 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # sleep 2 00:10:55.285 22:39:39 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:55.285 Running I/O for 10 seconds... 00:10:57.233 22:39:41 blockdev_general.bdev_stat -- bdev/blockdev.sh@606 -- # stat_function_test Malloc_STAT 00:10:57.233 22:39:41 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local bdev_name=Malloc_STAT 00:10:57.233 22:39:41 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local iostats 00:10:57.233 22:39:41 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count1 00:10:57.233 22:39:41 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local io_count2 00:10:57.233 22:39:41 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local iostats_per_channel 00:10:57.233 22:39:41 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel1 00:10:57.233 22:39:41 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel2 00:10:57.233 22:39:41 blockdev_general.bdev_stat -- bdev/blockdev.sh@566 -- # local io_count_per_channel_all=0 00:10:57.233 22:39:41 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:57.233 22:39:41 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:57.233 22:39:41 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:57.233 
22:39:41 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:57.233 22:39:41 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # iostats='{ 00:10:57.233 "tick_rate": 2300000000, 00:10:57.233 "ticks": 5435460206449764, 00:10:57.233 "bdevs": [ 00:10:57.233 { 00:10:57.233 "name": "Malloc_STAT", 00:10:57.233 "bytes_read": 696300032, 00:10:57.233 "num_read_ops": 169988, 00:10:57.233 "bytes_written": 0, 00:10:57.233 "num_write_ops": 0, 00:10:57.233 "bytes_unmapped": 0, 00:10:57.233 "num_unmap_ops": 0, 00:10:57.233 "bytes_copied": 0, 00:10:57.233 "num_copy_ops": 0, 00:10:57.233 "read_latency_ticks": 2232225793248, 00:10:57.233 "max_read_latency_ticks": 17531644, 00:10:57.233 "min_read_latency_ticks": 263488, 00:10:57.233 "write_latency_ticks": 0, 00:10:57.233 "max_write_latency_ticks": 0, 00:10:57.233 "min_write_latency_ticks": 0, 00:10:57.233 "unmap_latency_ticks": 0, 00:10:57.233 "max_unmap_latency_ticks": 0, 00:10:57.233 "min_unmap_latency_ticks": 0, 00:10:57.233 "copy_latency_ticks": 0, 00:10:57.233 "max_copy_latency_ticks": 0, 00:10:57.233 "min_copy_latency_ticks": 0, 00:10:57.233 "io_error": {} 00:10:57.233 } 00:10:57.233 ] 00:10:57.233 }' 00:10:57.233 22:39:41 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # jq -r '.bdevs[0].num_read_ops' 00:10:57.233 22:39:41 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # io_count1=169988 00:10:57.233 22:39:41 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:10:57.233 22:39:41 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:57.233 22:39:41 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:57.233 22:39:42 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:57.233 22:39:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # iostats_per_channel='{ 00:10:57.233 "tick_rate": 2300000000, 00:10:57.233 "ticks": 5435460375478332, 
00:10:57.233 "name": "Malloc_STAT", 00:10:57.233 "channels": [ 00:10:57.233 { 00:10:57.233 "thread_id": 2, 00:10:57.233 "bytes_read": 397410304, 00:10:57.233 "num_read_ops": 97024, 00:10:57.233 "bytes_written": 0, 00:10:57.233 "num_write_ops": 0, 00:10:57.233 "bytes_unmapped": 0, 00:10:57.233 "num_unmap_ops": 0, 00:10:57.233 "bytes_copied": 0, 00:10:57.233 "num_copy_ops": 0, 00:10:57.233 "read_latency_ticks": 1158458354994, 00:10:57.233 "max_read_latency_ticks": 12743280, 00:10:57.233 "min_read_latency_ticks": 8398938, 00:10:57.233 "write_latency_ticks": 0, 00:10:57.233 "max_write_latency_ticks": 0, 00:10:57.233 "min_write_latency_ticks": 0, 00:10:57.233 "unmap_latency_ticks": 0, 00:10:57.233 "max_unmap_latency_ticks": 0, 00:10:57.233 "min_unmap_latency_ticks": 0, 00:10:57.233 "copy_latency_ticks": 0, 00:10:57.233 "max_copy_latency_ticks": 0, 00:10:57.233 "min_copy_latency_ticks": 0 00:10:57.233 }, 00:10:57.233 { 00:10:57.233 "thread_id": 3, 00:10:57.233 "bytes_read": 326107136, 00:10:57.233 "num_read_ops": 79616, 00:10:57.233 "bytes_written": 0, 00:10:57.233 "num_write_ops": 0, 00:10:57.233 "bytes_unmapped": 0, 00:10:57.233 "num_unmap_ops": 0, 00:10:57.233 "bytes_copied": 0, 00:10:57.233 "num_copy_ops": 0, 00:10:57.233 "read_latency_ticks": 1161529262528, 00:10:57.233 "max_read_latency_ticks": 17531644, 00:10:57.233 "min_read_latency_ticks": 9601406, 00:10:57.233 "write_latency_ticks": 0, 00:10:57.233 "max_write_latency_ticks": 0, 00:10:57.233 "min_write_latency_ticks": 0, 00:10:57.234 "unmap_latency_ticks": 0, 00:10:57.234 "max_unmap_latency_ticks": 0, 00:10:57.234 "min_unmap_latency_ticks": 0, 00:10:57.234 "copy_latency_ticks": 0, 00:10:57.234 "max_copy_latency_ticks": 0, 00:10:57.234 "min_copy_latency_ticks": 0 00:10:57.234 } 00:10:57.234 ] 00:10:57.234 }' 00:10:57.234 22:39:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # jq -r '.channels[0].num_read_ops' 00:10:57.234 22:39:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # 
io_count_per_channel1=97024 00:10:57.234 22:39:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel_all=97024 00:10:57.234 22:39:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # jq -r '.channels[1].num_read_ops' 00:10:57.234 22:39:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel2=79616 00:10:57.234 22:39:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@575 -- # io_count_per_channel_all=176640 00:10:57.234 22:39:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:57.234 22:39:42 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:57.234 22:39:42 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:57.234 22:39:42 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:57.234 22:39:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # iostats='{ 00:10:57.234 "tick_rate": 2300000000, 00:10:57.234 "ticks": 5435460652997938, 00:10:57.234 "bdevs": [ 00:10:57.234 { 00:10:57.234 "name": "Malloc_STAT", 00:10:57.234 "bytes_read": 767603200, 00:10:57.234 "num_read_ops": 187396, 00:10:57.234 "bytes_written": 0, 00:10:57.234 "num_write_ops": 0, 00:10:57.234 "bytes_unmapped": 0, 00:10:57.234 "num_unmap_ops": 0, 00:10:57.234 "bytes_copied": 0, 00:10:57.234 "num_copy_ops": 0, 00:10:57.234 "read_latency_ticks": 2461655408350, 00:10:57.234 "max_read_latency_ticks": 17531644, 00:10:57.234 "min_read_latency_ticks": 263488, 00:10:57.234 "write_latency_ticks": 0, 00:10:57.234 "max_write_latency_ticks": 0, 00:10:57.234 "min_write_latency_ticks": 0, 00:10:57.234 "unmap_latency_ticks": 0, 00:10:57.234 "max_unmap_latency_ticks": 0, 00:10:57.234 "min_unmap_latency_ticks": 0, 00:10:57.234 "copy_latency_ticks": 0, 00:10:57.234 "max_copy_latency_ticks": 0, 00:10:57.234 "min_copy_latency_ticks": 0, 00:10:57.234 "io_error": {} 00:10:57.234 } 00:10:57.234 ] 00:10:57.234 }' 00:10:57.234 
22:39:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # jq -r '.bdevs[0].num_read_ops' 00:10:57.493 22:39:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # io_count2=187396 00:10:57.493 22:39:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 176640 -lt 169988 ']' 00:10:57.493 22:39:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 176640 -gt 187396 ']' 00:10:57.493 22:39:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:10:57.493 22:39:42 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:57.493 22:39:42 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:57.493 00:10:57.493 Latency(us) 00:10:57.493 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:57.493 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:10:57.493 Malloc_STAT : 2.17 49203.63 192.20 0.00 0.00 5190.62 1410.45 5556.31 00:10:57.493 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:57.493 Malloc_STAT : 2.18 40326.37 157.52 0.00 0.00 6332.25 1246.61 7636.37 00:10:57.493 =================================================================================================================== 00:10:57.493 Total : 89529.99 349.73 0.00 0.00 5705.18 1246.61 7636.37 00:10:57.493 0 00:10:57.493 22:39:42 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:57.493 22:39:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # killprocess 2694298 00:10:57.493 22:39:42 blockdev_general.bdev_stat -- common/autotest_common.sh@948 -- # '[' -z 2694298 ']' 00:10:57.493 22:39:42 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # kill -0 2694298 00:10:57.493 22:39:42 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # uname 00:10:57.493 22:39:42 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # '[' Linux = 
Linux ']' 00:10:57.493 22:39:42 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2694298 00:10:57.493 22:39:42 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:57.493 22:39:42 blockdev_general.bdev_stat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:57.493 22:39:42 blockdev_general.bdev_stat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2694298' 00:10:57.493 killing process with pid 2694298 00:10:57.493 22:39:42 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # kill 2694298 00:10:57.493 Received shutdown signal, test time was about 2.256706 seconds 00:10:57.493 00:10:57.493 Latency(us) 00:10:57.493 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:57.493 =================================================================================================================== 00:10:57.493 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:57.493 22:39:42 blockdev_general.bdev_stat -- common/autotest_common.sh@972 -- # wait 2694298 00:10:57.752 22:39:42 blockdev_general.bdev_stat -- bdev/blockdev.sh@610 -- # trap - SIGINT SIGTERM EXIT 00:10:57.752 00:10:57.752 real 0m3.668s 00:10:57.752 user 0m7.316s 00:10:57.752 sys 0m0.485s 00:10:57.752 22:39:42 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:57.752 22:39:42 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:57.752 ************************************ 00:10:57.752 END TEST bdev_stat 00:10:57.752 ************************************ 00:10:57.752 22:39:42 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:57.752 22:39:42 blockdev_general -- bdev/blockdev.sh@794 -- # [[ bdev == gpt ]] 00:10:57.752 22:39:42 blockdev_general -- bdev/blockdev.sh@798 -- # [[ bdev == crypto_sw ]] 00:10:57.752 22:39:42 blockdev_general -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 
00:10:57.752 22:39:42 blockdev_general -- bdev/blockdev.sh@811 -- # cleanup 00:10:57.752 22:39:42 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:10:57.752 22:39:42 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:10:57.752 22:39:42 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:10:57.752 22:39:42 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:10:57.752 22:39:42 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:10:57.752 22:39:42 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:10:57.752 00:10:57.752 real 1m58.120s 00:10:57.752 user 7m12.588s 00:10:57.752 sys 0m24.175s 00:10:57.752 22:39:42 blockdev_general -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:57.752 22:39:42 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:57.752 ************************************ 00:10:57.752 END TEST blockdev_general 00:10:57.752 ************************************ 00:10:57.752 22:39:42 -- common/autotest_common.sh@1142 -- # return 0 00:10:57.752 22:39:42 -- spdk/autotest.sh@190 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:57.752 22:39:42 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:57.752 22:39:42 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:57.752 22:39:42 -- common/autotest_common.sh@10 -- # set +x 00:10:57.752 ************************************ 00:10:57.752 START TEST bdev_raid 00:10:57.752 ************************************ 00:10:57.752 22:39:42 bdev_raid -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:58.011 * Looking for test storage... 
00:10:58.011 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:10:58.011 22:39:42 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:10:58.011 22:39:42 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:10:58.012 22:39:42 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:10:58.012 22:39:42 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:10:58.012 22:39:42 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:10:58.012 22:39:42 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:10:58.012 22:39:42 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:10:58.012 22:39:42 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:10:58.012 22:39:42 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:10:58.012 22:39:42 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 00:10:58.012 22:39:42 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:10:58.012 22:39:42 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:10:58.012 22:39:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:58.012 22:39:42 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:58.012 22:39:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:58.012 ************************************ 00:10:58.012 START TEST raid_function_test_raid0 00:10:58.012 ************************************ 00:10:58.012 22:39:42 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1123 -- # raid_function_test raid0 00:10:58.012 22:39:42 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:10:58.012 22:39:42 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:10:58.012 22:39:42 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:10:58.012 22:39:42 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:58.012 22:39:42 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=2694912 00:10:58.012 22:39:42 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 2694912' 00:10:58.012 Process raid pid: 2694912 00:10:58.012 22:39:42 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 2694912 /var/tmp/spdk-raid.sock 00:10:58.012 22:39:42 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@829 -- # '[' -z 2694912 ']' 00:10:58.012 22:39:42 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:58.012 22:39:42 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:58.012 22:39:42 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:58.012 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:58.012 22:39:42 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:58.012 22:39:42 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:10:58.012 [2024-07-15 22:39:42.873495] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:10:58.012 [2024-07-15 22:39:42.873567] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:58.271 [2024-07-15 22:39:42.993745] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:58.271 [2024-07-15 22:39:43.097767] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:58.271 [2024-07-15 22:39:43.158236] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:58.271 [2024-07-15 22:39:43.158268] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:59.209 22:39:43 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:59.209 22:39:43 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@862 -- # return 0 00:10:59.209 22:39:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:10:59.209 22:39:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:10:59.209 22:39:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:59.209 22:39:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:10:59.209 22:39:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:10:59.209 [2024-07-15 22:39:44.077893] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:10:59.209 [2024-07-15 22:39:44.079388] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:10:59.209 [2024-07-15 22:39:44.079447] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x195ebd0 00:10:59.209 [2024-07-15 22:39:44.079458] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:59.209 [2024-07-15 22:39:44.079642] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x195eb10 00:10:59.209 [2024-07-15 22:39:44.079762] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x195ebd0 00:10:59.209 [2024-07-15 22:39:44.079772] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x195ebd0 00:10:59.209 [2024-07-15 22:39:44.079871] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:59.209 Base_1 00:10:59.209 Base_2 00:10:59.209 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:10:59.209 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:10:59.209 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:10:59.469 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:10:59.469 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:10:59.469 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:10:59.469 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:59.469 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:10:59.469 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:59.469 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:10:59.469 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local 
nbd_list 00:10:59.469 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:10:59.469 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:59.469 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:59.469 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:10:59.728 [2024-07-15 22:39:44.579239] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b128e0 00:10:59.728 /dev/nbd0 00:10:59.728 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:59.728 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:59.728 22:39:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:10:59.728 22:39:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local i 00:10:59.728 22:39:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:59.728 22:39:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:59.728 22:39:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:10:59.728 22:39:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # break 00:10:59.728 22:39:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:59.728 22:39:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:59.728 22:39:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:59.728 1+0 records in 00:10:59.728 1+0 
records out 00:10:59.728 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252963 s, 16.2 MB/s 00:10:59.728 22:39:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:59.728 22:39:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # size=4096 00:10:59.728 22:39:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:59.987 22:39:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:59.987 22:39:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # return 0 00:10:59.987 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:59.987 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:10:59.987 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:10:59.987 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:10:59.987 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:10:59.987 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:59.987 { 00:10:59.987 "nbd_device": "/dev/nbd0", 00:10:59.987 "bdev_name": "raid" 00:10:59.988 } 00:10:59.988 ]' 00:10:59.988 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:59.988 { 00:10:59.988 "nbd_device": "/dev/nbd0", 00:10:59.988 "bdev_name": "raid" 00:10:59.988 } 00:10:59.988 ]' 00:10:59.988 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- 
bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:11:00.247 22:39:44 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:11:00.247 4096+0 records in 00:11:00.247 4096+0 records out 00:11:00.247 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0301199 s, 69.6 MB/s 00:11:00.247 22:39:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:11:00.506 4096+0 records in 00:11:00.506 4096+0 records out 00:11:00.506 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.325104 s, 6.5 MB/s 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:11:00.507 128+0 records in 00:11:00.507 128+0 records out 00:11:00.507 65536 
bytes (66 kB, 64 KiB) copied, 0.000837208 s, 78.3 MB/s 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:11:00.507 2035+0 records in 00:11:00.507 2035+0 records out 00:11:00.507 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0117197 s, 88.9 MB/s 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- 
bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:11:00.507 456+0 records in 00:11:00.507 456+0 records out 00:11:00.507 233472 bytes (233 kB, 228 KiB) copied, 0.00275849 s, 84.6 MB/s 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:00.507 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:11:00.767 [2024-07-15 22:39:45.668030] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:01.027 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # 
basename /dev/nbd0 00:11:01.027 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:11:01.027 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:11:01.027 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:01.027 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:01.027 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:11:01.027 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:11:01.027 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:11:01.027 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:01.027 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:01.027 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:01.027 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:11:01.027 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:11:01.027 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:01.027 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:11:01.027 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:11:01.027 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:01.287 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:11:01.287 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 
00:11:01.287 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:11:01.287 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:11:01.287 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:11:01.287 22:39:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 2694912 00:11:01.287 22:39:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@948 -- # '[' -z 2694912 ']' 00:11:01.287 22:39:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # kill -0 2694912 00:11:01.287 22:39:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # uname 00:11:01.287 22:39:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:01.287 22:39:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2694912 00:11:01.287 22:39:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:01.287 22:39:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:01.287 22:39:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2694912' 00:11:01.287 killing process with pid 2694912 00:11:01.287 22:39:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # kill 2694912 00:11:01.287 [2024-07-15 22:39:45.990684] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:01.287 [2024-07-15 22:39:45.990751] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:01.287 [2024-07-15 22:39:45.990795] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:01.287 [2024-07-15 22:39:45.990810] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x195ebd0 name 
raid, state offline 00:11:01.287 22:39:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@972 -- # wait 2694912 00:11:01.287 [2024-07-15 22:39:46.008767] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:01.547 22:39:46 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:11:01.547 00:11:01.547 real 0m3.408s 00:11:01.547 user 0m4.430s 00:11:01.547 sys 0m1.280s 00:11:01.547 22:39:46 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:01.547 22:39:46 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:11:01.547 ************************************ 00:11:01.547 END TEST raid_function_test_raid0 00:11:01.547 ************************************ 00:11:01.547 22:39:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:01.547 22:39:46 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:11:01.547 22:39:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:01.547 22:39:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:01.547 22:39:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:01.547 ************************************ 00:11:01.547 START TEST raid_function_test_concat 00:11:01.547 ************************************ 00:11:01.547 22:39:46 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1123 -- # raid_function_test concat 00:11:01.547 22:39:46 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:11:01.547 22:39:46 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:11:01.548 22:39:46 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:11:01.548 22:39:46 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=2695361 00:11:01.548 22:39:46 bdev_raid.raid_function_test_concat -- 
bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 2695361' 00:11:01.548 Process raid pid: 2695361 00:11:01.548 22:39:46 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:01.548 22:39:46 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 2695361 /var/tmp/spdk-raid.sock 00:11:01.548 22:39:46 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@829 -- # '[' -z 2695361 ']' 00:11:01.548 22:39:46 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:01.548 22:39:46 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:01.548 22:39:46 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:01.548 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:01.548 22:39:46 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:01.548 22:39:46 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:11:01.548 [2024-07-15 22:39:46.381178] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:11:01.548 [2024-07-15 22:39:46.381251] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:01.806 [2024-07-15 22:39:46.514351] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:01.806 [2024-07-15 22:39:46.616314] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:01.806 [2024-07-15 22:39:46.683291] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:01.806 [2024-07-15 22:39:46.683328] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:02.739 22:39:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:02.739 22:39:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@862 -- # return 0 00:11:02.739 22:39:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:11:02.739 22:39:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:11:02.739 22:39:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:02.739 22:39:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:11:02.739 22:39:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:11:02.739 [2024-07-15 22:39:47.576865] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:11:02.739 [2024-07-15 22:39:47.578351] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:11:02.739 [2024-07-15 22:39:47.578413] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17f9bd0 00:11:02.739 [2024-07-15 22:39:47.578424] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:02.739 [2024-07-15 22:39:47.578613] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17f9b10 00:11:02.739 [2024-07-15 22:39:47.578735] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17f9bd0 00:11:02.739 [2024-07-15 22:39:47.578746] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x17f9bd0 00:11:02.739 [2024-07-15 22:39:47.578846] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:02.739 Base_1 00:11:02.739 Base_2 00:11:02.739 22:39:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:02.739 22:39:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:11:02.739 22:39:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:11:02.998 22:39:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:11:02.998 22:39:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:11:02.998 22:39:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:11:02.998 22:39:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:02.998 22:39:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:11:02.998 22:39:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:11:02.998 22:39:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:11:02.998 22:39:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # 
local nbd_list 00:11:02.998 22:39:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:11:02.998 22:39:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:11:02.998 22:39:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:02.998 22:39:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:11:03.566 [2024-07-15 22:39:48.334905] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19ad8e0 00:11:03.566 /dev/nbd0 00:11:03.566 22:39:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:11:03.566 22:39:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:11:03.566 22:39:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:11:03.566 22:39:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local i 00:11:03.566 22:39:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:03.566 22:39:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:03.566 22:39:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:11:03.566 22:39:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # break 00:11:03.566 22:39:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:03.566 22:39:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:03.566 22:39:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:03.566 1+0 records in 
00:11:03.566 1+0 records out 00:11:03.566 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000266385 s, 15.4 MB/s 00:11:03.566 22:39:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:03.566 22:39:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # size=4096 00:11:03.566 22:39:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:03.566 22:39:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:03.566 22:39:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # return 0 00:11:03.566 22:39:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:03.566 22:39:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:03.566 22:39:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:03.567 22:39:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:03.567 22:39:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:11:03.825 { 00:11:03.825 "nbd_device": "/dev/nbd0", 00:11:03.825 "bdev_name": "raid" 00:11:03.825 } 00:11:03.825 ]' 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:11:03.825 { 00:11:03.825 "nbd_device": "/dev/nbd0", 00:11:03.825 "bdev_name": "raid" 00:11:03.825 } 00:11:03.825 ]' 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:03.825 22:39:48 
bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # 
unmap_blk_offs=('0' '1028' '321') 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:11:03.825 22:39:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:11:03.826 22:39:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:11:03.826 4096+0 records in 00:11:03.826 4096+0 records out 00:11:03.826 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0312171 s, 67.2 MB/s 00:11:03.826 22:39:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:11:04.392 4096+0 records in 00:11:04.392 4096+0 records out 00:11:04.392 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.323696 s, 6.5 MB/s 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:11:04.392 
128+0 records in 00:11:04.392 128+0 records out 00:11:04.392 65536 bytes (66 kB, 64 KiB) copied, 0.000836049 s, 78.4 MB/s 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:11:04.392 2035+0 records in 00:11:04.392 2035+0 records out 00:11:04.392 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0118773 s, 87.7 MB/s 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # 
unmap_len=233472 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:11:04.392 456+0 records in 00:11:04.392 456+0 records out 00:11:04.392 233472 bytes (233 kB, 228 KiB) copied, 0.0027265 s, 85.6 MB/s 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:04.392 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:11:04.650 [2024-07-15 22:39:49.391781] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
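The three loop iterations traced above all exercise the same unmap-and-verify pattern from bdev_raid.sh: zero a block range in the expected-data file with dd, discard the same range on the NBD device with blkdiscard, flush with blockdev --flushbufs, then cmp the full 2 MiB region. A minimal file-only sketch of that pattern follows; the temp-file paths are illustrative, and plain zero-fill stands in for blkdiscard plus flush since no real block device is assumed:

```shell
#!/usr/bin/env bash
# Sketch of the unmap-verify loop from the log. Regular files stand in for
# /raidtest/raidrandtest (expected data) and /dev/nbd0 (device contents).
set -e
blksize=512
src=$(mktemp)   # expected-data file
dev=$(mktemp)   # stand-in for the NBD device

dd if=/dev/urandom of="$src" bs=$blksize count=4096 status=none
cp "$src" "$dev"    # initial write of the random pattern to the "device"

unmap_blk_offs=(0 1028 321)
unmap_blk_nums=(128 2035 456)
for i in 0 1 2; do
    off=${unmap_blk_offs[$i]}
    num=${unmap_blk_nums[$i]}
    # zero the range in the expected-data file (the test's dd if=/dev/zero) ...
    dd if=/dev/zero of="$src" bs=$blksize seek=$off count=$num conv=notrunc status=none
    # ... and zero the same range on the "device" (blkdiscard on the real NBD)
    dd if=/dev/zero of="$dev" bs=$blksize seek=$off count=$num conv=notrunc status=none
    # byte-compare the whole region, as the test's cmp -b -n 2097152 does
    cmp -n $((4096 * blksize)) "$src" "$dev"
done
echo "unmap verify: OK"
rm -f "$src" "$dev"
```

cmp exits non-zero on the first mismatch, so under set -e any discarded block that does not read back as zeros on the device fails the loop immediately, which is exactly how the real test detects a broken unmap path.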
00:11:04.650 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:11:04.650 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:11:04.650 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:11:04.650 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:11:04.650 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:11:04.650 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:11:04.650 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break
00:11:04.650 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0
00:11:04.650 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock
00:11:04.650 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:11:04.650 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks
00:11:04.909 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:11:04.909 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]'
00:11:04.909 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:11:04.909 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:11:04.909 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:11:04.909 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo ''
00:11:04.909 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # true
00:11:04.909 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0
00:11:04.909 22:39:49 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0
00:11:04.909 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0
00:11:04.909 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']'
00:11:04.909 22:39:49 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 2695361
00:11:04.909 22:39:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@948 -- # '[' -z 2695361 ']'
00:11:04.909 22:39:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # kill -0 2695361
00:11:04.909 22:39:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # uname
00:11:04.909 22:39:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:11:04.909 22:39:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2695361
00:11:04.909 22:39:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:11:04.909 22:39:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:11:04.909 22:39:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2695361'
killing process with pid 2695361
22:39:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # kill 2695361
00:11:04.909 [2024-07-15 22:39:49.772067] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:11:04.909 [2024-07-15 22:39:49.772135] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:11:04.909 [2024-07-15 22:39:49.772178] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
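The killprocess steps traced above (a kill -0 liveness check, a ps comm lookup, an echo of the pid, then the actual kill) can be condensed into a small function. This is a hedged sketch of the pattern only, not the exact autotest_common.sh helper:

```shell
# Sketch of the killprocess pattern from the trace: verify the pid is alive,
# look up its command name, and only go through sudo when the target is sudo.
killprocess_sketch() {
    local pid=$1 process_name
    [ -n "$pid" ] || return 1
    kill -0 "$pid" 2>/dev/null || return 1            # is the process alive?
    process_name=$(ps --no-headers -o comm= "$pid")   # e.g. reactor_0
    echo "killing process with pid $pid"
    if [ "$process_name" = sudo ]; then
        sudo kill "$pid"    # target was launched through sudo
    else
        kill "$pid"
    fi
}
```

In the log the comm name resolves to reactor_0 (the SPDK event-framework thread), so the plain kill branch is taken, followed by wait on the pid so the raid cleanup debug messages can drain.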
00:11:04.909 [2024-07-15 22:39:49.772193] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17f9bd0 name raid, state offline
00:11:04.909 22:39:49 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@972 -- # wait 2695361
00:11:04.909 [2024-07-15 22:39:49.789292] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:11:05.167 22:39:50 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0
00:11:05.167
00:11:05.167 real 0m3.694s
00:11:05.167 user 0m4.977s
00:11:05.167 sys 0m1.314s
00:11:05.167 22:39:50 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1124 -- # xtrace_disable
00:11:05.167 22:39:50 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x
00:11:05.167 ************************************
00:11:05.167 END TEST raid_function_test_concat
00:11:05.167 ************************************
00:11:05.167 22:39:50 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:11:05.167 22:39:50 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test
00:11:05.167 22:39:50 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:11:05.167 22:39:50 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:11:05.167 22:39:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:11:05.424 ************************************
00:11:05.424 START TEST raid0_resize_test
00:11:05.424 ************************************
00:11:05.424 22:39:50 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1123 -- # raid0_resize_test
00:11:05.424 22:39:50 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512
00:11:05.424 22:39:50 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32
00:11:05.424 22:39:50 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64
00:11:05.424 22:39:50 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local blkcnt
00:11:05.424 22:39:50 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local raid_size_mb
00:11:05.424 22:39:50 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb
00:11:05.424 22:39:50 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=2695961
00:11:05.424 22:39:50 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 2695961'
Process raid pid: 2695961
22:39:50 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid
22:39:50 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 2695961 /var/tmp/spdk-raid.sock
22:39:50 bdev_raid.raid0_resize_test -- common/autotest_common.sh@829 -- # '[' -z 2695961 ']'
22:39:50 bdev_raid.raid0_resize_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
22:39:50 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local max_retries=100
22:39:50 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
22:39:50 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # xtrace_disable
22:39:50 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x
00:11:05.424 [2024-07-15 22:39:50.161630] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
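The block counts that raid0_resize_test asserts later follow directly from the sizes set above (blksize=512, bdev_size_mb=32, new_bdev_size_mb=64): each 32 MiB null bdev contributes 65536 blocks, so the two-member raid0 starts at 131072 blocks and grows to 262144 once both bases are resized to 64 MiB. A quick sanity check of that arithmetic:

```shell
# Block-count arithmetic behind raid0_resize_test's checks in the log.
blksize=512          # bytes per block (local blksize=512 above)
bdev_size_mb=32      # initial size of each null base bdev
new_bdev_size_mb=64  # size after bdev_null_resize
num_base=2           # raid0 over Base_1 and Base_2

mb_to_blocks() { echo $(( $1 * 1024 * 1024 / blksize )); }

echo "base bdev blocks:        $(mb_to_blocks $bdev_size_mb)"                        # 65536
echo "raid0 blocks at create:  $(( $(mb_to_blocks $bdev_size_mb) * num_base ))"      # 131072
echo "raid0 blocks after grow: $(( $(mb_to_blocks $new_bdev_size_mb) * num_base ))"  # 262144
```

These are the same figures the log reports: "old size 65536, new size 131072" per base bdev, and "block count was changed from 131072 to 262144" for the raid bdev.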
00:11:05.424 [2024-07-15 22:39:50.161700] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:05.424 [2024-07-15 22:39:50.292419] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:05.739 [2024-07-15 22:39:50.397870] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:05.739 [2024-07-15 22:39:50.457712] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:05.739 [2024-07-15 22:39:50.457742] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:06.313 22:39:51 bdev_raid.raid0_resize_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:06.313 22:39:51 bdev_raid.raid0_resize_test -- common/autotest_common.sh@862 -- # return 0 00:11:06.313 22:39:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:11:06.571 Base_1 00:11:06.571 22:39:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:11:06.828 Base_2 00:11:06.828 22:39:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:11:07.086 [2024-07-15 22:39:51.819469] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:11:07.086 [2024-07-15 22:39:51.820739] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:11:07.086 [2024-07-15 22:39:51.820790] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x137f780 00:11:07.086 [2024-07-15 22:39:51.820800] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:07.087 [2024-07-15 22:39:51.821000] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xecb020 00:11:07.087 [2024-07-15 22:39:51.821089] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x137f780 00:11:07.087 [2024-07-15 22:39:51.821099] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x137f780 00:11:07.087 [2024-07-15 22:39:51.821195] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:07.087 22:39:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:11:07.345 [2024-07-15 22:39:52.064111] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:11:07.345 [2024-07-15 22:39:52.064131] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:11:07.345 true 00:11:07.345 22:39:52 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:11:07.345 22:39:52 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:11:07.345 [2024-07-15 22:39:52.244734] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:07.603 22:39:52 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # blkcnt=131072 00:11:07.603 22:39:52 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:11:07.603 22:39:52 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:11:07.603 22:39:52 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:11:07.862 
[2024-07-15 22:39:52.753912] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:11:07.862 [2024-07-15 22:39:52.753943] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:11:07.862 [2024-07-15 22:39:52.753970] bdev_raid.c:2289:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:11:07.862 true 00:11:08.120 22:39:52 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:11:08.120 22:39:52 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:11:08.120 [2024-07-15 22:39:53.010757] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:08.379 22:39:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:11:08.379 22:39:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:11:08.379 22:39:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:11:08.379 22:39:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 2695961 00:11:08.379 22:39:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@948 -- # '[' -z 2695961 ']' 00:11:08.379 22:39:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # kill -0 2695961 00:11:08.379 22:39:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # uname 00:11:08.379 22:39:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:08.379 22:39:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2695961 00:11:08.379 22:39:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:08.379 22:39:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:11:08.379 22:39:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2695961' 00:11:08.379 killing process with pid 2695961 00:11:08.379 22:39:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # kill 2695961 00:11:08.379 [2024-07-15 22:39:53.080858] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:08.379 [2024-07-15 22:39:53.080915] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:08.379 [2024-07-15 22:39:53.080963] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:08.379 [2024-07-15 22:39:53.080976] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x137f780 name Raid, state offline 00:11:08.379 22:39:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@972 -- # wait 2695961 00:11:08.379 [2024-07-15 22:39:53.082354] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:08.638 22:39:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 00:11:08.638 00:11:08.638 real 0m3.194s 00:11:08.638 user 0m4.971s 00:11:08.638 sys 0m0.683s 00:11:08.638 22:39:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:08.638 22:39:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:11:08.638 ************************************ 00:11:08.638 END TEST raid0_resize_test 00:11:08.638 ************************************ 00:11:08.638 22:39:53 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:08.638 22:39:53 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:11:08.638 22:39:53 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:11:08.638 22:39:53 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:11:08.638 22:39:53 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 
-le 1 ']' 00:11:08.638 22:39:53 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:08.638 22:39:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:08.638 ************************************ 00:11:08.638 START TEST raid_state_function_test 00:11:08.638 ************************************ 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 false 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:08.638 22:39:53 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2696493 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2696493' 00:11:08.638 Process raid pid: 2696493 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2696493 /var/tmp/spdk-raid.sock 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2696493 ']' 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:08.638 22:39:53 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:08.638 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:08.638 22:39:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:08.638 [2024-07-15 22:39:53.448516] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:11:08.638 [2024-07-15 22:39:53.448586] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:08.897 [2024-07-15 22:39:53.583100] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:08.897 [2024-07-15 22:39:53.685057] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:08.897 [2024-07-15 22:39:53.744990] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:08.897 [2024-07-15 22:39:53.745026] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:09.832 22:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:09.832 22:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:11:09.832 22:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:09.832 [2024-07-15 22:39:54.606467] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:09.832 [2024-07-15 22:39:54.606512] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:09.832 [2024-07-15 22:39:54.606523] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:09.832 [2024-07-15 22:39:54.606535] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:09.832 22:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:09.832 22:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:09.832 22:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:09.832 22:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:09.832 22:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:09.832 22:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:09.832 22:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:09.832 22:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:09.832 22:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:09.832 22:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:09.832 22:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:09.832 22:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:10.090 22:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:10.090 "name": "Existed_Raid", 00:11:10.090 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:11:10.090 "strip_size_kb": 64, 00:11:10.090 "state": "configuring", 00:11:10.090 "raid_level": "raid0", 00:11:10.090 "superblock": false, 00:11:10.090 "num_base_bdevs": 2, 00:11:10.090 "num_base_bdevs_discovered": 0, 00:11:10.090 "num_base_bdevs_operational": 2, 00:11:10.090 "base_bdevs_list": [ 00:11:10.090 { 00:11:10.090 "name": "BaseBdev1", 00:11:10.090 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:10.090 "is_configured": false, 00:11:10.090 "data_offset": 0, 00:11:10.090 "data_size": 0 00:11:10.090 }, 00:11:10.090 { 00:11:10.090 "name": "BaseBdev2", 00:11:10.090 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:10.090 "is_configured": false, 00:11:10.090 "data_offset": 0, 00:11:10.090 "data_size": 0 00:11:10.090 } 00:11:10.090 ] 00:11:10.090 }' 00:11:10.090 22:39:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:10.090 22:39:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:10.657 22:39:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:10.915 [2024-07-15 22:39:55.693199] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:10.915 [2024-07-15 22:39:55.693228] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2021a80 name Existed_Raid, state configuring 00:11:10.915 22:39:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:11.172 [2024-07-15 22:39:55.933853] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:11.172 [2024-07-15 22:39:55.933882] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't 
exist now 00:11:11.172 [2024-07-15 22:39:55.933897] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:11.172 [2024-07-15 22:39:55.933909] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:11.172 22:39:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:11.431 [2024-07-15 22:39:56.196483] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:11.431 BaseBdev1 00:11:11.431 22:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:11.431 22:39:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:11.431 22:39:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:11.431 22:39:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:11.431 22:39:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:11.431 22:39:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:11.431 22:39:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:11.689 22:39:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:11.947 [ 00:11:11.947 { 00:11:11.947 "name": "BaseBdev1", 00:11:11.947 "aliases": [ 00:11:11.947 "bb7b0795-c414-41ad-a8c8-d8a996d14e71" 00:11:11.947 ], 00:11:11.947 "product_name": "Malloc disk", 00:11:11.947 "block_size": 512, 00:11:11.947 "num_blocks": 65536, 
00:11:11.947 "uuid": "bb7b0795-c414-41ad-a8c8-d8a996d14e71", 00:11:11.947 "assigned_rate_limits": { 00:11:11.947 "rw_ios_per_sec": 0, 00:11:11.947 "rw_mbytes_per_sec": 0, 00:11:11.947 "r_mbytes_per_sec": 0, 00:11:11.947 "w_mbytes_per_sec": 0 00:11:11.947 }, 00:11:11.947 "claimed": true, 00:11:11.947 "claim_type": "exclusive_write", 00:11:11.947 "zoned": false, 00:11:11.947 "supported_io_types": { 00:11:11.947 "read": true, 00:11:11.947 "write": true, 00:11:11.947 "unmap": true, 00:11:11.947 "flush": true, 00:11:11.947 "reset": true, 00:11:11.947 "nvme_admin": false, 00:11:11.947 "nvme_io": false, 00:11:11.947 "nvme_io_md": false, 00:11:11.947 "write_zeroes": true, 00:11:11.947 "zcopy": true, 00:11:11.947 "get_zone_info": false, 00:11:11.947 "zone_management": false, 00:11:11.947 "zone_append": false, 00:11:11.947 "compare": false, 00:11:11.947 "compare_and_write": false, 00:11:11.947 "abort": true, 00:11:11.947 "seek_hole": false, 00:11:11.947 "seek_data": false, 00:11:11.947 "copy": true, 00:11:11.947 "nvme_iov_md": false 00:11:11.947 }, 00:11:11.947 "memory_domains": [ 00:11:11.947 { 00:11:11.947 "dma_device_id": "system", 00:11:11.947 "dma_device_type": 1 00:11:11.947 }, 00:11:11.947 { 00:11:11.947 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:11.947 "dma_device_type": 2 00:11:11.947 } 00:11:11.947 ], 00:11:11.947 "driver_specific": {} 00:11:11.947 } 00:11:11.947 ] 00:11:11.947 22:39:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:11.947 22:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:11.947 22:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:11.947 22:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:11.947 22:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
00:11:11.947 22:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:11.947 22:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:11.947 22:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:11.947 22:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:11.947 22:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:11.947 22:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:11.947 22:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:11.948 22:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:12.205 22:39:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:12.205 "name": "Existed_Raid", 00:11:12.205 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:12.205 "strip_size_kb": 64, 00:11:12.205 "state": "configuring", 00:11:12.205 "raid_level": "raid0", 00:11:12.205 "superblock": false, 00:11:12.206 "num_base_bdevs": 2, 00:11:12.206 "num_base_bdevs_discovered": 1, 00:11:12.206 "num_base_bdevs_operational": 2, 00:11:12.206 "base_bdevs_list": [ 00:11:12.206 { 00:11:12.206 "name": "BaseBdev1", 00:11:12.206 "uuid": "bb7b0795-c414-41ad-a8c8-d8a996d14e71", 00:11:12.206 "is_configured": true, 00:11:12.206 "data_offset": 0, 00:11:12.206 "data_size": 65536 00:11:12.206 }, 00:11:12.206 { 00:11:12.206 "name": "BaseBdev2", 00:11:12.206 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:12.206 "is_configured": false, 00:11:12.206 "data_offset": 0, 00:11:12.206 "data_size": 0 00:11:12.206 } 00:11:12.206 ] 00:11:12.206 }' 00:11:12.206 22:39:56 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:12.206 22:39:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:12.771 22:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:13.029 [2024-07-15 22:39:57.784965] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:13.029 [2024-07-15 22:39:57.785005] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2021350 name Existed_Raid, state configuring 00:11:13.029 22:39:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:13.288 [2024-07-15 22:39:58.033635] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:13.288 [2024-07-15 22:39:58.035134] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:13.288 [2024-07-15 22:39:58.035170] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:13.288 22:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:13.288 22:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:13.288 22:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:13.288 22:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:13.288 22:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:13.288 22:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:11:13.288 22:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:13.288 22:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:13.288 22:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:13.288 22:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:13.288 22:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:13.288 22:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:13.288 22:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:13.288 22:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:13.547 22:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:13.547 "name": "Existed_Raid", 00:11:13.547 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:13.547 "strip_size_kb": 64, 00:11:13.547 "state": "configuring", 00:11:13.547 "raid_level": "raid0", 00:11:13.547 "superblock": false, 00:11:13.547 "num_base_bdevs": 2, 00:11:13.547 "num_base_bdevs_discovered": 1, 00:11:13.547 "num_base_bdevs_operational": 2, 00:11:13.547 "base_bdevs_list": [ 00:11:13.547 { 00:11:13.547 "name": "BaseBdev1", 00:11:13.547 "uuid": "bb7b0795-c414-41ad-a8c8-d8a996d14e71", 00:11:13.547 "is_configured": true, 00:11:13.547 "data_offset": 0, 00:11:13.547 "data_size": 65536 00:11:13.547 }, 00:11:13.547 { 00:11:13.547 "name": "BaseBdev2", 00:11:13.547 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:13.547 "is_configured": false, 00:11:13.547 "data_offset": 0, 00:11:13.547 "data_size": 0 00:11:13.547 } 00:11:13.547 ] 00:11:13.547 }' 
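The `verify_raid_bdev_state` helper traced above fetches `bdev_raid_get_bdevs all`, selects the entry named `Existed_Raid` with `jq`, and compares fields such as `state`, `raid_level`, and `strip_size_kb` against expected values. The same selection-and-compare logic can be sketched in plain Python; this is an illustrative mirror of the shell helper, not SPDK code, and the sample JSON is abridged from the RPC output captured in the trace above:

```python
import json

# Abridged bdev_raid_get_bdevs output, copied from the trace above:
# BaseBdev1 exists and is claimed; BaseBdev2 has not been created yet.
RAID_BDEVS_JSON = """
[
  {
    "name": "Existed_Raid",
    "strip_size_kb": 64,
    "state": "configuring",
    "raid_level": "raid0",
    "num_base_bdevs": 2,
    "num_base_bdevs_discovered": 1,
    "num_base_bdevs_operational": 2,
    "base_bdevs_list": [
      {"name": "BaseBdev1", "is_configured": true, "data_size": 65536},
      {"name": "BaseBdev2", "is_configured": false, "data_size": 0}
    ]
  }
]
"""

def verify_raid_bdev_state(bdevs, name, expected_state, raid_level,
                           strip_size, num_operational):
    """Mirror of the shell helper: select the raid bdev by name
    (what the jq filter '.[] | select(.name == ...)' does) and
    compare the fields the test asserts on."""
    info = next(b for b in bdevs if b["name"] == name)
    assert info["state"] == expected_state
    assert info["raid_level"] == raid_level
    assert info["strip_size_kb"] == strip_size
    assert info["num_base_bdevs_operational"] == num_operational
    return info

bdevs = json.loads(RAID_BDEVS_JSON)
info = verify_raid_bdev_state(bdevs, "Existed_Raid", "configuring",
                              "raid0", 64, 2)
print(info["num_base_bdevs_discovered"])  # 1: only BaseBdev1 is discovered
```

In the trace, the raid bdev stays in `configuring` until the second base bdev is created and claimed, at which point the same check is repeated with an expected state of `online`.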
00:11:13.547 22:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:13.547 22:39:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:14.113 22:39:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:14.371 [2024-07-15 22:39:59.109110] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:14.371 [2024-07-15 22:39:59.109153] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2022000 00:11:14.371 [2024-07-15 22:39:59.109161] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:14.371 [2024-07-15 22:39:59.109358] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f3c0c0 00:11:14.371 [2024-07-15 22:39:59.109485] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2022000 00:11:14.371 [2024-07-15 22:39:59.109495] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2022000 00:11:14.371 [2024-07-15 22:39:59.109670] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:14.371 BaseBdev2 00:11:14.371 22:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:14.371 22:39:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:14.371 22:39:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:14.371 22:39:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:14.371 22:39:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:14.371 22:39:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:11:14.371 22:39:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:14.630 22:39:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:14.888 [ 00:11:14.888 { 00:11:14.888 "name": "BaseBdev2", 00:11:14.888 "aliases": [ 00:11:14.888 "f44a2a60-670a-410a-83f5-0be753e4d044" 00:11:14.888 ], 00:11:14.888 "product_name": "Malloc disk", 00:11:14.888 "block_size": 512, 00:11:14.888 "num_blocks": 65536, 00:11:14.888 "uuid": "f44a2a60-670a-410a-83f5-0be753e4d044", 00:11:14.888 "assigned_rate_limits": { 00:11:14.888 "rw_ios_per_sec": 0, 00:11:14.888 "rw_mbytes_per_sec": 0, 00:11:14.888 "r_mbytes_per_sec": 0, 00:11:14.888 "w_mbytes_per_sec": 0 00:11:14.888 }, 00:11:14.888 "claimed": true, 00:11:14.888 "claim_type": "exclusive_write", 00:11:14.888 "zoned": false, 00:11:14.888 "supported_io_types": { 00:11:14.888 "read": true, 00:11:14.888 "write": true, 00:11:14.888 "unmap": true, 00:11:14.888 "flush": true, 00:11:14.888 "reset": true, 00:11:14.888 "nvme_admin": false, 00:11:14.888 "nvme_io": false, 00:11:14.888 "nvme_io_md": false, 00:11:14.888 "write_zeroes": true, 00:11:14.888 "zcopy": true, 00:11:14.888 "get_zone_info": false, 00:11:14.888 "zone_management": false, 00:11:14.888 "zone_append": false, 00:11:14.888 "compare": false, 00:11:14.888 "compare_and_write": false, 00:11:14.888 "abort": true, 00:11:14.888 "seek_hole": false, 00:11:14.888 "seek_data": false, 00:11:14.888 "copy": true, 00:11:14.888 "nvme_iov_md": false 00:11:14.888 }, 00:11:14.888 "memory_domains": [ 00:11:14.888 { 00:11:14.888 "dma_device_id": "system", 00:11:14.888 "dma_device_type": 1 00:11:14.888 }, 00:11:14.888 { 00:11:14.888 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:14.888 "dma_device_type": 2 
00:11:14.888 } 00:11:14.888 ], 00:11:14.888 "driver_specific": {} 00:11:14.888 } 00:11:14.888 ] 00:11:14.888 22:39:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:14.888 22:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:14.888 22:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:14.888 22:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:11:14.888 22:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:14.888 22:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:14.888 22:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:14.888 22:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:14.888 22:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:14.889 22:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:14.889 22:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:14.889 22:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:14.889 22:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:14.889 22:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:14.889 22:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:15.147 22:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:11:15.147 "name": "Existed_Raid", 00:11:15.147 "uuid": "9990f2ba-4a0f-4501-9faa-a10f318ece3e", 00:11:15.147 "strip_size_kb": 64, 00:11:15.147 "state": "online", 00:11:15.147 "raid_level": "raid0", 00:11:15.147 "superblock": false, 00:11:15.147 "num_base_bdevs": 2, 00:11:15.147 "num_base_bdevs_discovered": 2, 00:11:15.147 "num_base_bdevs_operational": 2, 00:11:15.147 "base_bdevs_list": [ 00:11:15.147 { 00:11:15.147 "name": "BaseBdev1", 00:11:15.147 "uuid": "bb7b0795-c414-41ad-a8c8-d8a996d14e71", 00:11:15.147 "is_configured": true, 00:11:15.147 "data_offset": 0, 00:11:15.147 "data_size": 65536 00:11:15.147 }, 00:11:15.147 { 00:11:15.147 "name": "BaseBdev2", 00:11:15.147 "uuid": "f44a2a60-670a-410a-83f5-0be753e4d044", 00:11:15.147 "is_configured": true, 00:11:15.147 "data_offset": 0, 00:11:15.147 "data_size": 65536 00:11:15.147 } 00:11:15.147 ] 00:11:15.147 }' 00:11:15.147 22:39:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:15.147 22:39:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:15.713 22:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:15.713 22:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:15.713 22:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:15.713 22:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:15.713 22:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:15.713 22:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:15.713 22:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:15.713 22:40:00 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:15.972 [2024-07-15 22:40:00.705624] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:15.972 22:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:15.972 "name": "Existed_Raid", 00:11:15.972 "aliases": [ 00:11:15.972 "9990f2ba-4a0f-4501-9faa-a10f318ece3e" 00:11:15.972 ], 00:11:15.972 "product_name": "Raid Volume", 00:11:15.972 "block_size": 512, 00:11:15.972 "num_blocks": 131072, 00:11:15.972 "uuid": "9990f2ba-4a0f-4501-9faa-a10f318ece3e", 00:11:15.972 "assigned_rate_limits": { 00:11:15.972 "rw_ios_per_sec": 0, 00:11:15.972 "rw_mbytes_per_sec": 0, 00:11:15.972 "r_mbytes_per_sec": 0, 00:11:15.972 "w_mbytes_per_sec": 0 00:11:15.972 }, 00:11:15.972 "claimed": false, 00:11:15.972 "zoned": false, 00:11:15.972 "supported_io_types": { 00:11:15.972 "read": true, 00:11:15.972 "write": true, 00:11:15.972 "unmap": true, 00:11:15.972 "flush": true, 00:11:15.972 "reset": true, 00:11:15.972 "nvme_admin": false, 00:11:15.972 "nvme_io": false, 00:11:15.972 "nvme_io_md": false, 00:11:15.972 "write_zeroes": true, 00:11:15.972 "zcopy": false, 00:11:15.972 "get_zone_info": false, 00:11:15.972 "zone_management": false, 00:11:15.972 "zone_append": false, 00:11:15.972 "compare": false, 00:11:15.972 "compare_and_write": false, 00:11:15.972 "abort": false, 00:11:15.972 "seek_hole": false, 00:11:15.972 "seek_data": false, 00:11:15.972 "copy": false, 00:11:15.972 "nvme_iov_md": false 00:11:15.972 }, 00:11:15.972 "memory_domains": [ 00:11:15.972 { 00:11:15.972 "dma_device_id": "system", 00:11:15.972 "dma_device_type": 1 00:11:15.972 }, 00:11:15.972 { 00:11:15.973 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:15.973 "dma_device_type": 2 00:11:15.973 }, 00:11:15.973 { 00:11:15.973 "dma_device_id": "system", 00:11:15.973 "dma_device_type": 1 00:11:15.973 }, 00:11:15.973 { 00:11:15.973 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:11:15.973 "dma_device_type": 2 00:11:15.973 } 00:11:15.973 ], 00:11:15.973 "driver_specific": { 00:11:15.973 "raid": { 00:11:15.973 "uuid": "9990f2ba-4a0f-4501-9faa-a10f318ece3e", 00:11:15.973 "strip_size_kb": 64, 00:11:15.973 "state": "online", 00:11:15.973 "raid_level": "raid0", 00:11:15.973 "superblock": false, 00:11:15.973 "num_base_bdevs": 2, 00:11:15.973 "num_base_bdevs_discovered": 2, 00:11:15.973 "num_base_bdevs_operational": 2, 00:11:15.973 "base_bdevs_list": [ 00:11:15.973 { 00:11:15.973 "name": "BaseBdev1", 00:11:15.973 "uuid": "bb7b0795-c414-41ad-a8c8-d8a996d14e71", 00:11:15.973 "is_configured": true, 00:11:15.973 "data_offset": 0, 00:11:15.973 "data_size": 65536 00:11:15.973 }, 00:11:15.973 { 00:11:15.973 "name": "BaseBdev2", 00:11:15.973 "uuid": "f44a2a60-670a-410a-83f5-0be753e4d044", 00:11:15.973 "is_configured": true, 00:11:15.973 "data_offset": 0, 00:11:15.973 "data_size": 65536 00:11:15.973 } 00:11:15.973 ] 00:11:15.973 } 00:11:15.973 } 00:11:15.973 }' 00:11:15.973 22:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:15.973 22:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:15.973 BaseBdev2' 00:11:15.973 22:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:15.973 22:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:15.973 22:40:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:16.232 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:16.232 "name": "BaseBdev1", 00:11:16.232 "aliases": [ 00:11:16.232 "bb7b0795-c414-41ad-a8c8-d8a996d14e71" 00:11:16.232 ], 00:11:16.232 "product_name": "Malloc disk", 
00:11:16.232 "block_size": 512, 00:11:16.232 "num_blocks": 65536, 00:11:16.232 "uuid": "bb7b0795-c414-41ad-a8c8-d8a996d14e71", 00:11:16.232 "assigned_rate_limits": { 00:11:16.232 "rw_ios_per_sec": 0, 00:11:16.232 "rw_mbytes_per_sec": 0, 00:11:16.232 "r_mbytes_per_sec": 0, 00:11:16.232 "w_mbytes_per_sec": 0 00:11:16.232 }, 00:11:16.232 "claimed": true, 00:11:16.232 "claim_type": "exclusive_write", 00:11:16.232 "zoned": false, 00:11:16.232 "supported_io_types": { 00:11:16.232 "read": true, 00:11:16.232 "write": true, 00:11:16.232 "unmap": true, 00:11:16.232 "flush": true, 00:11:16.232 "reset": true, 00:11:16.232 "nvme_admin": false, 00:11:16.232 "nvme_io": false, 00:11:16.232 "nvme_io_md": false, 00:11:16.232 "write_zeroes": true, 00:11:16.232 "zcopy": true, 00:11:16.232 "get_zone_info": false, 00:11:16.232 "zone_management": false, 00:11:16.232 "zone_append": false, 00:11:16.232 "compare": false, 00:11:16.232 "compare_and_write": false, 00:11:16.232 "abort": true, 00:11:16.232 "seek_hole": false, 00:11:16.232 "seek_data": false, 00:11:16.232 "copy": true, 00:11:16.232 "nvme_iov_md": false 00:11:16.232 }, 00:11:16.232 "memory_domains": [ 00:11:16.232 { 00:11:16.232 "dma_device_id": "system", 00:11:16.232 "dma_device_type": 1 00:11:16.232 }, 00:11:16.232 { 00:11:16.232 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:16.232 "dma_device_type": 2 00:11:16.232 } 00:11:16.232 ], 00:11:16.232 "driver_specific": {} 00:11:16.232 }' 00:11:16.232 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:16.232 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:16.232 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:16.232 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:16.491 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:16.491 22:40:01 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:16.491 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:16.491 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:16.491 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:16.491 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:16.491 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:16.749 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:16.749 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:16.749 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:16.749 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:17.008 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:17.008 "name": "BaseBdev2", 00:11:17.008 "aliases": [ 00:11:17.008 "f44a2a60-670a-410a-83f5-0be753e4d044" 00:11:17.008 ], 00:11:17.008 "product_name": "Malloc disk", 00:11:17.008 "block_size": 512, 00:11:17.008 "num_blocks": 65536, 00:11:17.008 "uuid": "f44a2a60-670a-410a-83f5-0be753e4d044", 00:11:17.008 "assigned_rate_limits": { 00:11:17.008 "rw_ios_per_sec": 0, 00:11:17.008 "rw_mbytes_per_sec": 0, 00:11:17.008 "r_mbytes_per_sec": 0, 00:11:17.008 "w_mbytes_per_sec": 0 00:11:17.008 }, 00:11:17.008 "claimed": true, 00:11:17.008 "claim_type": "exclusive_write", 00:11:17.008 "zoned": false, 00:11:17.008 "supported_io_types": { 00:11:17.008 "read": true, 00:11:17.008 "write": true, 00:11:17.008 "unmap": true, 00:11:17.008 "flush": true, 00:11:17.008 "reset": 
true, 00:11:17.008 "nvme_admin": false, 00:11:17.008 "nvme_io": false, 00:11:17.008 "nvme_io_md": false, 00:11:17.008 "write_zeroes": true, 00:11:17.008 "zcopy": true, 00:11:17.008 "get_zone_info": false, 00:11:17.008 "zone_management": false, 00:11:17.008 "zone_append": false, 00:11:17.008 "compare": false, 00:11:17.008 "compare_and_write": false, 00:11:17.008 "abort": true, 00:11:17.008 "seek_hole": false, 00:11:17.008 "seek_data": false, 00:11:17.008 "copy": true, 00:11:17.008 "nvme_iov_md": false 00:11:17.008 }, 00:11:17.008 "memory_domains": [ 00:11:17.008 { 00:11:17.008 "dma_device_id": "system", 00:11:17.008 "dma_device_type": 1 00:11:17.008 }, 00:11:17.008 { 00:11:17.008 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:17.008 "dma_device_type": 2 00:11:17.008 } 00:11:17.008 ], 00:11:17.008 "driver_specific": {} 00:11:17.008 }' 00:11:17.008 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:17.008 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:17.008 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:17.008 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:17.008 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:17.008 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:17.008 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:17.008 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:17.266 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:17.266 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:17.266 22:40:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:17.266 22:40:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:17.266 22:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:17.526 [2024-07-15 22:40:02.249474] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:17.526 [2024-07-15 22:40:02.249499] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:17.526 [2024-07-15 22:40:02.249542] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:17.526 22:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:17.526 22:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:11:17.526 22:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:17.526 22:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:17.526 22:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:17.526 22:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:11:17.526 22:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:17.526 22:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:17.526 22:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:17.526 22:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:17.526 22:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:17.526 22:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:11:17.526 22:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:17.526 22:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:17.526 22:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:17.526 22:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:17.526 22:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:17.785 22:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:17.785 "name": "Existed_Raid", 00:11:17.785 "uuid": "9990f2ba-4a0f-4501-9faa-a10f318ece3e", 00:11:17.785 "strip_size_kb": 64, 00:11:17.785 "state": "offline", 00:11:17.785 "raid_level": "raid0", 00:11:17.785 "superblock": false, 00:11:17.785 "num_base_bdevs": 2, 00:11:17.785 "num_base_bdevs_discovered": 1, 00:11:17.785 "num_base_bdevs_operational": 1, 00:11:17.785 "base_bdevs_list": [ 00:11:17.785 { 00:11:17.785 "name": null, 00:11:17.785 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:17.785 "is_configured": false, 00:11:17.785 "data_offset": 0, 00:11:17.785 "data_size": 65536 00:11:17.785 }, 00:11:17.785 { 00:11:17.785 "name": "BaseBdev2", 00:11:17.785 "uuid": "f44a2a60-670a-410a-83f5-0be753e4d044", 00:11:17.785 "is_configured": true, 00:11:17.785 "data_offset": 0, 00:11:17.785 "data_size": 65536 00:11:17.785 } 00:11:17.785 ] 00:11:17.785 }' 00:11:17.785 22:40:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:17.785 22:40:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:18.353 22:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:18.353 22:40:03 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:18.353 22:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:18.353 22:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:18.611 22:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:18.611 22:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:18.611 22:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:18.869 [2024-07-15 22:40:03.598263] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:18.869 [2024-07-15 22:40:03.598319] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2022000 name Existed_Raid, state offline 00:11:18.869 22:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:18.869 22:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:18.869 22:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:18.869 22:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:19.163 22:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:19.163 22:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:19.163 22:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:19.163 22:40:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2696493 
00:11:19.163 22:40:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2696493 ']' 00:11:19.163 22:40:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2696493 00:11:19.163 22:40:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:11:19.163 22:40:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:19.163 22:40:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2696493 00:11:19.163 22:40:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:19.163 22:40:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:19.163 22:40:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2696493' 00:11:19.163 killing process with pid 2696493 00:11:19.163 22:40:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2696493 00:11:19.163 [2024-07-15 22:40:03.927974] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:19.163 22:40:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2696493 00:11:19.163 [2024-07-15 22:40:03.928837] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:11:19.423 00:11:19.423 real 0m10.753s 00:11:19.423 user 0m19.202s 00:11:19.423 sys 0m1.932s 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:19.423 ************************************ 00:11:19.423 END TEST raid_state_function_test 00:11:19.423 ************************************ 00:11:19.423 22:40:04 bdev_raid 
-- common/autotest_common.sh@1142 -- # return 0 00:11:19.423 22:40:04 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:11:19.423 22:40:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:19.423 22:40:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:19.423 22:40:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:19.423 ************************************ 00:11:19.423 START TEST raid_state_function_test_sb 00:11:19.423 ************************************ 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 true 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- 
# (( i <= num_base_bdevs )) 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2698120 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2698120' 00:11:19.423 Process raid pid: 2698120 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2698120 /var/tmp/spdk-raid.sock 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@829 -- # '[' -z 2698120 ']' 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:19.423 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:19.423 22:40:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:19.423 [2024-07-15 22:40:04.279784] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:11:19.423 [2024-07-15 22:40:04.279849] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:19.683 [2024-07-15 22:40:04.410348] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:19.683 [2024-07-15 22:40:04.516755] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:19.683 [2024-07-15 22:40:04.585142] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:19.683 [2024-07-15 22:40:04.585179] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:20.323 22:40:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:20.323 22:40:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:11:20.323 22:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:20.581 [2024-07-15 22:40:05.376488] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:20.581 [2024-07-15 22:40:05.376527] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:20.581 [2024-07-15 22:40:05.376538] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:20.581 [2024-07-15 22:40:05.376550] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:20.581 22:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:20.581 22:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:20.581 22:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:20.581 22:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:20.581 22:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:20.581 22:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:20.581 22:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:20.581 22:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:20.581 22:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:20.581 22:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:20.581 22:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:20.581 22:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:20.839 22:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:20.839 "name": "Existed_Raid", 00:11:20.839 "uuid": "686f6388-33b7-45cb-8736-74eef780ad28", 00:11:20.839 "strip_size_kb": 64, 00:11:20.839 "state": "configuring", 00:11:20.839 "raid_level": "raid0", 00:11:20.839 "superblock": true, 00:11:20.839 "num_base_bdevs": 2, 00:11:20.839 "num_base_bdevs_discovered": 0, 00:11:20.839 "num_base_bdevs_operational": 2, 00:11:20.839 "base_bdevs_list": [ 00:11:20.839 { 00:11:20.839 "name": "BaseBdev1", 00:11:20.839 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:20.839 "is_configured": false, 00:11:20.839 "data_offset": 0, 00:11:20.839 "data_size": 0 00:11:20.839 }, 00:11:20.839 { 00:11:20.839 "name": "BaseBdev2", 00:11:20.839 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:20.839 "is_configured": false, 00:11:20.839 "data_offset": 0, 00:11:20.839 "data_size": 0 00:11:20.839 } 00:11:20.839 ] 00:11:20.839 }' 00:11:20.839 22:40:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:20.839 22:40:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:21.407 22:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:21.666 [2024-07-15 22:40:06.479247] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:21.666 [2024-07-15 22:40:06.479276] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcc6a80 name Existed_Raid, state configuring 00:11:21.666 22:40:06 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:21.926 [2024-07-15 22:40:06.659747] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:21.926 [2024-07-15 22:40:06.659774] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:21.926 [2024-07-15 22:40:06.659783] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:21.926 [2024-07-15 22:40:06.659795] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:21.926 22:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:22.186 [2024-07-15 22:40:06.846158] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:22.186 BaseBdev1 00:11:22.186 22:40:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:22.186 22:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:22.186 22:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:22.186 22:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:22.186 22:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:22.186 22:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:22.186 22:40:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:22.445 
22:40:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:22.445 [ 00:11:22.445 { 00:11:22.445 "name": "BaseBdev1", 00:11:22.445 "aliases": [ 00:11:22.445 "8dd9a3f2-ee77-42df-a788-3a356f5825ae" 00:11:22.445 ], 00:11:22.445 "product_name": "Malloc disk", 00:11:22.445 "block_size": 512, 00:11:22.445 "num_blocks": 65536, 00:11:22.445 "uuid": "8dd9a3f2-ee77-42df-a788-3a356f5825ae", 00:11:22.445 "assigned_rate_limits": { 00:11:22.445 "rw_ios_per_sec": 0, 00:11:22.445 "rw_mbytes_per_sec": 0, 00:11:22.445 "r_mbytes_per_sec": 0, 00:11:22.445 "w_mbytes_per_sec": 0 00:11:22.445 }, 00:11:22.445 "claimed": true, 00:11:22.445 "claim_type": "exclusive_write", 00:11:22.445 "zoned": false, 00:11:22.445 "supported_io_types": { 00:11:22.445 "read": true, 00:11:22.445 "write": true, 00:11:22.445 "unmap": true, 00:11:22.445 "flush": true, 00:11:22.445 "reset": true, 00:11:22.445 "nvme_admin": false, 00:11:22.445 "nvme_io": false, 00:11:22.445 "nvme_io_md": false, 00:11:22.445 "write_zeroes": true, 00:11:22.445 "zcopy": true, 00:11:22.445 "get_zone_info": false, 00:11:22.445 "zone_management": false, 00:11:22.445 "zone_append": false, 00:11:22.445 "compare": false, 00:11:22.445 "compare_and_write": false, 00:11:22.445 "abort": true, 00:11:22.445 "seek_hole": false, 00:11:22.445 "seek_data": false, 00:11:22.445 "copy": true, 00:11:22.445 "nvme_iov_md": false 00:11:22.445 }, 00:11:22.445 "memory_domains": [ 00:11:22.445 { 00:11:22.445 "dma_device_id": "system", 00:11:22.445 "dma_device_type": 1 00:11:22.445 }, 00:11:22.445 { 00:11:22.445 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:22.445 "dma_device_type": 2 00:11:22.445 } 00:11:22.445 ], 00:11:22.446 "driver_specific": {} 00:11:22.446 } 00:11:22.446 ] 00:11:22.446 22:40:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:22.446 
22:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:22.446 22:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:22.446 22:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:22.446 22:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:22.446 22:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:22.446 22:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:22.446 22:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:22.446 22:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:22.446 22:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:22.446 22:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:22.446 22:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:22.446 22:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:22.705 22:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:22.705 "name": "Existed_Raid", 00:11:22.705 "uuid": "c7423e9b-0398-4d80-8c2b-2d9318956334", 00:11:22.705 "strip_size_kb": 64, 00:11:22.705 "state": "configuring", 00:11:22.705 "raid_level": "raid0", 00:11:22.705 "superblock": true, 00:11:22.705 "num_base_bdevs": 2, 00:11:22.705 "num_base_bdevs_discovered": 1, 00:11:22.705 "num_base_bdevs_operational": 2, 00:11:22.705 
"base_bdevs_list": [ 00:11:22.705 { 00:11:22.705 "name": "BaseBdev1", 00:11:22.705 "uuid": "8dd9a3f2-ee77-42df-a788-3a356f5825ae", 00:11:22.705 "is_configured": true, 00:11:22.705 "data_offset": 2048, 00:11:22.705 "data_size": 63488 00:11:22.705 }, 00:11:22.705 { 00:11:22.705 "name": "BaseBdev2", 00:11:22.705 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:22.705 "is_configured": false, 00:11:22.705 "data_offset": 0, 00:11:22.705 "data_size": 0 00:11:22.705 } 00:11:22.705 ] 00:11:22.705 }' 00:11:22.705 22:40:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:22.705 22:40:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:23.273 22:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:23.532 [2024-07-15 22:40:08.237832] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:23.532 [2024-07-15 22:40:08.237866] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcc6350 name Existed_Raid, state configuring 00:11:23.532 22:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:23.532 [2024-07-15 22:40:08.422376] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:23.532 [2024-07-15 22:40:08.423867] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:23.532 [2024-07-15 22:40:08.423900] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:23.791 22:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:23.791 22:40:08 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:23.791 22:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:23.792 22:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:23.792 22:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:23.792 22:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:23.792 22:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:23.792 22:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:23.792 22:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:23.792 22:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:23.792 22:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:23.792 22:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:23.792 22:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:23.792 22:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:23.792 22:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:23.792 "name": "Existed_Raid", 00:11:23.792 "uuid": "7de8058d-7a5e-431f-9927-f95ca1651b16", 00:11:23.792 "strip_size_kb": 64, 00:11:23.792 "state": "configuring", 00:11:23.792 "raid_level": "raid0", 00:11:23.792 "superblock": true, 00:11:23.792 "num_base_bdevs": 2, 00:11:23.792 
"num_base_bdevs_discovered": 1, 00:11:23.792 "num_base_bdevs_operational": 2, 00:11:23.792 "base_bdevs_list": [ 00:11:23.792 { 00:11:23.792 "name": "BaseBdev1", 00:11:23.792 "uuid": "8dd9a3f2-ee77-42df-a788-3a356f5825ae", 00:11:23.792 "is_configured": true, 00:11:23.792 "data_offset": 2048, 00:11:23.792 "data_size": 63488 00:11:23.792 }, 00:11:23.792 { 00:11:23.792 "name": "BaseBdev2", 00:11:23.792 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:23.792 "is_configured": false, 00:11:23.792 "data_offset": 0, 00:11:23.792 "data_size": 0 00:11:23.792 } 00:11:23.792 ] 00:11:23.792 }' 00:11:23.792 22:40:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:23.792 22:40:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:24.359 22:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:24.618 [2024-07-15 22:40:09.400293] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:24.618 [2024-07-15 22:40:09.400439] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcc7000 00:11:24.618 [2024-07-15 22:40:09.400453] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:24.618 [2024-07-15 22:40:09.400623] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbe10c0 00:11:24.618 [2024-07-15 22:40:09.400737] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcc7000 00:11:24.618 [2024-07-15 22:40:09.400747] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xcc7000 00:11:24.618 [2024-07-15 22:40:09.400836] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:24.618 BaseBdev2 00:11:24.618 22:40:09 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:24.618 22:40:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:24.618 22:40:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:24.618 22:40:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:24.618 22:40:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:24.618 22:40:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:24.618 22:40:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:24.878 22:40:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:25.137 [ 00:11:25.137 { 00:11:25.137 "name": "BaseBdev2", 00:11:25.137 "aliases": [ 00:11:25.137 "267951c4-0656-42f5-9290-930bf2955a89" 00:11:25.137 ], 00:11:25.137 "product_name": "Malloc disk", 00:11:25.137 "block_size": 512, 00:11:25.137 "num_blocks": 65536, 00:11:25.137 "uuid": "267951c4-0656-42f5-9290-930bf2955a89", 00:11:25.137 "assigned_rate_limits": { 00:11:25.137 "rw_ios_per_sec": 0, 00:11:25.137 "rw_mbytes_per_sec": 0, 00:11:25.137 "r_mbytes_per_sec": 0, 00:11:25.137 "w_mbytes_per_sec": 0 00:11:25.137 }, 00:11:25.137 "claimed": true, 00:11:25.137 "claim_type": "exclusive_write", 00:11:25.137 "zoned": false, 00:11:25.137 "supported_io_types": { 00:11:25.137 "read": true, 00:11:25.137 "write": true, 00:11:25.137 "unmap": true, 00:11:25.137 "flush": true, 00:11:25.137 "reset": true, 00:11:25.137 "nvme_admin": false, 00:11:25.137 "nvme_io": false, 00:11:25.137 "nvme_io_md": false, 00:11:25.137 "write_zeroes": true, 
00:11:25.137 "zcopy": true, 00:11:25.137 "get_zone_info": false, 00:11:25.137 "zone_management": false, 00:11:25.137 "zone_append": false, 00:11:25.137 "compare": false, 00:11:25.137 "compare_and_write": false, 00:11:25.137 "abort": true, 00:11:25.137 "seek_hole": false, 00:11:25.137 "seek_data": false, 00:11:25.137 "copy": true, 00:11:25.137 "nvme_iov_md": false 00:11:25.137 }, 00:11:25.137 "memory_domains": [ 00:11:25.137 { 00:11:25.137 "dma_device_id": "system", 00:11:25.137 "dma_device_type": 1 00:11:25.137 }, 00:11:25.137 { 00:11:25.137 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:25.137 "dma_device_type": 2 00:11:25.137 } 00:11:25.137 ], 00:11:25.137 "driver_specific": {} 00:11:25.137 } 00:11:25.137 ] 00:11:25.137 22:40:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:25.137 22:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:25.137 22:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:25.137 22:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:11:25.137 22:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:25.137 22:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:25.137 22:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:25.137 22:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:25.137 22:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:25.137 22:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:25.137 22:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:11:25.137 22:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:25.137 22:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:25.137 22:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:25.137 22:40:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:25.137 22:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:25.137 "name": "Existed_Raid", 00:11:25.137 "uuid": "7de8058d-7a5e-431f-9927-f95ca1651b16", 00:11:25.137 "strip_size_kb": 64, 00:11:25.137 "state": "online", 00:11:25.137 "raid_level": "raid0", 00:11:25.137 "superblock": true, 00:11:25.137 "num_base_bdevs": 2, 00:11:25.137 "num_base_bdevs_discovered": 2, 00:11:25.137 "num_base_bdevs_operational": 2, 00:11:25.137 "base_bdevs_list": [ 00:11:25.137 { 00:11:25.137 "name": "BaseBdev1", 00:11:25.137 "uuid": "8dd9a3f2-ee77-42df-a788-3a356f5825ae", 00:11:25.137 "is_configured": true, 00:11:25.137 "data_offset": 2048, 00:11:25.137 "data_size": 63488 00:11:25.137 }, 00:11:25.137 { 00:11:25.137 "name": "BaseBdev2", 00:11:25.137 "uuid": "267951c4-0656-42f5-9290-930bf2955a89", 00:11:25.137 "is_configured": true, 00:11:25.137 "data_offset": 2048, 00:11:25.137 "data_size": 63488 00:11:25.137 } 00:11:25.137 ] 00:11:25.137 }' 00:11:25.137 22:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:25.137 22:40:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:26.075 22:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:26.075 22:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=Existed_Raid 00:11:26.075 22:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:26.075 22:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:26.075 22:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:26.075 22:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:26.075 22:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:26.075 22:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:26.075 [2024-07-15 22:40:10.868456] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:26.075 22:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:26.075 "name": "Existed_Raid", 00:11:26.075 "aliases": [ 00:11:26.075 "7de8058d-7a5e-431f-9927-f95ca1651b16" 00:11:26.075 ], 00:11:26.075 "product_name": "Raid Volume", 00:11:26.075 "block_size": 512, 00:11:26.075 "num_blocks": 126976, 00:11:26.075 "uuid": "7de8058d-7a5e-431f-9927-f95ca1651b16", 00:11:26.075 "assigned_rate_limits": { 00:11:26.075 "rw_ios_per_sec": 0, 00:11:26.075 "rw_mbytes_per_sec": 0, 00:11:26.075 "r_mbytes_per_sec": 0, 00:11:26.075 "w_mbytes_per_sec": 0 00:11:26.075 }, 00:11:26.075 "claimed": false, 00:11:26.075 "zoned": false, 00:11:26.075 "supported_io_types": { 00:11:26.075 "read": true, 00:11:26.075 "write": true, 00:11:26.075 "unmap": true, 00:11:26.075 "flush": true, 00:11:26.075 "reset": true, 00:11:26.075 "nvme_admin": false, 00:11:26.075 "nvme_io": false, 00:11:26.075 "nvme_io_md": false, 00:11:26.075 "write_zeroes": true, 00:11:26.075 "zcopy": false, 00:11:26.075 "get_zone_info": false, 00:11:26.075 "zone_management": false, 00:11:26.075 
"zone_append": false, 00:11:26.075 "compare": false, 00:11:26.075 "compare_and_write": false, 00:11:26.075 "abort": false, 00:11:26.075 "seek_hole": false, 00:11:26.075 "seek_data": false, 00:11:26.075 "copy": false, 00:11:26.075 "nvme_iov_md": false 00:11:26.075 }, 00:11:26.075 "memory_domains": [ 00:11:26.075 { 00:11:26.075 "dma_device_id": "system", 00:11:26.075 "dma_device_type": 1 00:11:26.075 }, 00:11:26.075 { 00:11:26.075 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:26.075 "dma_device_type": 2 00:11:26.075 }, 00:11:26.075 { 00:11:26.075 "dma_device_id": "system", 00:11:26.075 "dma_device_type": 1 00:11:26.075 }, 00:11:26.075 { 00:11:26.075 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:26.075 "dma_device_type": 2 00:11:26.075 } 00:11:26.075 ], 00:11:26.075 "driver_specific": { 00:11:26.075 "raid": { 00:11:26.075 "uuid": "7de8058d-7a5e-431f-9927-f95ca1651b16", 00:11:26.075 "strip_size_kb": 64, 00:11:26.075 "state": "online", 00:11:26.075 "raid_level": "raid0", 00:11:26.075 "superblock": true, 00:11:26.075 "num_base_bdevs": 2, 00:11:26.075 "num_base_bdevs_discovered": 2, 00:11:26.075 "num_base_bdevs_operational": 2, 00:11:26.075 "base_bdevs_list": [ 00:11:26.075 { 00:11:26.075 "name": "BaseBdev1", 00:11:26.075 "uuid": "8dd9a3f2-ee77-42df-a788-3a356f5825ae", 00:11:26.075 "is_configured": true, 00:11:26.075 "data_offset": 2048, 00:11:26.075 "data_size": 63488 00:11:26.075 }, 00:11:26.075 { 00:11:26.075 "name": "BaseBdev2", 00:11:26.075 "uuid": "267951c4-0656-42f5-9290-930bf2955a89", 00:11:26.075 "is_configured": true, 00:11:26.075 "data_offset": 2048, 00:11:26.075 "data_size": 63488 00:11:26.075 } 00:11:26.075 ] 00:11:26.075 } 00:11:26.075 } 00:11:26.075 }' 00:11:26.075 22:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:26.075 22:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:26.075 
BaseBdev2' 00:11:26.075 22:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:26.075 22:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:26.075 22:40:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:26.334 22:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:26.334 "name": "BaseBdev1", 00:11:26.334 "aliases": [ 00:11:26.334 "8dd9a3f2-ee77-42df-a788-3a356f5825ae" 00:11:26.334 ], 00:11:26.334 "product_name": "Malloc disk", 00:11:26.334 "block_size": 512, 00:11:26.334 "num_blocks": 65536, 00:11:26.334 "uuid": "8dd9a3f2-ee77-42df-a788-3a356f5825ae", 00:11:26.334 "assigned_rate_limits": { 00:11:26.334 "rw_ios_per_sec": 0, 00:11:26.334 "rw_mbytes_per_sec": 0, 00:11:26.334 "r_mbytes_per_sec": 0, 00:11:26.334 "w_mbytes_per_sec": 0 00:11:26.334 }, 00:11:26.334 "claimed": true, 00:11:26.334 "claim_type": "exclusive_write", 00:11:26.334 "zoned": false, 00:11:26.334 "supported_io_types": { 00:11:26.334 "read": true, 00:11:26.334 "write": true, 00:11:26.334 "unmap": true, 00:11:26.334 "flush": true, 00:11:26.334 "reset": true, 00:11:26.334 "nvme_admin": false, 00:11:26.334 "nvme_io": false, 00:11:26.334 "nvme_io_md": false, 00:11:26.334 "write_zeroes": true, 00:11:26.334 "zcopy": true, 00:11:26.334 "get_zone_info": false, 00:11:26.334 "zone_management": false, 00:11:26.334 "zone_append": false, 00:11:26.334 "compare": false, 00:11:26.334 "compare_and_write": false, 00:11:26.334 "abort": true, 00:11:26.334 "seek_hole": false, 00:11:26.334 "seek_data": false, 00:11:26.334 "copy": true, 00:11:26.334 "nvme_iov_md": false 00:11:26.334 }, 00:11:26.334 "memory_domains": [ 00:11:26.334 { 00:11:26.334 "dma_device_id": "system", 00:11:26.334 "dma_device_type": 1 00:11:26.334 }, 00:11:26.334 { 
00:11:26.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:26.334 "dma_device_type": 2 00:11:26.334 } 00:11:26.334 ], 00:11:26.334 "driver_specific": {} 00:11:26.334 }' 00:11:26.334 22:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:26.593 22:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:26.593 22:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:26.593 22:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:26.593 22:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:26.593 22:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:26.593 22:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:26.593 22:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:26.593 22:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:26.593 22:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:26.852 22:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:26.852 22:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:26.852 22:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:26.852 22:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:26.852 22:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:27.111 22:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:27.111 "name": 
"BaseBdev2", 00:11:27.111 "aliases": [ 00:11:27.111 "267951c4-0656-42f5-9290-930bf2955a89" 00:11:27.111 ], 00:11:27.111 "product_name": "Malloc disk", 00:11:27.111 "block_size": 512, 00:11:27.111 "num_blocks": 65536, 00:11:27.111 "uuid": "267951c4-0656-42f5-9290-930bf2955a89", 00:11:27.111 "assigned_rate_limits": { 00:11:27.111 "rw_ios_per_sec": 0, 00:11:27.111 "rw_mbytes_per_sec": 0, 00:11:27.111 "r_mbytes_per_sec": 0, 00:11:27.111 "w_mbytes_per_sec": 0 00:11:27.111 }, 00:11:27.111 "claimed": true, 00:11:27.111 "claim_type": "exclusive_write", 00:11:27.111 "zoned": false, 00:11:27.111 "supported_io_types": { 00:11:27.111 "read": true, 00:11:27.111 "write": true, 00:11:27.111 "unmap": true, 00:11:27.111 "flush": true, 00:11:27.111 "reset": true, 00:11:27.111 "nvme_admin": false, 00:11:27.111 "nvme_io": false, 00:11:27.111 "nvme_io_md": false, 00:11:27.111 "write_zeroes": true, 00:11:27.111 "zcopy": true, 00:11:27.111 "get_zone_info": false, 00:11:27.111 "zone_management": false, 00:11:27.111 "zone_append": false, 00:11:27.111 "compare": false, 00:11:27.111 "compare_and_write": false, 00:11:27.111 "abort": true, 00:11:27.111 "seek_hole": false, 00:11:27.111 "seek_data": false, 00:11:27.111 "copy": true, 00:11:27.111 "nvme_iov_md": false 00:11:27.111 }, 00:11:27.111 "memory_domains": [ 00:11:27.111 { 00:11:27.111 "dma_device_id": "system", 00:11:27.111 "dma_device_type": 1 00:11:27.111 }, 00:11:27.111 { 00:11:27.111 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:27.111 "dma_device_type": 2 00:11:27.111 } 00:11:27.111 ], 00:11:27.111 "driver_specific": {} 00:11:27.111 }' 00:11:27.111 22:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:27.111 22:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:27.111 22:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:27.111 22:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:11:27.111 22:40:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:27.371 22:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:27.371 22:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:27.371 22:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:27.371 22:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:27.371 22:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:27.630 22:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:27.630 22:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:27.630 22:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:28.198 [2024-07-15 22:40:12.833457] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:28.198 [2024-07-15 22:40:12.833485] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:28.198 [2024-07-15 22:40:12.833529] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:28.198 22:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:28.198 22:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:11:28.198 22:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:28.198 22:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:11:28.198 22:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:28.198 22:40:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:11:28.198 22:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:28.198 22:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:28.198 22:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:28.198 22:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:28.198 22:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:28.198 22:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:28.198 22:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:28.198 22:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:28.198 22:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:28.198 22:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:28.198 22:40:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:28.771 22:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:28.771 "name": "Existed_Raid", 00:11:28.771 "uuid": "7de8058d-7a5e-431f-9927-f95ca1651b16", 00:11:28.771 "strip_size_kb": 64, 00:11:28.771 "state": "offline", 00:11:28.771 "raid_level": "raid0", 00:11:28.771 "superblock": true, 00:11:28.771 "num_base_bdevs": 2, 00:11:28.771 "num_base_bdevs_discovered": 1, 00:11:28.771 "num_base_bdevs_operational": 1, 00:11:28.771 "base_bdevs_list": [ 
00:11:28.771 { 00:11:28.771 "name": null, 00:11:28.771 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:28.771 "is_configured": false, 00:11:28.771 "data_offset": 2048, 00:11:28.771 "data_size": 63488 00:11:28.771 }, 00:11:28.771 { 00:11:28.771 "name": "BaseBdev2", 00:11:28.771 "uuid": "267951c4-0656-42f5-9290-930bf2955a89", 00:11:28.771 "is_configured": true, 00:11:28.771 "data_offset": 2048, 00:11:28.771 "data_size": 63488 00:11:28.771 } 00:11:28.771 ] 00:11:28.771 }' 00:11:28.771 22:40:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:28.771 22:40:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:29.339 22:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:29.339 22:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:29.598 22:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:29.598 22:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:29.598 22:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:29.598 22:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:29.598 22:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:29.857 [2024-07-15 22:40:14.720336] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:29.857 [2024-07-15 22:40:14.720385] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcc7000 name Existed_Raid, state offline 00:11:29.857 22:40:14 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:29.857 22:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:29.857 22:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:29.857 22:40:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:30.425 22:40:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:30.425 22:40:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:30.425 22:40:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:30.425 22:40:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2698120 00:11:30.425 22:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2698120 ']' 00:11:30.425 22:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2698120 00:11:30.425 22:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:11:30.425 22:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:30.425 22:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2698120 00:11:30.425 22:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:30.425 22:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:30.425 22:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2698120' 00:11:30.425 killing process with pid 2698120 00:11:30.425 22:40:15 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@967 -- # kill 2698120 00:11:30.425 [2024-07-15 22:40:15.324865] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:30.425 22:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2698120 00:11:30.425 [2024-07-15 22:40:15.325834] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:30.684 22:40:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:30.684 00:11:30.684 real 0m11.337s 00:11:30.684 user 0m20.190s 00:11:30.684 sys 0m2.078s 00:11:30.684 22:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:30.684 22:40:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:30.684 ************************************ 00:11:30.684 END TEST raid_state_function_test_sb 00:11:30.684 ************************************ 00:11:30.684 22:40:15 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:30.684 22:40:15 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:11:30.684 22:40:15 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:11:30.684 22:40:15 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:30.684 22:40:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:30.943 ************************************ 00:11:30.943 START TEST raid_superblock_test 00:11:30.943 ************************************ 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 2 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:11:30.943 22:40:15 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2699800 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2699800 /var/tmp/spdk-raid.sock 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2699800 ']' 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:30.943 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:30.943 22:40:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:30.943 [2024-07-15 22:40:15.695699] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:11:30.943 [2024-07-15 22:40:15.695768] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2699800 ] 00:11:30.943 [2024-07-15 22:40:15.822992] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:31.202 [2024-07-15 22:40:15.930117] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:31.202 [2024-07-15 22:40:15.995734] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:31.202 [2024-07-15 22:40:15.995769] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:32.138 22:40:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:32.138 22:40:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:11:32.138 22:40:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:11:32.138 22:40:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:32.138 22:40:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local 
bdev_malloc=malloc1 00:11:32.138 22:40:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:11:32.138 22:40:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:32.138 22:40:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:32.138 22:40:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:32.138 22:40:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:32.138 22:40:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:32.716 malloc1 00:11:32.716 22:40:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:33.289 [2024-07-15 22:40:17.896512] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:33.289 [2024-07-15 22:40:17.896565] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:33.289 [2024-07-15 22:40:17.896587] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe48570 00:11:33.289 [2024-07-15 22:40:17.896599] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:33.289 [2024-07-15 22:40:17.898400] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:33.289 [2024-07-15 22:40:17.898428] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:33.289 pt1 00:11:33.289 22:40:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:33.289 22:40:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # 
(( i <= num_base_bdevs )) 00:11:33.289 22:40:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:11:33.289 22:40:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:11:33.289 22:40:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:33.289 22:40:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:33.289 22:40:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:33.289 22:40:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:33.289 22:40:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:33.289 malloc2 00:11:33.289 22:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:33.857 [2024-07-15 22:40:18.663248] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:33.857 [2024-07-15 22:40:18.663294] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:33.857 [2024-07-15 22:40:18.663313] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe49970 00:11:33.857 [2024-07-15 22:40:18.663325] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:33.857 [2024-07-15 22:40:18.664999] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:33.857 [2024-07-15 22:40:18.665026] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:33.857 pt2 00:11:33.857 22:40:18 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:33.857 22:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:33.857 22:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:11:34.117 [2024-07-15 22:40:18.968071] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:34.117 [2024-07-15 22:40:18.969414] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:34.117 [2024-07-15 22:40:18.969568] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfec270 00:11:34.117 [2024-07-15 22:40:18.969581] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:34.117 [2024-07-15 22:40:18.969780] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfe1c10 00:11:34.117 [2024-07-15 22:40:18.969938] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfec270 00:11:34.117 [2024-07-15 22:40:18.969949] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfec270 00:11:34.117 [2024-07-15 22:40:18.970049] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:34.117 22:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:34.117 22:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:34.117 22:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:34.117 22:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:34.117 22:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:34.117 22:40:18 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:34.117 22:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:34.117 22:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:34.117 22:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:34.117 22:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:34.117 22:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:34.117 22:40:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:34.686 22:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:34.686 "name": "raid_bdev1", 00:11:34.686 "uuid": "c1ce97da-79d4-43a8-a878-352fe1e117dd", 00:11:34.686 "strip_size_kb": 64, 00:11:34.686 "state": "online", 00:11:34.686 "raid_level": "raid0", 00:11:34.686 "superblock": true, 00:11:34.686 "num_base_bdevs": 2, 00:11:34.686 "num_base_bdevs_discovered": 2, 00:11:34.686 "num_base_bdevs_operational": 2, 00:11:34.686 "base_bdevs_list": [ 00:11:34.686 { 00:11:34.686 "name": "pt1", 00:11:34.686 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:34.686 "is_configured": true, 00:11:34.686 "data_offset": 2048, 00:11:34.686 "data_size": 63488 00:11:34.686 }, 00:11:34.686 { 00:11:34.686 "name": "pt2", 00:11:34.686 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:34.686 "is_configured": true, 00:11:34.686 "data_offset": 2048, 00:11:34.686 "data_size": 63488 00:11:34.686 } 00:11:34.686 ] 00:11:34.686 }' 00:11:34.686 22:40:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:34.686 22:40:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:35.704 22:40:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:11:35.704 22:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:35.704 22:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:35.704 22:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:35.704 22:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:35.704 22:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:35.704 22:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:35.704 22:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:35.704 [2024-07-15 22:40:20.600608] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:35.963 22:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:35.963 "name": "raid_bdev1", 00:11:35.963 "aliases": [ 00:11:35.963 "c1ce97da-79d4-43a8-a878-352fe1e117dd" 00:11:35.963 ], 00:11:35.963 "product_name": "Raid Volume", 00:11:35.963 "block_size": 512, 00:11:35.963 "num_blocks": 126976, 00:11:35.963 "uuid": "c1ce97da-79d4-43a8-a878-352fe1e117dd", 00:11:35.963 "assigned_rate_limits": { 00:11:35.963 "rw_ios_per_sec": 0, 00:11:35.963 "rw_mbytes_per_sec": 0, 00:11:35.963 "r_mbytes_per_sec": 0, 00:11:35.964 "w_mbytes_per_sec": 0 00:11:35.964 }, 00:11:35.964 "claimed": false, 00:11:35.964 "zoned": false, 00:11:35.964 "supported_io_types": { 00:11:35.964 "read": true, 00:11:35.964 "write": true, 00:11:35.964 "unmap": true, 00:11:35.964 "flush": true, 00:11:35.964 "reset": true, 00:11:35.964 "nvme_admin": false, 00:11:35.964 "nvme_io": false, 00:11:35.964 "nvme_io_md": false, 00:11:35.964 "write_zeroes": 
true, 00:11:35.964 "zcopy": false, 00:11:35.964 "get_zone_info": false, 00:11:35.964 "zone_management": false, 00:11:35.964 "zone_append": false, 00:11:35.964 "compare": false, 00:11:35.964 "compare_and_write": false, 00:11:35.964 "abort": false, 00:11:35.964 "seek_hole": false, 00:11:35.964 "seek_data": false, 00:11:35.964 "copy": false, 00:11:35.964 "nvme_iov_md": false 00:11:35.964 }, 00:11:35.964 "memory_domains": [ 00:11:35.964 { 00:11:35.964 "dma_device_id": "system", 00:11:35.964 "dma_device_type": 1 00:11:35.964 }, 00:11:35.964 { 00:11:35.964 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:35.964 "dma_device_type": 2 00:11:35.964 }, 00:11:35.964 { 00:11:35.964 "dma_device_id": "system", 00:11:35.964 "dma_device_type": 1 00:11:35.964 }, 00:11:35.964 { 00:11:35.964 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:35.964 "dma_device_type": 2 00:11:35.964 } 00:11:35.964 ], 00:11:35.964 "driver_specific": { 00:11:35.964 "raid": { 00:11:35.964 "uuid": "c1ce97da-79d4-43a8-a878-352fe1e117dd", 00:11:35.964 "strip_size_kb": 64, 00:11:35.964 "state": "online", 00:11:35.964 "raid_level": "raid0", 00:11:35.964 "superblock": true, 00:11:35.964 "num_base_bdevs": 2, 00:11:35.964 "num_base_bdevs_discovered": 2, 00:11:35.964 "num_base_bdevs_operational": 2, 00:11:35.964 "base_bdevs_list": [ 00:11:35.964 { 00:11:35.964 "name": "pt1", 00:11:35.964 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:35.964 "is_configured": true, 00:11:35.964 "data_offset": 2048, 00:11:35.964 "data_size": 63488 00:11:35.964 }, 00:11:35.964 { 00:11:35.964 "name": "pt2", 00:11:35.964 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:35.964 "is_configured": true, 00:11:35.964 "data_offset": 2048, 00:11:35.964 "data_size": 63488 00:11:35.964 } 00:11:35.964 ] 00:11:35.964 } 00:11:35.964 } 00:11:35.964 }' 00:11:35.964 22:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:35.964 22:40:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:35.964 pt2' 00:11:35.964 22:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:35.964 22:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:35.964 22:40:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:36.529 22:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:36.529 "name": "pt1", 00:11:36.529 "aliases": [ 00:11:36.529 "00000000-0000-0000-0000-000000000001" 00:11:36.529 ], 00:11:36.529 "product_name": "passthru", 00:11:36.529 "block_size": 512, 00:11:36.529 "num_blocks": 65536, 00:11:36.529 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:36.529 "assigned_rate_limits": { 00:11:36.529 "rw_ios_per_sec": 0, 00:11:36.529 "rw_mbytes_per_sec": 0, 00:11:36.529 "r_mbytes_per_sec": 0, 00:11:36.529 "w_mbytes_per_sec": 0 00:11:36.529 }, 00:11:36.529 "claimed": true, 00:11:36.529 "claim_type": "exclusive_write", 00:11:36.529 "zoned": false, 00:11:36.529 "supported_io_types": { 00:11:36.529 "read": true, 00:11:36.529 "write": true, 00:11:36.529 "unmap": true, 00:11:36.529 "flush": true, 00:11:36.529 "reset": true, 00:11:36.529 "nvme_admin": false, 00:11:36.529 "nvme_io": false, 00:11:36.529 "nvme_io_md": false, 00:11:36.529 "write_zeroes": true, 00:11:36.529 "zcopy": true, 00:11:36.529 "get_zone_info": false, 00:11:36.529 "zone_management": false, 00:11:36.529 "zone_append": false, 00:11:36.529 "compare": false, 00:11:36.529 "compare_and_write": false, 00:11:36.529 "abort": true, 00:11:36.529 "seek_hole": false, 00:11:36.529 "seek_data": false, 00:11:36.529 "copy": true, 00:11:36.529 "nvme_iov_md": false 00:11:36.529 }, 00:11:36.529 "memory_domains": [ 00:11:36.529 { 00:11:36.529 "dma_device_id": "system", 00:11:36.529 
"dma_device_type": 1 00:11:36.529 }, 00:11:36.529 { 00:11:36.529 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:36.529 "dma_device_type": 2 00:11:36.529 } 00:11:36.529 ], 00:11:36.529 "driver_specific": { 00:11:36.529 "passthru": { 00:11:36.529 "name": "pt1", 00:11:36.529 "base_bdev_name": "malloc1" 00:11:36.529 } 00:11:36.529 } 00:11:36.529 }' 00:11:36.529 22:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:36.530 22:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:36.530 22:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:36.530 22:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:36.530 22:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:36.787 22:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:36.787 22:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:36.787 22:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:36.787 22:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:36.787 22:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:36.787 22:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:36.787 22:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:36.787 22:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:36.787 22:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:36.787 22:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:37.045 22:40:21 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:37.045 "name": "pt2", 00:11:37.045 "aliases": [ 00:11:37.045 "00000000-0000-0000-0000-000000000002" 00:11:37.045 ], 00:11:37.045 "product_name": "passthru", 00:11:37.045 "block_size": 512, 00:11:37.045 "num_blocks": 65536, 00:11:37.045 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:37.045 "assigned_rate_limits": { 00:11:37.045 "rw_ios_per_sec": 0, 00:11:37.045 "rw_mbytes_per_sec": 0, 00:11:37.045 "r_mbytes_per_sec": 0, 00:11:37.045 "w_mbytes_per_sec": 0 00:11:37.045 }, 00:11:37.045 "claimed": true, 00:11:37.045 "claim_type": "exclusive_write", 00:11:37.045 "zoned": false, 00:11:37.045 "supported_io_types": { 00:11:37.045 "read": true, 00:11:37.045 "write": true, 00:11:37.045 "unmap": true, 00:11:37.045 "flush": true, 00:11:37.045 "reset": true, 00:11:37.045 "nvme_admin": false, 00:11:37.045 "nvme_io": false, 00:11:37.045 "nvme_io_md": false, 00:11:37.045 "write_zeroes": true, 00:11:37.045 "zcopy": true, 00:11:37.045 "get_zone_info": false, 00:11:37.045 "zone_management": false, 00:11:37.045 "zone_append": false, 00:11:37.045 "compare": false, 00:11:37.045 "compare_and_write": false, 00:11:37.045 "abort": true, 00:11:37.045 "seek_hole": false, 00:11:37.045 "seek_data": false, 00:11:37.045 "copy": true, 00:11:37.045 "nvme_iov_md": false 00:11:37.045 }, 00:11:37.045 "memory_domains": [ 00:11:37.045 { 00:11:37.045 "dma_device_id": "system", 00:11:37.045 "dma_device_type": 1 00:11:37.045 }, 00:11:37.045 { 00:11:37.045 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:37.045 "dma_device_type": 2 00:11:37.045 } 00:11:37.045 ], 00:11:37.045 "driver_specific": { 00:11:37.045 "passthru": { 00:11:37.045 "name": "pt2", 00:11:37.045 "base_bdev_name": "malloc2" 00:11:37.045 } 00:11:37.045 } 00:11:37.045 }' 00:11:37.045 22:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:37.303 22:40:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:37.303 22:40:22 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:37.303 22:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:37.303 22:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:37.303 22:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:37.303 22:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:37.303 22:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:37.561 22:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:37.561 22:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:37.561 22:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:37.561 22:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:37.561 22:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:37.561 22:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:11:38.125 [2024-07-15 22:40:22.862590] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:38.125 22:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=c1ce97da-79d4-43a8-a878-352fe1e117dd 00:11:38.125 22:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z c1ce97da-79d4-43a8-a878-352fe1e117dd ']' 00:11:38.125 22:40:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:38.383 [2024-07-15 22:40:23.167157] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:38.383 
[2024-07-15 22:40:23.167178] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:38.383 [2024-07-15 22:40:23.167231] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:38.383 [2024-07-15 22:40:23.167275] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:38.383 [2024-07-15 22:40:23.167287] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfec270 name raid_bdev1, state offline 00:11:38.383 22:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:38.383 22:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:11:38.947 22:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:11:38.947 22:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:11:38.947 22:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:38.947 22:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:39.206 22:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:39.206 22:40:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:39.464 22:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:11:39.464 22:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:11:40.031 22:40:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:11:40.031 22:40:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:11:40.031 22:40:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:11:40.031 22:40:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:11:40.031 22:40:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:40.031 22:40:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:40.031 22:40:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:40.031 22:40:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:40.031 22:40:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:40.031 22:40:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:40.031 22:40:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:40.031 22:40:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:11:40.031 22:40:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:11:40.289 [2024-07-15 22:40:25.036025] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:11:40.289 [2024-07-15 22:40:25.037369] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:11:40.289 [2024-07-15 22:40:25.037425] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:11:40.289 [2024-07-15 22:40:25.037466] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:11:40.289 [2024-07-15 22:40:25.037485] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:40.289 [2024-07-15 22:40:25.037495] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfebff0 name raid_bdev1, state configuring 00:11:40.289 request: 00:11:40.289 { 00:11:40.289 "name": "raid_bdev1", 00:11:40.289 "raid_level": "raid0", 00:11:40.289 "base_bdevs": [ 00:11:40.289 "malloc1", 00:11:40.289 "malloc2" 00:11:40.289 ], 00:11:40.289 "strip_size_kb": 64, 00:11:40.289 "superblock": false, 00:11:40.289 "method": "bdev_raid_create", 00:11:40.289 "req_id": 1 00:11:40.289 } 00:11:40.289 Got JSON-RPC error response 00:11:40.289 response: 00:11:40.289 { 00:11:40.289 "code": -17, 00:11:40.289 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:11:40.289 } 00:11:40.289 22:40:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:11:40.289 22:40:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:11:40.289 22:40:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:11:40.289 22:40:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:11:40.289 22:40:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:11:40.289 22:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:40.548 22:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:11:40.548 22:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:11:40.548 22:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:40.807 [2024-07-15 22:40:25.533275] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:40.807 [2024-07-15 22:40:25.533308] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:40.807 [2024-07-15 22:40:25.533327] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe487a0 00:11:40.807 [2024-07-15 22:40:25.533339] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:40.807 [2024-07-15 22:40:25.534804] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:40.807 [2024-07-15 22:40:25.534831] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:40.807 [2024-07-15 22:40:25.534885] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:11:40.807 [2024-07-15 22:40:25.534909] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:40.807 pt1 00:11:40.807 22:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:11:40.807 22:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:40.807 22:40:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:40.807 22:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:40.807 22:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:40.807 22:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:40.807 22:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:40.807 22:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:40.807 22:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:40.807 22:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:40.807 22:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:40.807 22:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:41.066 22:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:41.066 "name": "raid_bdev1", 00:11:41.066 "uuid": "c1ce97da-79d4-43a8-a878-352fe1e117dd", 00:11:41.066 "strip_size_kb": 64, 00:11:41.066 "state": "configuring", 00:11:41.066 "raid_level": "raid0", 00:11:41.066 "superblock": true, 00:11:41.066 "num_base_bdevs": 2, 00:11:41.066 "num_base_bdevs_discovered": 1, 00:11:41.066 "num_base_bdevs_operational": 2, 00:11:41.066 "base_bdevs_list": [ 00:11:41.066 { 00:11:41.066 "name": "pt1", 00:11:41.066 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:41.066 "is_configured": true, 00:11:41.066 "data_offset": 2048, 00:11:41.066 "data_size": 63488 00:11:41.066 }, 00:11:41.066 { 00:11:41.066 "name": null, 00:11:41.066 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:41.066 
"is_configured": false, 00:11:41.066 "data_offset": 2048, 00:11:41.066 "data_size": 63488 00:11:41.066 } 00:11:41.066 ] 00:11:41.066 }' 00:11:41.066 22:40:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:41.066 22:40:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:42.002 22:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:11:42.002 22:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:11:42.002 22:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:42.002 22:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:42.002 [2024-07-15 22:40:26.860807] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:42.002 [2024-07-15 22:40:26.860855] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:42.002 [2024-07-15 22:40:26.860873] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfe2820 00:11:42.002 [2024-07-15 22:40:26.860885] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:42.002 [2024-07-15 22:40:26.861235] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:42.002 [2024-07-15 22:40:26.861254] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:42.002 [2024-07-15 22:40:26.861313] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:42.002 [2024-07-15 22:40:26.861332] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:42.002 [2024-07-15 22:40:26.861426] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe3eec0 00:11:42.002 [2024-07-15 
22:40:26.861437] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:42.002 [2024-07-15 22:40:26.861605] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe41530 00:11:42.002 [2024-07-15 22:40:26.861724] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe3eec0 00:11:42.002 [2024-07-15 22:40:26.861734] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe3eec0 00:11:42.002 [2024-07-15 22:40:26.861830] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:42.002 pt2 00:11:42.002 22:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:11:42.002 22:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:42.002 22:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:42.002 22:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:42.002 22:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:42.002 22:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:42.002 22:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:42.002 22:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:42.002 22:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:42.002 22:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:42.002 22:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:42.002 22:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:42.002 22:40:26 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:42.002 22:40:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:42.260 22:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:42.260 "name": "raid_bdev1", 00:11:42.260 "uuid": "c1ce97da-79d4-43a8-a878-352fe1e117dd", 00:11:42.260 "strip_size_kb": 64, 00:11:42.260 "state": "online", 00:11:42.260 "raid_level": "raid0", 00:11:42.260 "superblock": true, 00:11:42.260 "num_base_bdevs": 2, 00:11:42.260 "num_base_bdevs_discovered": 2, 00:11:42.260 "num_base_bdevs_operational": 2, 00:11:42.260 "base_bdevs_list": [ 00:11:42.260 { 00:11:42.260 "name": "pt1", 00:11:42.260 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:42.260 "is_configured": true, 00:11:42.260 "data_offset": 2048, 00:11:42.260 "data_size": 63488 00:11:42.260 }, 00:11:42.260 { 00:11:42.260 "name": "pt2", 00:11:42.260 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:42.261 "is_configured": true, 00:11:42.261 "data_offset": 2048, 00:11:42.261 "data_size": 63488 00:11:42.261 } 00:11:42.261 ] 00:11:42.261 }' 00:11:42.261 22:40:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:42.261 22:40:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:43.196 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:11:43.196 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:43.196 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:43.196 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:43.196 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:43.196 22:40:28 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:43.196 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:43.196 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:43.455 [2024-07-15 22:40:28.244723] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:43.455 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:43.455 "name": "raid_bdev1", 00:11:43.455 "aliases": [ 00:11:43.455 "c1ce97da-79d4-43a8-a878-352fe1e117dd" 00:11:43.455 ], 00:11:43.455 "product_name": "Raid Volume", 00:11:43.455 "block_size": 512, 00:11:43.455 "num_blocks": 126976, 00:11:43.455 "uuid": "c1ce97da-79d4-43a8-a878-352fe1e117dd", 00:11:43.455 "assigned_rate_limits": { 00:11:43.455 "rw_ios_per_sec": 0, 00:11:43.455 "rw_mbytes_per_sec": 0, 00:11:43.455 "r_mbytes_per_sec": 0, 00:11:43.455 "w_mbytes_per_sec": 0 00:11:43.455 }, 00:11:43.455 "claimed": false, 00:11:43.455 "zoned": false, 00:11:43.455 "supported_io_types": { 00:11:43.455 "read": true, 00:11:43.455 "write": true, 00:11:43.455 "unmap": true, 00:11:43.455 "flush": true, 00:11:43.455 "reset": true, 00:11:43.455 "nvme_admin": false, 00:11:43.455 "nvme_io": false, 00:11:43.455 "nvme_io_md": false, 00:11:43.455 "write_zeroes": true, 00:11:43.455 "zcopy": false, 00:11:43.455 "get_zone_info": false, 00:11:43.455 "zone_management": false, 00:11:43.455 "zone_append": false, 00:11:43.455 "compare": false, 00:11:43.455 "compare_and_write": false, 00:11:43.455 "abort": false, 00:11:43.455 "seek_hole": false, 00:11:43.455 "seek_data": false, 00:11:43.455 "copy": false, 00:11:43.455 "nvme_iov_md": false 00:11:43.455 }, 00:11:43.455 "memory_domains": [ 00:11:43.455 { 00:11:43.455 "dma_device_id": "system", 00:11:43.455 "dma_device_type": 1 00:11:43.455 }, 00:11:43.455 { 
00:11:43.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:43.455 "dma_device_type": 2 00:11:43.455 }, 00:11:43.455 { 00:11:43.455 "dma_device_id": "system", 00:11:43.455 "dma_device_type": 1 00:11:43.455 }, 00:11:43.455 { 00:11:43.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:43.455 "dma_device_type": 2 00:11:43.455 } 00:11:43.455 ], 00:11:43.455 "driver_specific": { 00:11:43.455 "raid": { 00:11:43.455 "uuid": "c1ce97da-79d4-43a8-a878-352fe1e117dd", 00:11:43.455 "strip_size_kb": 64, 00:11:43.455 "state": "online", 00:11:43.455 "raid_level": "raid0", 00:11:43.455 "superblock": true, 00:11:43.455 "num_base_bdevs": 2, 00:11:43.455 "num_base_bdevs_discovered": 2, 00:11:43.455 "num_base_bdevs_operational": 2, 00:11:43.455 "base_bdevs_list": [ 00:11:43.455 { 00:11:43.455 "name": "pt1", 00:11:43.455 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:43.455 "is_configured": true, 00:11:43.455 "data_offset": 2048, 00:11:43.455 "data_size": 63488 00:11:43.455 }, 00:11:43.455 { 00:11:43.455 "name": "pt2", 00:11:43.455 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:43.455 "is_configured": true, 00:11:43.455 "data_offset": 2048, 00:11:43.455 "data_size": 63488 00:11:43.455 } 00:11:43.455 ] 00:11:43.455 } 00:11:43.455 } 00:11:43.455 }' 00:11:43.455 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:43.455 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:43.455 pt2' 00:11:43.455 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:43.455 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:43.455 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:43.713 22:40:28 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:43.713 "name": "pt1", 00:11:43.713 "aliases": [ 00:11:43.713 "00000000-0000-0000-0000-000000000001" 00:11:43.713 ], 00:11:43.713 "product_name": "passthru", 00:11:43.713 "block_size": 512, 00:11:43.713 "num_blocks": 65536, 00:11:43.713 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:43.713 "assigned_rate_limits": { 00:11:43.713 "rw_ios_per_sec": 0, 00:11:43.713 "rw_mbytes_per_sec": 0, 00:11:43.713 "r_mbytes_per_sec": 0, 00:11:43.713 "w_mbytes_per_sec": 0 00:11:43.713 }, 00:11:43.713 "claimed": true, 00:11:43.713 "claim_type": "exclusive_write", 00:11:43.713 "zoned": false, 00:11:43.713 "supported_io_types": { 00:11:43.713 "read": true, 00:11:43.713 "write": true, 00:11:43.713 "unmap": true, 00:11:43.713 "flush": true, 00:11:43.713 "reset": true, 00:11:43.713 "nvme_admin": false, 00:11:43.713 "nvme_io": false, 00:11:43.713 "nvme_io_md": false, 00:11:43.713 "write_zeroes": true, 00:11:43.713 "zcopy": true, 00:11:43.713 "get_zone_info": false, 00:11:43.713 "zone_management": false, 00:11:43.713 "zone_append": false, 00:11:43.713 "compare": false, 00:11:43.713 "compare_and_write": false, 00:11:43.713 "abort": true, 00:11:43.713 "seek_hole": false, 00:11:43.713 "seek_data": false, 00:11:43.713 "copy": true, 00:11:43.713 "nvme_iov_md": false 00:11:43.713 }, 00:11:43.713 "memory_domains": [ 00:11:43.713 { 00:11:43.713 "dma_device_id": "system", 00:11:43.713 "dma_device_type": 1 00:11:43.713 }, 00:11:43.713 { 00:11:43.713 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:43.713 "dma_device_type": 2 00:11:43.713 } 00:11:43.713 ], 00:11:43.713 "driver_specific": { 00:11:43.713 "passthru": { 00:11:43.713 "name": "pt1", 00:11:43.713 "base_bdev_name": "malloc1" 00:11:43.713 } 00:11:43.713 } 00:11:43.713 }' 00:11:43.713 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:43.972 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:11:43.972 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:43.972 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:43.972 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:43.972 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:43.972 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:43.972 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:43.972 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:43.972 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:44.231 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:44.231 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:44.231 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:44.231 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:44.231 22:40:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:44.491 22:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:44.491 "name": "pt2", 00:11:44.491 "aliases": [ 00:11:44.491 "00000000-0000-0000-0000-000000000002" 00:11:44.491 ], 00:11:44.491 "product_name": "passthru", 00:11:44.491 "block_size": 512, 00:11:44.491 "num_blocks": 65536, 00:11:44.491 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:44.491 "assigned_rate_limits": { 00:11:44.491 "rw_ios_per_sec": 0, 00:11:44.491 "rw_mbytes_per_sec": 0, 00:11:44.491 "r_mbytes_per_sec": 0, 00:11:44.491 "w_mbytes_per_sec": 0 00:11:44.491 }, 
00:11:44.491 "claimed": true, 00:11:44.491 "claim_type": "exclusive_write", 00:11:44.491 "zoned": false, 00:11:44.491 "supported_io_types": { 00:11:44.491 "read": true, 00:11:44.491 "write": true, 00:11:44.491 "unmap": true, 00:11:44.491 "flush": true, 00:11:44.491 "reset": true, 00:11:44.491 "nvme_admin": false, 00:11:44.491 "nvme_io": false, 00:11:44.491 "nvme_io_md": false, 00:11:44.491 "write_zeroes": true, 00:11:44.491 "zcopy": true, 00:11:44.491 "get_zone_info": false, 00:11:44.491 "zone_management": false, 00:11:44.491 "zone_append": false, 00:11:44.491 "compare": false, 00:11:44.491 "compare_and_write": false, 00:11:44.491 "abort": true, 00:11:44.491 "seek_hole": false, 00:11:44.491 "seek_data": false, 00:11:44.491 "copy": true, 00:11:44.491 "nvme_iov_md": false 00:11:44.491 }, 00:11:44.491 "memory_domains": [ 00:11:44.491 { 00:11:44.491 "dma_device_id": "system", 00:11:44.491 "dma_device_type": 1 00:11:44.491 }, 00:11:44.491 { 00:11:44.491 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:44.491 "dma_device_type": 2 00:11:44.491 } 00:11:44.491 ], 00:11:44.491 "driver_specific": { 00:11:44.491 "passthru": { 00:11:44.491 "name": "pt2", 00:11:44.491 "base_bdev_name": "malloc2" 00:11:44.491 } 00:11:44.491 } 00:11:44.491 }' 00:11:44.491 22:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:44.491 22:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:44.491 22:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:44.491 22:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:44.491 22:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:44.491 22:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:44.491 22:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:44.750 22:40:29 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:44.750 22:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:44.750 22:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:44.750 22:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:44.750 22:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:44.750 22:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:44.750 22:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:11:45.009 [2024-07-15 22:40:29.756757] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:45.009 22:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' c1ce97da-79d4-43a8-a878-352fe1e117dd '!=' c1ce97da-79d4-43a8-a878-352fe1e117dd ']' 00:11:45.009 22:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:11:45.009 22:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:45.009 22:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:45.009 22:40:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2699800 00:11:45.009 22:40:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2699800 ']' 00:11:45.009 22:40:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2699800 00:11:45.009 22:40:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:11:45.009 22:40:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:45.009 22:40:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2699800 00:11:45.009 
22:40:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:45.009 22:40:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:45.009 22:40:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2699800' 00:11:45.009 killing process with pid 2699800 00:11:45.009 22:40:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2699800 00:11:45.009 [2024-07-15 22:40:29.829977] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:45.009 [2024-07-15 22:40:29.830039] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:45.009 [2024-07-15 22:40:29.830083] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:45.009 [2024-07-15 22:40:29.830094] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe3eec0 name raid_bdev1, state offline 00:11:45.009 22:40:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2699800 00:11:45.009 [2024-07-15 22:40:29.849363] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:45.269 22:40:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:11:45.269 00:11:45.269 real 0m14.444s 00:11:45.269 user 0m26.204s 00:11:45.269 sys 0m2.412s 00:11:45.269 22:40:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:45.269 22:40:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:45.269 ************************************ 00:11:45.269 END TEST raid_superblock_test 00:11:45.269 ************************************ 00:11:45.269 22:40:30 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:45.269 22:40:30 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:11:45.269 22:40:30 bdev_raid -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:45.269 22:40:30 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:45.269 22:40:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:45.269 ************************************ 00:11:45.269 START TEST raid_read_error_test 00:11:45.269 ************************************ 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 read 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@793 -- # local strip_size 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ILJFv142Y1 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2701947 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2701947 /var/tmp/spdk-raid.sock 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2701947 ']' 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:11:45.269 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:45.269 22:40:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:45.528 [2024-07-15 22:40:30.235387] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:11:45.528 [2024-07-15 22:40:30.235459] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2701947 ] 00:11:45.528 [2024-07-15 22:40:30.366108] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:45.787 [2024-07-15 22:40:30.464360] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:45.787 [2024-07-15 22:40:30.527801] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:45.787 [2024-07-15 22:40:30.527846] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:46.355 22:40:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:46.355 22:40:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:46.355 22:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:46.355 22:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:46.614 BaseBdev1_malloc 00:11:46.614 22:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:46.873 true 00:11:46.873 22:40:31 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:47.132 [2024-07-15 22:40:31.902379] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:47.132 [2024-07-15 22:40:31.902422] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:47.132 [2024-07-15 22:40:31.902441] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x290a0d0 00:11:47.132 [2024-07-15 22:40:31.902453] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:47.132 [2024-07-15 22:40:31.904164] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:47.132 [2024-07-15 22:40:31.904191] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:47.132 BaseBdev1 00:11:47.132 22:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:47.132 22:40:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:47.392 BaseBdev2_malloc 00:11:47.392 22:40:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:47.651 true 00:11:47.651 22:40:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:47.911 [2024-07-15 22:40:32.648838] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:47.911 [2024-07-15 22:40:32.648887] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:47.911 [2024-07-15 22:40:32.648906] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x290e910 00:11:47.911 [2024-07-15 22:40:32.648919] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:47.911 [2024-07-15 22:40:32.650364] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:47.911 [2024-07-15 22:40:32.650392] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:47.911 BaseBdev2 00:11:47.911 22:40:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:48.173 [2024-07-15 22:40:32.897520] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:48.173 [2024-07-15 22:40:32.898760] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:48.173 [2024-07-15 22:40:32.898956] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2910320 00:11:48.173 [2024-07-15 22:40:32.898970] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:48.173 [2024-07-15 22:40:32.899156] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x290f270 00:11:48.173 [2024-07-15 22:40:32.899298] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2910320 00:11:48.173 [2024-07-15 22:40:32.899308] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2910320 00:11:48.173 [2024-07-15 22:40:32.899408] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:48.173 22:40:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:48.173 22:40:32 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:48.173 22:40:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:48.173 22:40:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:48.173 22:40:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:48.173 22:40:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:48.173 22:40:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:48.173 22:40:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:48.173 22:40:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:48.173 22:40:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:48.173 22:40:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:48.173 22:40:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:48.440 22:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:48.440 "name": "raid_bdev1", 00:11:48.440 "uuid": "6cc69b73-79d4-4067-8b45-9ebc8ca849d4", 00:11:48.440 "strip_size_kb": 64, 00:11:48.440 "state": "online", 00:11:48.440 "raid_level": "raid0", 00:11:48.440 "superblock": true, 00:11:48.440 "num_base_bdevs": 2, 00:11:48.440 "num_base_bdevs_discovered": 2, 00:11:48.440 "num_base_bdevs_operational": 2, 00:11:48.440 "base_bdevs_list": [ 00:11:48.440 { 00:11:48.440 "name": "BaseBdev1", 00:11:48.440 "uuid": "b3905bbf-7dcc-58cf-8b81-6aeeaaca8198", 00:11:48.440 "is_configured": true, 00:11:48.440 "data_offset": 2048, 00:11:48.440 "data_size": 63488 00:11:48.440 }, 
00:11:48.440 { 00:11:48.440 "name": "BaseBdev2", 00:11:48.440 "uuid": "b4499b16-877a-590b-98f7-765e71af9a68", 00:11:48.440 "is_configured": true, 00:11:48.440 "data_offset": 2048, 00:11:48.440 "data_size": 63488 00:11:48.440 } 00:11:48.440 ] 00:11:48.440 }' 00:11:48.440 22:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:48.440 22:40:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:49.009 22:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:49.009 22:40:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:49.009 [2024-07-15 22:40:33.872403] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x290b9b0 00:11:49.984 22:40:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:11:50.244 22:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:50.244 22:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:11:50.244 22:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:50.244 22:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:50.244 22:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:50.244 22:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:50.244 22:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:50.244 22:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:11:50.244 22:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:50.244 22:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:50.244 22:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:50.244 22:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:50.244 22:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:50.244 22:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:50.244 22:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:50.503 22:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:50.503 "name": "raid_bdev1", 00:11:50.503 "uuid": "6cc69b73-79d4-4067-8b45-9ebc8ca849d4", 00:11:50.503 "strip_size_kb": 64, 00:11:50.503 "state": "online", 00:11:50.503 "raid_level": "raid0", 00:11:50.503 "superblock": true, 00:11:50.503 "num_base_bdevs": 2, 00:11:50.503 "num_base_bdevs_discovered": 2, 00:11:50.503 "num_base_bdevs_operational": 2, 00:11:50.503 "base_bdevs_list": [ 00:11:50.503 { 00:11:50.503 "name": "BaseBdev1", 00:11:50.503 "uuid": "b3905bbf-7dcc-58cf-8b81-6aeeaaca8198", 00:11:50.503 "is_configured": true, 00:11:50.503 "data_offset": 2048, 00:11:50.503 "data_size": 63488 00:11:50.503 }, 00:11:50.503 { 00:11:50.503 "name": "BaseBdev2", 00:11:50.503 "uuid": "b4499b16-877a-590b-98f7-765e71af9a68", 00:11:50.503 "is_configured": true, 00:11:50.503 "data_offset": 2048, 00:11:50.503 "data_size": 63488 00:11:50.503 } 00:11:50.503 ] 00:11:50.503 }' 00:11:50.503 22:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:50.503 22:40:35 bdev_raid.raid_read_error_test 
-- common/autotest_common.sh@10 -- # set +x 00:11:51.072 22:40:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:51.331 [2024-07-15 22:40:36.105611] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:51.331 [2024-07-15 22:40:36.105657] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:51.331 [2024-07-15 22:40:36.108879] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:51.331 [2024-07-15 22:40:36.108910] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:51.331 [2024-07-15 22:40:36.108946] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:51.331 [2024-07-15 22:40:36.108958] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2910320 name raid_bdev1, state offline 00:11:51.331 0 00:11:51.331 22:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2701947 00:11:51.331 22:40:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2701947 ']' 00:11:51.331 22:40:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2701947 00:11:51.331 22:40:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:11:51.331 22:40:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:51.331 22:40:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2701947 00:11:51.331 22:40:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:51.331 22:40:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:51.331 22:40:36 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 2701947' 00:11:51.331 killing process with pid 2701947 00:11:51.331 22:40:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2701947 00:11:51.331 [2024-07-15 22:40:36.190302] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:51.331 22:40:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2701947 00:11:51.331 [2024-07-15 22:40:36.200572] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:51.590 22:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ILJFv142Y1 00:11:51.590 22:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:51.590 22:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:51.590 22:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:11:51.590 22:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:11:51.590 22:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:51.590 22:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:51.590 22:40:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:11:51.590 00:11:51.590 real 0m6.263s 00:11:51.590 user 0m9.810s 00:11:51.590 sys 0m1.103s 00:11:51.590 22:40:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:51.590 22:40:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:51.590 ************************************ 00:11:51.590 END TEST raid_read_error_test 00:11:51.590 ************************************ 00:11:51.590 22:40:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:51.590 22:40:36 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 
00:11:51.590 22:40:36 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:51.590 22:40:36 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:51.590 22:40:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:51.850 ************************************ 00:11:51.850 START TEST raid_write_error_test 00:11:51.850 ************************************ 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 write 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 
00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.70nXdjZjcZ 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2702800 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2702800 /var/tmp/spdk-raid.sock 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2702800 ']' 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:11:51.850 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:51.850 22:40:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:51.850 [2024-07-15 22:40:36.584557] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:11:51.850 [2024-07-15 22:40:36.584628] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2702800 ] 00:11:51.850 [2024-07-15 22:40:36.714773] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:52.109 [2024-07-15 22:40:36.817531] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:52.109 [2024-07-15 22:40:36.886190] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:52.109 [2024-07-15 22:40:36.886239] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:52.676 22:40:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:52.676 22:40:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:11:52.676 22:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:52.676 22:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:52.935 BaseBdev1_malloc 00:11:52.935 22:40:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:53.193 true 00:11:53.193 22:40:38 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:53.451 [2024-07-15 22:40:38.232293] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:53.451 [2024-07-15 22:40:38.232339] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:53.451 [2024-07-15 22:40:38.232360] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x286a0d0 00:11:53.451 [2024-07-15 22:40:38.232373] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:53.451 [2024-07-15 22:40:38.234154] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:53.451 [2024-07-15 22:40:38.234189] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:53.451 BaseBdev1 00:11:53.451 22:40:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:53.451 22:40:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:53.708 BaseBdev2_malloc 00:11:53.708 22:40:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:53.966 true 00:11:53.966 22:40:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:54.223 [2024-07-15 22:40:38.978809] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:54.223 [2024-07-15 22:40:38.978852] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:54.223 [2024-07-15 22:40:38.978870] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x286e910 00:11:54.223 [2024-07-15 22:40:38.978884] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:54.223 [2024-07-15 22:40:38.980251] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:54.223 [2024-07-15 22:40:38.980277] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:54.223 BaseBdev2 00:11:54.223 22:40:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:54.481 [2024-07-15 22:40:39.219490] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:54.481 [2024-07-15 22:40:39.220757] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:54.481 [2024-07-15 22:40:39.220956] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2870320 00:11:54.481 [2024-07-15 22:40:39.220970] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:54.481 [2024-07-15 22:40:39.221158] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x286f270 00:11:54.481 [2024-07-15 22:40:39.221300] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2870320 00:11:54.481 [2024-07-15 22:40:39.221310] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2870320 00:11:54.481 [2024-07-15 22:40:39.221410] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:54.481 22:40:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:54.481 22:40:39 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:54.481 22:40:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:54.481 22:40:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:54.481 22:40:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:54.481 22:40:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:54.481 22:40:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:54.481 22:40:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:54.481 22:40:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:54.481 22:40:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:54.481 22:40:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:54.481 22:40:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:54.778 22:40:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:54.778 "name": "raid_bdev1", 00:11:54.778 "uuid": "149bbb65-ecb9-4b33-812d-77d912eed9eb", 00:11:54.778 "strip_size_kb": 64, 00:11:54.778 "state": "online", 00:11:54.778 "raid_level": "raid0", 00:11:54.778 "superblock": true, 00:11:54.778 "num_base_bdevs": 2, 00:11:54.778 "num_base_bdevs_discovered": 2, 00:11:54.778 "num_base_bdevs_operational": 2, 00:11:54.778 "base_bdevs_list": [ 00:11:54.778 { 00:11:54.778 "name": "BaseBdev1", 00:11:54.778 "uuid": "9529486d-744a-52c8-a38a-267740d5a08c", 00:11:54.778 "is_configured": true, 00:11:54.778 "data_offset": 2048, 00:11:54.778 "data_size": 63488 00:11:54.778 
}, 00:11:54.778 { 00:11:54.778 "name": "BaseBdev2", 00:11:54.778 "uuid": "48f6c609-e824-5e17-be95-36d850129cf0", 00:11:54.778 "is_configured": true, 00:11:54.778 "data_offset": 2048, 00:11:54.778 "data_size": 63488 00:11:54.778 } 00:11:54.778 ] 00:11:54.778 }' 00:11:54.778 22:40:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:54.778 22:40:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:55.343 22:40:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:55.343 22:40:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:55.343 [2024-07-15 22:40:40.226505] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x286b9b0 00:11:56.275 22:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:11:56.533 22:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:56.533 22:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:11:56.533 22:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:56.533 22:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:56.533 22:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:56.533 22:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:56.533 22:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:56.533 22:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=64 00:11:56.533 22:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:56.533 22:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:56.533 22:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:56.533 22:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:56.533 22:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:56.533 22:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:56.533 22:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:56.791 22:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:56.791 "name": "raid_bdev1", 00:11:56.791 "uuid": "149bbb65-ecb9-4b33-812d-77d912eed9eb", 00:11:56.791 "strip_size_kb": 64, 00:11:56.791 "state": "online", 00:11:56.791 "raid_level": "raid0", 00:11:56.791 "superblock": true, 00:11:56.791 "num_base_bdevs": 2, 00:11:56.791 "num_base_bdevs_discovered": 2, 00:11:56.791 "num_base_bdevs_operational": 2, 00:11:56.791 "base_bdevs_list": [ 00:11:56.791 { 00:11:56.791 "name": "BaseBdev1", 00:11:56.791 "uuid": "9529486d-744a-52c8-a38a-267740d5a08c", 00:11:56.791 "is_configured": true, 00:11:56.791 "data_offset": 2048, 00:11:56.791 "data_size": 63488 00:11:56.791 }, 00:11:56.791 { 00:11:56.791 "name": "BaseBdev2", 00:11:56.791 "uuid": "48f6c609-e824-5e17-be95-36d850129cf0", 00:11:56.791 "is_configured": true, 00:11:56.791 "data_offset": 2048, 00:11:56.791 "data_size": 63488 00:11:56.791 } 00:11:56.791 ] 00:11:56.791 }' 00:11:56.791 22:40:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:56.791 22:40:41 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:57.359 22:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:57.618 [2024-07-15 22:40:42.444015] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:57.618 [2024-07-15 22:40:42.444054] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:57.618 [2024-07-15 22:40:42.447220] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:57.618 [2024-07-15 22:40:42.447250] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:57.618 [2024-07-15 22:40:42.447278] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:57.618 [2024-07-15 22:40:42.447289] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2870320 name raid_bdev1, state offline 00:11:57.618 0 00:11:57.618 22:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2702800 00:11:57.618 22:40:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2702800 ']' 00:11:57.618 22:40:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2702800 00:11:57.618 22:40:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:11:57.618 22:40:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:57.618 22:40:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2702800 00:11:57.618 22:40:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:57.618 22:40:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:57.618 22:40:42 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2702800' 00:11:57.618 killing process with pid 2702800 00:11:57.618 22:40:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2702800 00:11:57.618 [2024-07-15 22:40:42.506582] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:57.618 22:40:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2702800 00:11:57.618 [2024-07-15 22:40:42.516918] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:57.877 22:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.70nXdjZjcZ 00:11:57.877 22:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:57.877 22:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:57.877 22:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:11:57.877 22:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:11:57.877 22:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:57.877 22:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:57.877 22:40:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:11:57.877 00:11:57.877 real 0m6.234s 00:11:57.877 user 0m9.781s 00:11:57.877 sys 0m1.069s 00:11:57.877 22:40:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:57.877 22:40:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:57.877 ************************************ 00:11:57.877 END TEST raid_write_error_test 00:11:57.877 ************************************ 00:11:57.877 22:40:42 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:57.877 22:40:42 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in 
raid0 concat raid1 00:11:57.877 22:40:42 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:11:57.877 22:40:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:57.877 22:40:42 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:57.877 22:40:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:58.137 ************************************ 00:11:58.137 START TEST raid_state_function_test 00:11:58.137 ************************************ 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 false 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:58.137 22:40:42 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2703730 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2703730' 00:11:58.137 Process raid pid: 2703730 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2703730 /var/tmp/spdk-raid.sock 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2703730 ']' 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:58.137 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:58.137 22:40:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:58.137 [2024-07-15 22:40:42.900339] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:11:58.137 [2024-07-15 22:40:42.900411] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:58.138 [2024-07-15 22:40:43.033001] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:58.396 [2024-07-15 22:40:43.135268] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:58.396 [2024-07-15 22:40:43.196353] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:58.396 [2024-07-15 22:40:43.196386] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:58.963 22:40:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:58.963 22:40:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:11:58.963 22:40:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n 
Existed_Raid 00:11:59.222 [2024-07-15 22:40:43.991221] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:59.222 [2024-07-15 22:40:43.991266] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:59.222 [2024-07-15 22:40:43.991278] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:59.222 [2024-07-15 22:40:43.991289] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:59.222 22:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:59.222 22:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:59.222 22:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:59.222 22:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:59.222 22:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:59.222 22:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:59.222 22:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:59.222 22:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:59.222 22:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:59.222 22:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:59.222 22:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:59.222 22:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:11:59.480 22:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:59.480 "name": "Existed_Raid", 00:11:59.480 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:59.480 "strip_size_kb": 64, 00:11:59.480 "state": "configuring", 00:11:59.480 "raid_level": "concat", 00:11:59.480 "superblock": false, 00:11:59.480 "num_base_bdevs": 2, 00:11:59.480 "num_base_bdevs_discovered": 0, 00:11:59.480 "num_base_bdevs_operational": 2, 00:11:59.480 "base_bdevs_list": [ 00:11:59.481 { 00:11:59.481 "name": "BaseBdev1", 00:11:59.481 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:59.481 "is_configured": false, 00:11:59.481 "data_offset": 0, 00:11:59.481 "data_size": 0 00:11:59.481 }, 00:11:59.481 { 00:11:59.481 "name": "BaseBdev2", 00:11:59.481 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:59.481 "is_configured": false, 00:11:59.481 "data_offset": 0, 00:11:59.481 "data_size": 0 00:11:59.481 } 00:11:59.481 ] 00:11:59.481 }' 00:11:59.481 22:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:59.481 22:40:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:00.048 22:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:00.048 [2024-07-15 22:40:44.885469] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:00.048 [2024-07-15 22:40:44.885499] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x173fa80 name Existed_Raid, state configuring 00:12:00.048 22:40:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:00.308 [2024-07-15 22:40:45.053936] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:00.308 [2024-07-15 22:40:45.053962] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:00.308 [2024-07-15 22:40:45.053972] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:00.308 [2024-07-15 22:40:45.053989] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:00.308 22:40:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:00.568 [2024-07-15 22:40:45.240329] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:00.568 BaseBdev1 00:12:00.568 22:40:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:00.568 22:40:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:00.568 22:40:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:00.568 22:40:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:00.568 22:40:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:00.568 22:40:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:00.568 22:40:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:00.568 22:40:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:00.828 [ 00:12:00.828 { 00:12:00.828 "name": 
"BaseBdev1", 00:12:00.828 "aliases": [ 00:12:00.828 "f32c5b67-ffa3-4673-88ca-2796ab1f7245" 00:12:00.828 ], 00:12:00.828 "product_name": "Malloc disk", 00:12:00.828 "block_size": 512, 00:12:00.828 "num_blocks": 65536, 00:12:00.828 "uuid": "f32c5b67-ffa3-4673-88ca-2796ab1f7245", 00:12:00.828 "assigned_rate_limits": { 00:12:00.828 "rw_ios_per_sec": 0, 00:12:00.828 "rw_mbytes_per_sec": 0, 00:12:00.828 "r_mbytes_per_sec": 0, 00:12:00.828 "w_mbytes_per_sec": 0 00:12:00.828 }, 00:12:00.828 "claimed": true, 00:12:00.828 "claim_type": "exclusive_write", 00:12:00.828 "zoned": false, 00:12:00.828 "supported_io_types": { 00:12:00.828 "read": true, 00:12:00.828 "write": true, 00:12:00.828 "unmap": true, 00:12:00.828 "flush": true, 00:12:00.828 "reset": true, 00:12:00.828 "nvme_admin": false, 00:12:00.828 "nvme_io": false, 00:12:00.828 "nvme_io_md": false, 00:12:00.828 "write_zeroes": true, 00:12:00.828 "zcopy": true, 00:12:00.828 "get_zone_info": false, 00:12:00.828 "zone_management": false, 00:12:00.828 "zone_append": false, 00:12:00.828 "compare": false, 00:12:00.828 "compare_and_write": false, 00:12:00.828 "abort": true, 00:12:00.828 "seek_hole": false, 00:12:00.828 "seek_data": false, 00:12:00.828 "copy": true, 00:12:00.828 "nvme_iov_md": false 00:12:00.828 }, 00:12:00.828 "memory_domains": [ 00:12:00.828 { 00:12:00.828 "dma_device_id": "system", 00:12:00.828 "dma_device_type": 1 00:12:00.828 }, 00:12:00.828 { 00:12:00.828 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:00.828 "dma_device_type": 2 00:12:00.828 } 00:12:00.828 ], 00:12:00.828 "driver_specific": {} 00:12:00.828 } 00:12:00.828 ] 00:12:00.828 22:40:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:00.828 22:40:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:00.828 22:40:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:00.828 
22:40:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:00.828 22:40:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:00.828 22:40:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:00.828 22:40:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:00.828 22:40:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:00.828 22:40:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:00.828 22:40:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:00.828 22:40:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:00.828 22:40:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:00.828 22:40:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:01.087 22:40:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:01.087 "name": "Existed_Raid", 00:12:01.087 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:01.087 "strip_size_kb": 64, 00:12:01.087 "state": "configuring", 00:12:01.087 "raid_level": "concat", 00:12:01.087 "superblock": false, 00:12:01.087 "num_base_bdevs": 2, 00:12:01.087 "num_base_bdevs_discovered": 1, 00:12:01.087 "num_base_bdevs_operational": 2, 00:12:01.087 "base_bdevs_list": [ 00:12:01.087 { 00:12:01.088 "name": "BaseBdev1", 00:12:01.088 "uuid": "f32c5b67-ffa3-4673-88ca-2796ab1f7245", 00:12:01.088 "is_configured": true, 00:12:01.088 "data_offset": 0, 00:12:01.088 "data_size": 65536 00:12:01.088 }, 00:12:01.088 { 00:12:01.088 "name": "BaseBdev2", 
00:12:01.088 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:01.088 "is_configured": false, 00:12:01.088 "data_offset": 0, 00:12:01.088 "data_size": 0 00:12:01.088 } 00:12:01.088 ] 00:12:01.088 }' 00:12:01.088 22:40:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:01.088 22:40:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:01.656 22:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:01.915 [2024-07-15 22:40:46.623989] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:01.915 [2024-07-15 22:40:46.624030] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x173f350 name Existed_Raid, state configuring 00:12:01.915 22:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:02.174 [2024-07-15 22:40:46.872680] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:02.174 [2024-07-15 22:40:46.874374] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:02.174 [2024-07-15 22:40:46.874409] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:02.174 22:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:02.174 22:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:02.174 22:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:02.174 22:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:12:02.174 22:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:02.174 22:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:02.174 22:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:02.174 22:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:02.174 22:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:02.174 22:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:02.174 22:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:02.174 22:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:02.174 22:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:02.174 22:40:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:02.433 22:40:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:02.433 "name": "Existed_Raid", 00:12:02.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:02.433 "strip_size_kb": 64, 00:12:02.433 "state": "configuring", 00:12:02.433 "raid_level": "concat", 00:12:02.433 "superblock": false, 00:12:02.433 "num_base_bdevs": 2, 00:12:02.433 "num_base_bdevs_discovered": 1, 00:12:02.433 "num_base_bdevs_operational": 2, 00:12:02.433 "base_bdevs_list": [ 00:12:02.433 { 00:12:02.433 "name": "BaseBdev1", 00:12:02.433 "uuid": "f32c5b67-ffa3-4673-88ca-2796ab1f7245", 00:12:02.433 "is_configured": true, 00:12:02.433 "data_offset": 0, 00:12:02.433 "data_size": 65536 00:12:02.433 }, 00:12:02.433 { 00:12:02.433 "name": 
"BaseBdev2", 00:12:02.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:02.433 "is_configured": false, 00:12:02.433 "data_offset": 0, 00:12:02.433 "data_size": 0 00:12:02.433 } 00:12:02.433 ] 00:12:02.433 }' 00:12:02.433 22:40:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:02.433 22:40:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:03.001 22:40:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:03.260 [2024-07-15 22:40:47.980236] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:03.260 [2024-07-15 22:40:47.980279] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1740000 00:12:03.260 [2024-07-15 22:40:47.980288] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:12:03.260 [2024-07-15 22:40:47.980482] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x165a0c0 00:12:03.260 [2024-07-15 22:40:47.980605] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1740000 00:12:03.260 [2024-07-15 22:40:47.980616] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1740000 00:12:03.260 [2024-07-15 22:40:47.980790] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:03.260 BaseBdev2 00:12:03.260 22:40:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:03.260 22:40:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:03.260 22:40:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:03.260 22:40:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 
00:12:03.260 22:40:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:03.260 22:40:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:03.260 22:40:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:03.520 22:40:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:03.779 [ 00:12:03.779 { 00:12:03.779 "name": "BaseBdev2", 00:12:03.779 "aliases": [ 00:12:03.779 "b2fdf530-b75c-4c0e-81d6-130fed4a7fb6" 00:12:03.779 ], 00:12:03.779 "product_name": "Malloc disk", 00:12:03.779 "block_size": 512, 00:12:03.779 "num_blocks": 65536, 00:12:03.779 "uuid": "b2fdf530-b75c-4c0e-81d6-130fed4a7fb6", 00:12:03.779 "assigned_rate_limits": { 00:12:03.779 "rw_ios_per_sec": 0, 00:12:03.779 "rw_mbytes_per_sec": 0, 00:12:03.779 "r_mbytes_per_sec": 0, 00:12:03.779 "w_mbytes_per_sec": 0 00:12:03.779 }, 00:12:03.779 "claimed": true, 00:12:03.779 "claim_type": "exclusive_write", 00:12:03.779 "zoned": false, 00:12:03.779 "supported_io_types": { 00:12:03.779 "read": true, 00:12:03.779 "write": true, 00:12:03.779 "unmap": true, 00:12:03.779 "flush": true, 00:12:03.779 "reset": true, 00:12:03.779 "nvme_admin": false, 00:12:03.779 "nvme_io": false, 00:12:03.779 "nvme_io_md": false, 00:12:03.779 "write_zeroes": true, 00:12:03.779 "zcopy": true, 00:12:03.779 "get_zone_info": false, 00:12:03.779 "zone_management": false, 00:12:03.779 "zone_append": false, 00:12:03.779 "compare": false, 00:12:03.779 "compare_and_write": false, 00:12:03.779 "abort": true, 00:12:03.779 "seek_hole": false, 00:12:03.779 "seek_data": false, 00:12:03.779 "copy": true, 00:12:03.779 "nvme_iov_md": false 00:12:03.779 }, 00:12:03.779 
"memory_domains": [ 00:12:03.779 { 00:12:03.779 "dma_device_id": "system", 00:12:03.779 "dma_device_type": 1 00:12:03.779 }, 00:12:03.779 { 00:12:03.779 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.779 "dma_device_type": 2 00:12:03.779 } 00:12:03.779 ], 00:12:03.779 "driver_specific": {} 00:12:03.779 } 00:12:03.779 ] 00:12:03.779 22:40:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:03.779 22:40:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:03.779 22:40:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:03.779 22:40:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:12:03.779 22:40:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:03.779 22:40:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:03.779 22:40:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:03.779 22:40:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:03.779 22:40:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:03.779 22:40:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:03.779 22:40:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:03.779 22:40:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:03.779 22:40:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:03.779 22:40:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:12:03.779 22:40:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:04.078 22:40:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:04.078 "name": "Existed_Raid", 00:12:04.078 "uuid": "d7d66390-4dbf-45c9-a1a5-485081cc492b", 00:12:04.078 "strip_size_kb": 64, 00:12:04.078 "state": "online", 00:12:04.078 "raid_level": "concat", 00:12:04.078 "superblock": false, 00:12:04.078 "num_base_bdevs": 2, 00:12:04.078 "num_base_bdevs_discovered": 2, 00:12:04.078 "num_base_bdevs_operational": 2, 00:12:04.078 "base_bdevs_list": [ 00:12:04.078 { 00:12:04.078 "name": "BaseBdev1", 00:12:04.078 "uuid": "f32c5b67-ffa3-4673-88ca-2796ab1f7245", 00:12:04.078 "is_configured": true, 00:12:04.078 "data_offset": 0, 00:12:04.078 "data_size": 65536 00:12:04.078 }, 00:12:04.078 { 00:12:04.078 "name": "BaseBdev2", 00:12:04.078 "uuid": "b2fdf530-b75c-4c0e-81d6-130fed4a7fb6", 00:12:04.078 "is_configured": true, 00:12:04.078 "data_offset": 0, 00:12:04.078 "data_size": 65536 00:12:04.078 } 00:12:04.078 ] 00:12:04.078 }' 00:12:04.078 22:40:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:04.078 22:40:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:04.652 22:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:04.652 22:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:04.652 22:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:04.652 22:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:04.652 22:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:04.652 22:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:04.652 
22:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:04.652 22:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:04.652 [2024-07-15 22:40:49.560697] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:04.911 22:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:04.911 "name": "Existed_Raid", 00:12:04.911 "aliases": [ 00:12:04.911 "d7d66390-4dbf-45c9-a1a5-485081cc492b" 00:12:04.911 ], 00:12:04.911 "product_name": "Raid Volume", 00:12:04.911 "block_size": 512, 00:12:04.911 "num_blocks": 131072, 00:12:04.911 "uuid": "d7d66390-4dbf-45c9-a1a5-485081cc492b", 00:12:04.911 "assigned_rate_limits": { 00:12:04.911 "rw_ios_per_sec": 0, 00:12:04.911 "rw_mbytes_per_sec": 0, 00:12:04.911 "r_mbytes_per_sec": 0, 00:12:04.911 "w_mbytes_per_sec": 0 00:12:04.911 }, 00:12:04.911 "claimed": false, 00:12:04.911 "zoned": false, 00:12:04.911 "supported_io_types": { 00:12:04.911 "read": true, 00:12:04.911 "write": true, 00:12:04.911 "unmap": true, 00:12:04.911 "flush": true, 00:12:04.911 "reset": true, 00:12:04.911 "nvme_admin": false, 00:12:04.911 "nvme_io": false, 00:12:04.911 "nvme_io_md": false, 00:12:04.911 "write_zeroes": true, 00:12:04.911 "zcopy": false, 00:12:04.911 "get_zone_info": false, 00:12:04.911 "zone_management": false, 00:12:04.911 "zone_append": false, 00:12:04.911 "compare": false, 00:12:04.911 "compare_and_write": false, 00:12:04.911 "abort": false, 00:12:04.911 "seek_hole": false, 00:12:04.911 "seek_data": false, 00:12:04.911 "copy": false, 00:12:04.911 "nvme_iov_md": false 00:12:04.911 }, 00:12:04.911 "memory_domains": [ 00:12:04.911 { 00:12:04.911 "dma_device_id": "system", 00:12:04.911 "dma_device_type": 1 00:12:04.911 }, 00:12:04.911 { 00:12:04.911 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:04.911 
"dma_device_type": 2 00:12:04.911 }, 00:12:04.911 { 00:12:04.911 "dma_device_id": "system", 00:12:04.912 "dma_device_type": 1 00:12:04.912 }, 00:12:04.912 { 00:12:04.912 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:04.912 "dma_device_type": 2 00:12:04.912 } 00:12:04.912 ], 00:12:04.912 "driver_specific": { 00:12:04.912 "raid": { 00:12:04.912 "uuid": "d7d66390-4dbf-45c9-a1a5-485081cc492b", 00:12:04.912 "strip_size_kb": 64, 00:12:04.912 "state": "online", 00:12:04.912 "raid_level": "concat", 00:12:04.912 "superblock": false, 00:12:04.912 "num_base_bdevs": 2, 00:12:04.912 "num_base_bdevs_discovered": 2, 00:12:04.912 "num_base_bdevs_operational": 2, 00:12:04.912 "base_bdevs_list": [ 00:12:04.912 { 00:12:04.912 "name": "BaseBdev1", 00:12:04.912 "uuid": "f32c5b67-ffa3-4673-88ca-2796ab1f7245", 00:12:04.912 "is_configured": true, 00:12:04.912 "data_offset": 0, 00:12:04.912 "data_size": 65536 00:12:04.912 }, 00:12:04.912 { 00:12:04.912 "name": "BaseBdev2", 00:12:04.912 "uuid": "b2fdf530-b75c-4c0e-81d6-130fed4a7fb6", 00:12:04.912 "is_configured": true, 00:12:04.912 "data_offset": 0, 00:12:04.912 "data_size": 65536 00:12:04.912 } 00:12:04.912 ] 00:12:04.912 } 00:12:04.912 } 00:12:04.912 }' 00:12:04.912 22:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:04.912 22:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:04.912 BaseBdev2' 00:12:04.912 22:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:04.912 22:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:04.912 22:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:05.171 22:40:49 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:05.171 "name": "BaseBdev1", 00:12:05.171 "aliases": [ 00:12:05.171 "f32c5b67-ffa3-4673-88ca-2796ab1f7245" 00:12:05.171 ], 00:12:05.171 "product_name": "Malloc disk", 00:12:05.171 "block_size": 512, 00:12:05.171 "num_blocks": 65536, 00:12:05.171 "uuid": "f32c5b67-ffa3-4673-88ca-2796ab1f7245", 00:12:05.171 "assigned_rate_limits": { 00:12:05.171 "rw_ios_per_sec": 0, 00:12:05.171 "rw_mbytes_per_sec": 0, 00:12:05.171 "r_mbytes_per_sec": 0, 00:12:05.171 "w_mbytes_per_sec": 0 00:12:05.171 }, 00:12:05.171 "claimed": true, 00:12:05.171 "claim_type": "exclusive_write", 00:12:05.171 "zoned": false, 00:12:05.171 "supported_io_types": { 00:12:05.171 "read": true, 00:12:05.171 "write": true, 00:12:05.171 "unmap": true, 00:12:05.171 "flush": true, 00:12:05.171 "reset": true, 00:12:05.171 "nvme_admin": false, 00:12:05.171 "nvme_io": false, 00:12:05.171 "nvme_io_md": false, 00:12:05.171 "write_zeroes": true, 00:12:05.171 "zcopy": true, 00:12:05.171 "get_zone_info": false, 00:12:05.171 "zone_management": false, 00:12:05.171 "zone_append": false, 00:12:05.171 "compare": false, 00:12:05.171 "compare_and_write": false, 00:12:05.171 "abort": true, 00:12:05.171 "seek_hole": false, 00:12:05.171 "seek_data": false, 00:12:05.171 "copy": true, 00:12:05.171 "nvme_iov_md": false 00:12:05.171 }, 00:12:05.171 "memory_domains": [ 00:12:05.171 { 00:12:05.171 "dma_device_id": "system", 00:12:05.171 "dma_device_type": 1 00:12:05.171 }, 00:12:05.171 { 00:12:05.171 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:05.171 "dma_device_type": 2 00:12:05.171 } 00:12:05.171 ], 00:12:05.171 "driver_specific": {} 00:12:05.171 }' 00:12:05.171 22:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:05.171 22:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:05.171 22:40:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:05.171 22:40:49 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:05.171 22:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:05.171 22:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:05.171 22:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:05.429 22:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:05.429 22:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:05.429 22:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:05.429 22:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:05.429 22:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:05.429 22:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:05.429 22:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:05.429 22:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:05.687 22:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:05.687 "name": "BaseBdev2", 00:12:05.687 "aliases": [ 00:12:05.687 "b2fdf530-b75c-4c0e-81d6-130fed4a7fb6" 00:12:05.687 ], 00:12:05.687 "product_name": "Malloc disk", 00:12:05.687 "block_size": 512, 00:12:05.687 "num_blocks": 65536, 00:12:05.687 "uuid": "b2fdf530-b75c-4c0e-81d6-130fed4a7fb6", 00:12:05.687 "assigned_rate_limits": { 00:12:05.687 "rw_ios_per_sec": 0, 00:12:05.687 "rw_mbytes_per_sec": 0, 00:12:05.687 "r_mbytes_per_sec": 0, 00:12:05.687 "w_mbytes_per_sec": 0 00:12:05.687 }, 00:12:05.687 "claimed": true, 00:12:05.687 "claim_type": "exclusive_write", 
00:12:05.687 "zoned": false, 00:12:05.687 "supported_io_types": { 00:12:05.687 "read": true, 00:12:05.687 "write": true, 00:12:05.687 "unmap": true, 00:12:05.687 "flush": true, 00:12:05.687 "reset": true, 00:12:05.687 "nvme_admin": false, 00:12:05.687 "nvme_io": false, 00:12:05.687 "nvme_io_md": false, 00:12:05.687 "write_zeroes": true, 00:12:05.687 "zcopy": true, 00:12:05.687 "get_zone_info": false, 00:12:05.687 "zone_management": false, 00:12:05.687 "zone_append": false, 00:12:05.687 "compare": false, 00:12:05.687 "compare_and_write": false, 00:12:05.687 "abort": true, 00:12:05.687 "seek_hole": false, 00:12:05.687 "seek_data": false, 00:12:05.687 "copy": true, 00:12:05.687 "nvme_iov_md": false 00:12:05.687 }, 00:12:05.687 "memory_domains": [ 00:12:05.687 { 00:12:05.687 "dma_device_id": "system", 00:12:05.687 "dma_device_type": 1 00:12:05.687 }, 00:12:05.687 { 00:12:05.687 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:05.687 "dma_device_type": 2 00:12:05.687 } 00:12:05.687 ], 00:12:05.687 "driver_specific": {} 00:12:05.687 }' 00:12:05.687 22:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:05.687 22:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:05.946 22:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:05.946 22:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:05.946 22:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:05.946 22:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:05.946 22:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:05.946 22:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:05.946 22:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:05.946 22:40:50 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:06.205 22:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:06.205 22:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:06.205 22:40:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:06.464 [2024-07-15 22:40:51.160721] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:06.464 [2024-07-15 22:40:51.160747] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:06.464 [2024-07-15 22:40:51.160789] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:06.464 22:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:06.464 22:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:12:06.465 22:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:06.465 22:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:06.465 22:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:06.465 22:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:12:06.465 22:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:06.465 22:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:06.465 22:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:06.465 22:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:06.465 22:40:51 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:06.465 22:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:06.465 22:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:06.465 22:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:06.465 22:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:06.465 22:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:06.465 22:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:06.725 22:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:06.725 "name": "Existed_Raid", 00:12:06.725 "uuid": "d7d66390-4dbf-45c9-a1a5-485081cc492b", 00:12:06.725 "strip_size_kb": 64, 00:12:06.725 "state": "offline", 00:12:06.725 "raid_level": "concat", 00:12:06.725 "superblock": false, 00:12:06.725 "num_base_bdevs": 2, 00:12:06.725 "num_base_bdevs_discovered": 1, 00:12:06.725 "num_base_bdevs_operational": 1, 00:12:06.725 "base_bdevs_list": [ 00:12:06.725 { 00:12:06.725 "name": null, 00:12:06.725 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:06.725 "is_configured": false, 00:12:06.725 "data_offset": 0, 00:12:06.725 "data_size": 65536 00:12:06.725 }, 00:12:06.725 { 00:12:06.725 "name": "BaseBdev2", 00:12:06.725 "uuid": "b2fdf530-b75c-4c0e-81d6-130fed4a7fb6", 00:12:06.725 "is_configured": true, 00:12:06.725 "data_offset": 0, 00:12:06.725 "data_size": 65536 00:12:06.725 } 00:12:06.725 ] 00:12:06.725 }' 00:12:06.725 22:40:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:06.725 22:40:51 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@10 -- # set +x 00:12:07.294 22:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:07.294 22:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:07.294 22:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:07.294 22:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:07.552 22:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:07.552 22:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:07.552 22:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:07.811 [2024-07-15 22:40:52.497261] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:07.811 [2024-07-15 22:40:52.497315] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1740000 name Existed_Raid, state offline 00:12:07.811 22:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:07.811 22:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:07.811 22:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:07.811 22:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:08.071 22:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:08.071 22:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 
00:12:08.071 22:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:08.071 22:40:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2703730 00:12:08.071 22:40:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2703730 ']' 00:12:08.071 22:40:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2703730 00:12:08.071 22:40:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:12:08.071 22:40:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:08.071 22:40:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2703730 00:12:08.071 22:40:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:08.071 22:40:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:08.072 22:40:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2703730' 00:12:08.072 killing process with pid 2703730 00:12:08.072 22:40:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2703730 00:12:08.072 [2024-07-15 22:40:52.877455] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:08.072 22:40:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2703730 00:12:08.072 [2024-07-15 22:40:52.878442] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:08.331 22:40:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:08.331 00:12:08.331 real 0m10.277s 00:12:08.331 user 0m18.240s 00:12:08.331 sys 0m1.939s 00:12:08.331 22:40:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:08.331 22:40:53 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:12:08.331 ************************************ 00:12:08.331 END TEST raid_state_function_test 00:12:08.331 ************************************ 00:12:08.331 22:40:53 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:08.331 22:40:53 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:12:08.331 22:40:53 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:08.331 22:40:53 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:08.332 22:40:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:08.332 ************************************ 00:12:08.332 START TEST raid_state_function_test_sb 00:12:08.332 ************************************ 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 true 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2705360 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2705360' 00:12:08.332 Process raid pid: 2705360 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L 
bdev_raid 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2705360 /var/tmp/spdk-raid.sock 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2705360 ']' 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:08.332 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:08.332 22:40:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:08.590 [2024-07-15 22:40:53.255490] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:12:08.590 [2024-07-15 22:40:53.255560] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:08.590 [2024-07-15 22:40:53.395582] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:08.590 [2024-07-15 22:40:53.498226] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:08.847 [2024-07-15 22:40:53.555409] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:08.847 [2024-07-15 22:40:53.555442] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:09.413 22:40:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:09.413 22:40:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:12:09.413 22:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:09.672 [2024-07-15 22:40:54.400194] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:09.672 [2024-07-15 22:40:54.400236] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:09.672 [2024-07-15 22:40:54.400247] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:09.672 [2024-07-15 22:40:54.400259] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:09.672 22:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:09.672 22:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:12:09.672 22:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:09.672 22:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:09.672 22:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:09.672 22:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:09.672 22:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:09.672 22:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:09.672 22:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:09.672 22:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:09.672 22:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:09.672 22:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:09.930 22:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:09.930 "name": "Existed_Raid", 00:12:09.930 "uuid": "3d94d57d-4d84-452c-ae26-d0b01e077423", 00:12:09.930 "strip_size_kb": 64, 00:12:09.930 "state": "configuring", 00:12:09.930 "raid_level": "concat", 00:12:09.930 "superblock": true, 00:12:09.930 "num_base_bdevs": 2, 00:12:09.930 "num_base_bdevs_discovered": 0, 00:12:09.930 "num_base_bdevs_operational": 2, 00:12:09.930 "base_bdevs_list": [ 00:12:09.930 { 00:12:09.930 "name": "BaseBdev1", 00:12:09.930 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:09.930 "is_configured": false, 00:12:09.930 "data_offset": 0, 00:12:09.930 "data_size": 0 00:12:09.930 }, 00:12:09.930 { 
00:12:09.930 "name": "BaseBdev2", 00:12:09.930 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:09.930 "is_configured": false, 00:12:09.930 "data_offset": 0, 00:12:09.930 "data_size": 0 00:12:09.930 } 00:12:09.930 ] 00:12:09.930 }' 00:12:09.930 22:40:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:09.930 22:40:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:10.866 22:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:11.125 [2024-07-15 22:40:55.940078] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:11.125 [2024-07-15 22:40:55.940109] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1eb3a80 name Existed_Raid, state configuring 00:12:11.125 22:40:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:11.383 [2024-07-15 22:40:56.184749] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:11.383 [2024-07-15 22:40:56.184779] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:11.383 [2024-07-15 22:40:56.184789] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:11.383 [2024-07-15 22:40:56.184800] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:11.383 22:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:11.642 [2024-07-15 22:40:56.439230] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:11.642 BaseBdev1 00:12:11.642 22:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:11.642 22:40:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:11.642 22:40:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:11.642 22:40:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:11.642 22:40:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:11.642 22:40:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:11.642 22:40:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:11.901 22:40:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:12.159 [ 00:12:12.159 { 00:12:12.159 "name": "BaseBdev1", 00:12:12.159 "aliases": [ 00:12:12.159 "dd149a87-7b3f-4abd-b1ac-8a5b1e07935d" 00:12:12.159 ], 00:12:12.159 "product_name": "Malloc disk", 00:12:12.159 "block_size": 512, 00:12:12.159 "num_blocks": 65536, 00:12:12.159 "uuid": "dd149a87-7b3f-4abd-b1ac-8a5b1e07935d", 00:12:12.159 "assigned_rate_limits": { 00:12:12.159 "rw_ios_per_sec": 0, 00:12:12.159 "rw_mbytes_per_sec": 0, 00:12:12.159 "r_mbytes_per_sec": 0, 00:12:12.159 "w_mbytes_per_sec": 0 00:12:12.159 }, 00:12:12.159 "claimed": true, 00:12:12.159 "claim_type": "exclusive_write", 00:12:12.159 "zoned": false, 00:12:12.159 "supported_io_types": { 00:12:12.159 "read": true, 00:12:12.159 "write": true, 00:12:12.159 "unmap": true, 00:12:12.159 "flush": 
true, 00:12:12.159 "reset": true, 00:12:12.159 "nvme_admin": false, 00:12:12.159 "nvme_io": false, 00:12:12.159 "nvme_io_md": false, 00:12:12.159 "write_zeroes": true, 00:12:12.159 "zcopy": true, 00:12:12.159 "get_zone_info": false, 00:12:12.159 "zone_management": false, 00:12:12.159 "zone_append": false, 00:12:12.159 "compare": false, 00:12:12.159 "compare_and_write": false, 00:12:12.159 "abort": true, 00:12:12.159 "seek_hole": false, 00:12:12.159 "seek_data": false, 00:12:12.159 "copy": true, 00:12:12.159 "nvme_iov_md": false 00:12:12.159 }, 00:12:12.159 "memory_domains": [ 00:12:12.159 { 00:12:12.159 "dma_device_id": "system", 00:12:12.159 "dma_device_type": 1 00:12:12.159 }, 00:12:12.159 { 00:12:12.159 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:12.159 "dma_device_type": 2 00:12:12.159 } 00:12:12.159 ], 00:12:12.159 "driver_specific": {} 00:12:12.159 } 00:12:12.159 ] 00:12:12.159 22:40:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:12.159 22:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:12.159 22:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:12.159 22:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:12.159 22:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:12.159 22:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:12.159 22:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:12.159 22:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:12.159 22:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:12.160 22:40:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:12.160 22:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:12.160 22:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:12.160 22:40:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:12.417 22:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:12.418 "name": "Existed_Raid", 00:12:12.418 "uuid": "2286936b-31e1-437d-b53d-b18cd4c73e0d", 00:12:12.418 "strip_size_kb": 64, 00:12:12.418 "state": "configuring", 00:12:12.418 "raid_level": "concat", 00:12:12.418 "superblock": true, 00:12:12.418 "num_base_bdevs": 2, 00:12:12.418 "num_base_bdevs_discovered": 1, 00:12:12.418 "num_base_bdevs_operational": 2, 00:12:12.418 "base_bdevs_list": [ 00:12:12.418 { 00:12:12.418 "name": "BaseBdev1", 00:12:12.418 "uuid": "dd149a87-7b3f-4abd-b1ac-8a5b1e07935d", 00:12:12.418 "is_configured": true, 00:12:12.418 "data_offset": 2048, 00:12:12.418 "data_size": 63488 00:12:12.418 }, 00:12:12.418 { 00:12:12.418 "name": "BaseBdev2", 00:12:12.418 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:12.418 "is_configured": false, 00:12:12.418 "data_offset": 0, 00:12:12.418 "data_size": 0 00:12:12.418 } 00:12:12.418 ] 00:12:12.418 }' 00:12:12.418 22:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:12.418 22:40:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:12.983 22:40:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:13.240 [2024-07-15 22:40:58.067524] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:13.240 [2024-07-15 22:40:58.067561] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1eb3350 name Existed_Raid, state configuring 00:12:13.240 22:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:13.496 [2024-07-15 22:40:58.312229] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:13.496 [2024-07-15 22:40:58.313717] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:13.496 [2024-07-15 22:40:58.313749] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:13.497 22:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:13.497 22:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:13.497 22:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:13.497 22:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:13.497 22:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:13.497 22:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:13.497 22:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:13.497 22:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:13.497 22:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:13.497 22:40:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:13.497 22:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:13.497 22:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:13.497 22:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:13.497 22:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:13.754 22:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:13.754 "name": "Existed_Raid", 00:12:13.754 "uuid": "7a228447-6eeb-40a9-a218-5ca2ef692437", 00:12:13.754 "strip_size_kb": 64, 00:12:13.754 "state": "configuring", 00:12:13.754 "raid_level": "concat", 00:12:13.754 "superblock": true, 00:12:13.754 "num_base_bdevs": 2, 00:12:13.754 "num_base_bdevs_discovered": 1, 00:12:13.754 "num_base_bdevs_operational": 2, 00:12:13.754 "base_bdevs_list": [ 00:12:13.754 { 00:12:13.754 "name": "BaseBdev1", 00:12:13.754 "uuid": "dd149a87-7b3f-4abd-b1ac-8a5b1e07935d", 00:12:13.754 "is_configured": true, 00:12:13.754 "data_offset": 2048, 00:12:13.754 "data_size": 63488 00:12:13.754 }, 00:12:13.754 { 00:12:13.754 "name": "BaseBdev2", 00:12:13.754 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:13.754 "is_configured": false, 00:12:13.754 "data_offset": 0, 00:12:13.754 "data_size": 0 00:12:13.754 } 00:12:13.754 ] 00:12:13.754 }' 00:12:13.754 22:40:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:13.754 22:40:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:14.317 22:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:14.575 [2024-07-15 22:40:59.394412] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:14.575 [2024-07-15 22:40:59.394558] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1eb4000 00:12:14.575 [2024-07-15 22:40:59.394572] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:14.575 [2024-07-15 22:40:59.394744] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1dce0c0 00:12:14.575 [2024-07-15 22:40:59.394857] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1eb4000 00:12:14.575 [2024-07-15 22:40:59.394867] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1eb4000 00:12:14.575 [2024-07-15 22:40:59.394965] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:14.575 BaseBdev2 00:12:14.575 22:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:14.575 22:40:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:14.575 22:40:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:14.575 22:40:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:14.575 22:40:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:14.575 22:40:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:14.575 22:40:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:14.832 22:40:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:15.090 [ 00:12:15.090 { 00:12:15.090 "name": "BaseBdev2", 00:12:15.090 "aliases": [ 00:12:15.090 "064a62d5-dd6b-493d-aa78-ec1528155828" 00:12:15.090 ], 00:12:15.090 "product_name": "Malloc disk", 00:12:15.090 "block_size": 512, 00:12:15.090 "num_blocks": 65536, 00:12:15.090 "uuid": "064a62d5-dd6b-493d-aa78-ec1528155828", 00:12:15.090 "assigned_rate_limits": { 00:12:15.090 "rw_ios_per_sec": 0, 00:12:15.090 "rw_mbytes_per_sec": 0, 00:12:15.090 "r_mbytes_per_sec": 0, 00:12:15.090 "w_mbytes_per_sec": 0 00:12:15.090 }, 00:12:15.090 "claimed": true, 00:12:15.090 "claim_type": "exclusive_write", 00:12:15.090 "zoned": false, 00:12:15.090 "supported_io_types": { 00:12:15.090 "read": true, 00:12:15.090 "write": true, 00:12:15.090 "unmap": true, 00:12:15.090 "flush": true, 00:12:15.090 "reset": true, 00:12:15.090 "nvme_admin": false, 00:12:15.090 "nvme_io": false, 00:12:15.090 "nvme_io_md": false, 00:12:15.090 "write_zeroes": true, 00:12:15.090 "zcopy": true, 00:12:15.090 "get_zone_info": false, 00:12:15.090 "zone_management": false, 00:12:15.090 "zone_append": false, 00:12:15.090 "compare": false, 00:12:15.090 "compare_and_write": false, 00:12:15.090 "abort": true, 00:12:15.091 "seek_hole": false, 00:12:15.091 "seek_data": false, 00:12:15.091 "copy": true, 00:12:15.091 "nvme_iov_md": false 00:12:15.091 }, 00:12:15.091 "memory_domains": [ 00:12:15.091 { 00:12:15.091 "dma_device_id": "system", 00:12:15.091 "dma_device_type": 1 00:12:15.091 }, 00:12:15.091 { 00:12:15.091 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:15.091 "dma_device_type": 2 00:12:15.091 } 00:12:15.091 ], 00:12:15.091 "driver_specific": {} 00:12:15.091 } 00:12:15.091 ] 00:12:15.091 22:40:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:15.091 22:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:12:15.091 22:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:15.091 22:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:12:15.091 22:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:15.091 22:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:15.091 22:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:15.091 22:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:15.091 22:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:15.091 22:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:15.091 22:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:15.091 22:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:15.091 22:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:15.091 22:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:15.091 22:40:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:15.349 22:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:15.349 "name": "Existed_Raid", 00:12:15.349 "uuid": "7a228447-6eeb-40a9-a218-5ca2ef692437", 00:12:15.349 "strip_size_kb": 64, 00:12:15.349 "state": "online", 00:12:15.349 "raid_level": "concat", 00:12:15.349 "superblock": true, 00:12:15.349 
"num_base_bdevs": 2, 00:12:15.349 "num_base_bdevs_discovered": 2, 00:12:15.349 "num_base_bdevs_operational": 2, 00:12:15.349 "base_bdevs_list": [ 00:12:15.349 { 00:12:15.349 "name": "BaseBdev1", 00:12:15.349 "uuid": "dd149a87-7b3f-4abd-b1ac-8a5b1e07935d", 00:12:15.349 "is_configured": true, 00:12:15.349 "data_offset": 2048, 00:12:15.349 "data_size": 63488 00:12:15.349 }, 00:12:15.349 { 00:12:15.349 "name": "BaseBdev2", 00:12:15.349 "uuid": "064a62d5-dd6b-493d-aa78-ec1528155828", 00:12:15.349 "is_configured": true, 00:12:15.349 "data_offset": 2048, 00:12:15.349 "data_size": 63488 00:12:15.349 } 00:12:15.349 ] 00:12:15.349 }' 00:12:15.349 22:41:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:15.349 22:41:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:16.281 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:16.281 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:16.281 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:16.281 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:16.281 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:16.281 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:16.281 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:16.281 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:16.540 [2024-07-15 22:41:01.259641] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:16.540 22:41:01 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:16.540 "name": "Existed_Raid", 00:12:16.540 "aliases": [ 00:12:16.540 "7a228447-6eeb-40a9-a218-5ca2ef692437" 00:12:16.540 ], 00:12:16.540 "product_name": "Raid Volume", 00:12:16.540 "block_size": 512, 00:12:16.540 "num_blocks": 126976, 00:12:16.540 "uuid": "7a228447-6eeb-40a9-a218-5ca2ef692437", 00:12:16.540 "assigned_rate_limits": { 00:12:16.540 "rw_ios_per_sec": 0, 00:12:16.540 "rw_mbytes_per_sec": 0, 00:12:16.540 "r_mbytes_per_sec": 0, 00:12:16.540 "w_mbytes_per_sec": 0 00:12:16.540 }, 00:12:16.540 "claimed": false, 00:12:16.540 "zoned": false, 00:12:16.540 "supported_io_types": { 00:12:16.540 "read": true, 00:12:16.540 "write": true, 00:12:16.540 "unmap": true, 00:12:16.540 "flush": true, 00:12:16.540 "reset": true, 00:12:16.540 "nvme_admin": false, 00:12:16.540 "nvme_io": false, 00:12:16.540 "nvme_io_md": false, 00:12:16.540 "write_zeroes": true, 00:12:16.540 "zcopy": false, 00:12:16.540 "get_zone_info": false, 00:12:16.540 "zone_management": false, 00:12:16.540 "zone_append": false, 00:12:16.540 "compare": false, 00:12:16.540 "compare_and_write": false, 00:12:16.540 "abort": false, 00:12:16.540 "seek_hole": false, 00:12:16.540 "seek_data": false, 00:12:16.540 "copy": false, 00:12:16.540 "nvme_iov_md": false 00:12:16.540 }, 00:12:16.540 "memory_domains": [ 00:12:16.540 { 00:12:16.540 "dma_device_id": "system", 00:12:16.540 "dma_device_type": 1 00:12:16.540 }, 00:12:16.540 { 00:12:16.540 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:16.540 "dma_device_type": 2 00:12:16.540 }, 00:12:16.540 { 00:12:16.540 "dma_device_id": "system", 00:12:16.540 "dma_device_type": 1 00:12:16.540 }, 00:12:16.540 { 00:12:16.540 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:16.540 "dma_device_type": 2 00:12:16.540 } 00:12:16.540 ], 00:12:16.540 "driver_specific": { 00:12:16.540 "raid": { 00:12:16.540 "uuid": "7a228447-6eeb-40a9-a218-5ca2ef692437", 00:12:16.540 "strip_size_kb": 64, 
00:12:16.540 "state": "online", 00:12:16.540 "raid_level": "concat", 00:12:16.540 "superblock": true, 00:12:16.540 "num_base_bdevs": 2, 00:12:16.540 "num_base_bdevs_discovered": 2, 00:12:16.540 "num_base_bdevs_operational": 2, 00:12:16.540 "base_bdevs_list": [ 00:12:16.540 { 00:12:16.540 "name": "BaseBdev1", 00:12:16.540 "uuid": "dd149a87-7b3f-4abd-b1ac-8a5b1e07935d", 00:12:16.540 "is_configured": true, 00:12:16.540 "data_offset": 2048, 00:12:16.540 "data_size": 63488 00:12:16.540 }, 00:12:16.540 { 00:12:16.540 "name": "BaseBdev2", 00:12:16.540 "uuid": "064a62d5-dd6b-493d-aa78-ec1528155828", 00:12:16.540 "is_configured": true, 00:12:16.540 "data_offset": 2048, 00:12:16.540 "data_size": 63488 00:12:16.540 } 00:12:16.540 ] 00:12:16.540 } 00:12:16.540 } 00:12:16.540 }' 00:12:16.540 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:16.540 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:16.540 BaseBdev2' 00:12:16.540 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:16.540 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:16.540 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:16.798 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:16.798 "name": "BaseBdev1", 00:12:16.798 "aliases": [ 00:12:16.798 "dd149a87-7b3f-4abd-b1ac-8a5b1e07935d" 00:12:16.798 ], 00:12:16.798 "product_name": "Malloc disk", 00:12:16.798 "block_size": 512, 00:12:16.798 "num_blocks": 65536, 00:12:16.798 "uuid": "dd149a87-7b3f-4abd-b1ac-8a5b1e07935d", 00:12:16.798 "assigned_rate_limits": { 00:12:16.798 "rw_ios_per_sec": 0, 
00:12:16.798 "rw_mbytes_per_sec": 0, 00:12:16.798 "r_mbytes_per_sec": 0, 00:12:16.798 "w_mbytes_per_sec": 0 00:12:16.798 }, 00:12:16.798 "claimed": true, 00:12:16.798 "claim_type": "exclusive_write", 00:12:16.798 "zoned": false, 00:12:16.798 "supported_io_types": { 00:12:16.798 "read": true, 00:12:16.798 "write": true, 00:12:16.798 "unmap": true, 00:12:16.798 "flush": true, 00:12:16.798 "reset": true, 00:12:16.798 "nvme_admin": false, 00:12:16.798 "nvme_io": false, 00:12:16.798 "nvme_io_md": false, 00:12:16.798 "write_zeroes": true, 00:12:16.798 "zcopy": true, 00:12:16.798 "get_zone_info": false, 00:12:16.798 "zone_management": false, 00:12:16.798 "zone_append": false, 00:12:16.798 "compare": false, 00:12:16.798 "compare_and_write": false, 00:12:16.798 "abort": true, 00:12:16.798 "seek_hole": false, 00:12:16.798 "seek_data": false, 00:12:16.798 "copy": true, 00:12:16.799 "nvme_iov_md": false 00:12:16.799 }, 00:12:16.799 "memory_domains": [ 00:12:16.799 { 00:12:16.799 "dma_device_id": "system", 00:12:16.799 "dma_device_type": 1 00:12:16.799 }, 00:12:16.799 { 00:12:16.799 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:16.799 "dma_device_type": 2 00:12:16.799 } 00:12:16.799 ], 00:12:16.799 "driver_specific": {} 00:12:16.799 }' 00:12:16.799 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:16.799 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:16.799 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:16.799 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:16.799 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:17.057 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:17.057 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:17.057 
22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:17.057 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:17.057 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:17.057 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:17.057 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:17.057 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:17.057 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:17.057 22:41:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:17.317 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:17.317 "name": "BaseBdev2", 00:12:17.317 "aliases": [ 00:12:17.317 "064a62d5-dd6b-493d-aa78-ec1528155828" 00:12:17.317 ], 00:12:17.317 "product_name": "Malloc disk", 00:12:17.317 "block_size": 512, 00:12:17.317 "num_blocks": 65536, 00:12:17.317 "uuid": "064a62d5-dd6b-493d-aa78-ec1528155828", 00:12:17.317 "assigned_rate_limits": { 00:12:17.317 "rw_ios_per_sec": 0, 00:12:17.317 "rw_mbytes_per_sec": 0, 00:12:17.317 "r_mbytes_per_sec": 0, 00:12:17.317 "w_mbytes_per_sec": 0 00:12:17.317 }, 00:12:17.317 "claimed": true, 00:12:17.317 "claim_type": "exclusive_write", 00:12:17.317 "zoned": false, 00:12:17.317 "supported_io_types": { 00:12:17.317 "read": true, 00:12:17.317 "write": true, 00:12:17.317 "unmap": true, 00:12:17.317 "flush": true, 00:12:17.317 "reset": true, 00:12:17.317 "nvme_admin": false, 00:12:17.317 "nvme_io": false, 00:12:17.317 "nvme_io_md": false, 00:12:17.317 "write_zeroes": true, 00:12:17.317 "zcopy": true, 
00:12:17.317 "get_zone_info": false, 00:12:17.317 "zone_management": false, 00:12:17.317 "zone_append": false, 00:12:17.317 "compare": false, 00:12:17.317 "compare_and_write": false, 00:12:17.317 "abort": true, 00:12:17.317 "seek_hole": false, 00:12:17.317 "seek_data": false, 00:12:17.317 "copy": true, 00:12:17.317 "nvme_iov_md": false 00:12:17.317 }, 00:12:17.317 "memory_domains": [ 00:12:17.317 { 00:12:17.317 "dma_device_id": "system", 00:12:17.317 "dma_device_type": 1 00:12:17.317 }, 00:12:17.317 { 00:12:17.317 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:17.317 "dma_device_type": 2 00:12:17.317 } 00:12:17.317 ], 00:12:17.317 "driver_specific": {} 00:12:17.317 }' 00:12:17.317 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:17.317 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:17.317 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:17.317 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:17.576 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:17.576 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:17.576 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:17.576 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:17.576 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:17.576 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:17.576 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:17.576 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:17.576 22:41:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:17.835 [2024-07-15 22:41:02.703428] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:17.835 [2024-07-15 22:41:02.703456] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:17.835 [2024-07-15 22:41:02.703498] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:17.835 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:17.835 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:12:17.835 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:17.835 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:12:17.835 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:17.835 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:12:17.835 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:17.835 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:17.835 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:17.835 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:17.835 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:17.835 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:17.835 22:41:02 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:17.835 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:17.835 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:17.835 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:17.835 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:18.094 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:18.094 "name": "Existed_Raid", 00:12:18.094 "uuid": "7a228447-6eeb-40a9-a218-5ca2ef692437", 00:12:18.094 "strip_size_kb": 64, 00:12:18.094 "state": "offline", 00:12:18.094 "raid_level": "concat", 00:12:18.094 "superblock": true, 00:12:18.094 "num_base_bdevs": 2, 00:12:18.094 "num_base_bdevs_discovered": 1, 00:12:18.094 "num_base_bdevs_operational": 1, 00:12:18.094 "base_bdevs_list": [ 00:12:18.094 { 00:12:18.094 "name": null, 00:12:18.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:18.094 "is_configured": false, 00:12:18.094 "data_offset": 2048, 00:12:18.094 "data_size": 63488 00:12:18.094 }, 00:12:18.094 { 00:12:18.094 "name": "BaseBdev2", 00:12:18.094 "uuid": "064a62d5-dd6b-493d-aa78-ec1528155828", 00:12:18.094 "is_configured": true, 00:12:18.094 "data_offset": 2048, 00:12:18.094 "data_size": 63488 00:12:18.094 } 00:12:18.094 ] 00:12:18.094 }' 00:12:18.094 22:41:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:18.094 22:41:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:18.695 22:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:18.695 22:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < 
num_base_bdevs )) 00:12:18.695 22:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:18.695 22:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:18.953 22:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:18.953 22:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:18.953 22:41:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:19.211 [2024-07-15 22:41:04.048021] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:19.211 [2024-07-15 22:41:04.048070] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1eb4000 name Existed_Raid, state offline 00:12:19.211 22:41:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:19.211 22:41:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:19.211 22:41:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:19.211 22:41:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:19.470 22:41:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:19.470 22:41:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:19.470 22:41:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:19.470 22:41:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 
2705360 00:12:19.470 22:41:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2705360 ']' 00:12:19.470 22:41:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2705360 00:12:19.470 22:41:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:12:19.470 22:41:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:19.470 22:41:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2705360 00:12:19.470 22:41:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:19.470 22:41:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:19.470 22:41:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2705360' 00:12:19.470 killing process with pid 2705360 00:12:19.470 22:41:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2705360 00:12:19.470 [2024-07-15 22:41:04.376073] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:19.470 22:41:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2705360 00:12:19.470 [2024-07-15 22:41:04.376936] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:19.729 22:41:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:19.729 00:12:19.729 real 0m11.398s 00:12:19.729 user 0m20.446s 00:12:19.729 sys 0m2.024s 00:12:19.729 22:41:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:19.729 22:41:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:19.729 ************************************ 00:12:19.729 END TEST raid_state_function_test_sb 00:12:19.729 
************************************ 00:12:19.729 22:41:04 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:19.729 22:41:04 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:12:19.729 22:41:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:12:19.729 22:41:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:19.729 22:41:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:19.989 ************************************ 00:12:19.989 START TEST raid_superblock_test 00:12:19.989 ************************************ 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 2 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 
-- # local raid_bdev_uuid 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2707005 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2707005 /var/tmp/spdk-raid.sock 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2707005 ']' 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:19.989 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:19.989 22:41:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:19.989 [2024-07-15 22:41:04.729724] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:12:19.989 [2024-07-15 22:41:04.729789] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2707005 ] 00:12:19.989 [2024-07-15 22:41:04.848367] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:20.248 [2024-07-15 22:41:04.952856] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:20.248 [2024-07-15 22:41:05.025004] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:20.248 [2024-07-15 22:41:05.025049] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:20.816 22:41:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:20.816 22:41:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:12:20.816 22:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:12:20.816 22:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:20.816 22:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:12:20.816 22:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:12:20.816 22:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:20.816 22:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:20.816 22:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:20.816 22:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:20.816 22:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:21.075 malloc1 00:12:21.075 22:41:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:21.334 [2024-07-15 22:41:06.172480] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:21.334 [2024-07-15 22:41:06.172529] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:21.334 [2024-07-15 22:41:06.172550] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x135d570 00:12:21.334 [2024-07-15 22:41:06.172563] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:21.334 [2024-07-15 22:41:06.174264] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:21.334 [2024-07-15 22:41:06.174293] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:21.334 pt1 00:12:21.334 22:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:21.334 22:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:21.334 22:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:12:21.334 22:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:12:21.334 22:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:21.334 22:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:21.334 22:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:21.334 22:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:21.334 22:41:06 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:21.593 malloc2 00:12:21.593 22:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:21.852 [2024-07-15 22:41:06.667894] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:21.852 [2024-07-15 22:41:06.667946] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:21.852 [2024-07-15 22:41:06.667964] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x135e970 00:12:21.852 [2024-07-15 22:41:06.667976] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:21.852 [2024-07-15 22:41:06.669624] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:21.852 [2024-07-15 22:41:06.669653] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:21.852 pt2 00:12:21.852 22:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:21.852 22:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:21.852 22:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:12:22.112 [2024-07-15 22:41:06.912584] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:22.112 [2024-07-15 22:41:06.913957] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:22.112 [2024-07-15 22:41:06.914105] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1501270 
00:12:22.112 [2024-07-15 22:41:06.914118] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:22.112 [2024-07-15 22:41:06.914321] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14f6c10 00:12:22.112 [2024-07-15 22:41:06.914467] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1501270 00:12:22.112 [2024-07-15 22:41:06.914478] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1501270 00:12:22.112 [2024-07-15 22:41:06.914582] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:22.112 22:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:22.112 22:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:22.112 22:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:22.112 22:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:22.112 22:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:22.112 22:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:22.112 22:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:22.112 22:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:22.112 22:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:22.112 22:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:22.112 22:41:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:22.112 22:41:06 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:22.371 22:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:22.371 "name": "raid_bdev1", 00:12:22.371 "uuid": "2c4d4ccb-a1ed-4c42-91cf-632d98473003", 00:12:22.371 "strip_size_kb": 64, 00:12:22.371 "state": "online", 00:12:22.371 "raid_level": "concat", 00:12:22.371 "superblock": true, 00:12:22.371 "num_base_bdevs": 2, 00:12:22.371 "num_base_bdevs_discovered": 2, 00:12:22.371 "num_base_bdevs_operational": 2, 00:12:22.371 "base_bdevs_list": [ 00:12:22.371 { 00:12:22.371 "name": "pt1", 00:12:22.371 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:22.371 "is_configured": true, 00:12:22.371 "data_offset": 2048, 00:12:22.371 "data_size": 63488 00:12:22.371 }, 00:12:22.371 { 00:12:22.371 "name": "pt2", 00:12:22.371 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:22.371 "is_configured": true, 00:12:22.371 "data_offset": 2048, 00:12:22.371 "data_size": 63488 00:12:22.371 } 00:12:22.371 ] 00:12:22.371 }' 00:12:22.371 22:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:22.371 22:41:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:22.938 22:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:12:22.938 22:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:22.938 22:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:22.938 22:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:22.938 22:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:22.938 22:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:22.938 22:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:22.938 22:41:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:23.197 [2024-07-15 22:41:07.883354] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:23.197 22:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:23.197 "name": "raid_bdev1", 00:12:23.197 "aliases": [ 00:12:23.197 "2c4d4ccb-a1ed-4c42-91cf-632d98473003" 00:12:23.197 ], 00:12:23.197 "product_name": "Raid Volume", 00:12:23.197 "block_size": 512, 00:12:23.197 "num_blocks": 126976, 00:12:23.197 "uuid": "2c4d4ccb-a1ed-4c42-91cf-632d98473003", 00:12:23.197 "assigned_rate_limits": { 00:12:23.197 "rw_ios_per_sec": 0, 00:12:23.197 "rw_mbytes_per_sec": 0, 00:12:23.197 "r_mbytes_per_sec": 0, 00:12:23.197 "w_mbytes_per_sec": 0 00:12:23.197 }, 00:12:23.197 "claimed": false, 00:12:23.197 "zoned": false, 00:12:23.197 "supported_io_types": { 00:12:23.197 "read": true, 00:12:23.197 "write": true, 00:12:23.197 "unmap": true, 00:12:23.197 "flush": true, 00:12:23.197 "reset": true, 00:12:23.198 "nvme_admin": false, 00:12:23.198 "nvme_io": false, 00:12:23.198 "nvme_io_md": false, 00:12:23.198 "write_zeroes": true, 00:12:23.198 "zcopy": false, 00:12:23.198 "get_zone_info": false, 00:12:23.198 "zone_management": false, 00:12:23.198 "zone_append": false, 00:12:23.198 "compare": false, 00:12:23.198 "compare_and_write": false, 00:12:23.198 "abort": false, 00:12:23.198 "seek_hole": false, 00:12:23.198 "seek_data": false, 00:12:23.198 "copy": false, 00:12:23.198 "nvme_iov_md": false 00:12:23.198 }, 00:12:23.198 "memory_domains": [ 00:12:23.198 { 00:12:23.198 "dma_device_id": "system", 00:12:23.198 "dma_device_type": 1 00:12:23.198 }, 00:12:23.198 { 00:12:23.198 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:23.198 "dma_device_type": 2 00:12:23.198 }, 00:12:23.198 { 00:12:23.198 "dma_device_id": "system", 00:12:23.198 "dma_device_type": 
1 00:12:23.198 }, 00:12:23.198 { 00:12:23.198 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:23.198 "dma_device_type": 2 00:12:23.198 } 00:12:23.198 ], 00:12:23.198 "driver_specific": { 00:12:23.198 "raid": { 00:12:23.198 "uuid": "2c4d4ccb-a1ed-4c42-91cf-632d98473003", 00:12:23.198 "strip_size_kb": 64, 00:12:23.198 "state": "online", 00:12:23.198 "raid_level": "concat", 00:12:23.198 "superblock": true, 00:12:23.198 "num_base_bdevs": 2, 00:12:23.198 "num_base_bdevs_discovered": 2, 00:12:23.198 "num_base_bdevs_operational": 2, 00:12:23.198 "base_bdevs_list": [ 00:12:23.198 { 00:12:23.198 "name": "pt1", 00:12:23.198 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:23.198 "is_configured": true, 00:12:23.198 "data_offset": 2048, 00:12:23.198 "data_size": 63488 00:12:23.198 }, 00:12:23.198 { 00:12:23.198 "name": "pt2", 00:12:23.198 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:23.198 "is_configured": true, 00:12:23.198 "data_offset": 2048, 00:12:23.198 "data_size": 63488 00:12:23.198 } 00:12:23.198 ] 00:12:23.198 } 00:12:23.198 } 00:12:23.198 }' 00:12:23.198 22:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:23.198 22:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:23.198 pt2' 00:12:23.198 22:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:23.198 22:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:23.198 22:41:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:23.766 22:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:23.766 "name": "pt1", 00:12:23.766 "aliases": [ 00:12:23.766 "00000000-0000-0000-0000-000000000001" 00:12:23.766 ], 00:12:23.766 
"product_name": "passthru", 00:12:23.766 "block_size": 512, 00:12:23.766 "num_blocks": 65536, 00:12:23.766 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:23.766 "assigned_rate_limits": { 00:12:23.766 "rw_ios_per_sec": 0, 00:12:23.766 "rw_mbytes_per_sec": 0, 00:12:23.766 "r_mbytes_per_sec": 0, 00:12:23.766 "w_mbytes_per_sec": 0 00:12:23.766 }, 00:12:23.766 "claimed": true, 00:12:23.766 "claim_type": "exclusive_write", 00:12:23.766 "zoned": false, 00:12:23.766 "supported_io_types": { 00:12:23.766 "read": true, 00:12:23.766 "write": true, 00:12:23.766 "unmap": true, 00:12:23.766 "flush": true, 00:12:23.766 "reset": true, 00:12:23.766 "nvme_admin": false, 00:12:23.766 "nvme_io": false, 00:12:23.766 "nvme_io_md": false, 00:12:23.766 "write_zeroes": true, 00:12:23.766 "zcopy": true, 00:12:23.766 "get_zone_info": false, 00:12:23.766 "zone_management": false, 00:12:23.766 "zone_append": false, 00:12:23.766 "compare": false, 00:12:23.766 "compare_and_write": false, 00:12:23.766 "abort": true, 00:12:23.766 "seek_hole": false, 00:12:23.766 "seek_data": false, 00:12:23.766 "copy": true, 00:12:23.766 "nvme_iov_md": false 00:12:23.766 }, 00:12:23.766 "memory_domains": [ 00:12:23.766 { 00:12:23.766 "dma_device_id": "system", 00:12:23.766 "dma_device_type": 1 00:12:23.766 }, 00:12:23.766 { 00:12:23.766 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:23.766 "dma_device_type": 2 00:12:23.766 } 00:12:23.766 ], 00:12:23.766 "driver_specific": { 00:12:23.766 "passthru": { 00:12:23.766 "name": "pt1", 00:12:23.766 "base_bdev_name": "malloc1" 00:12:23.766 } 00:12:23.766 } 00:12:23.766 }' 00:12:23.766 22:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:23.766 22:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:23.766 22:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:23.766 22:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:12:23.766 22:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:23.766 22:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:23.766 22:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:24.025 22:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:24.025 22:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:24.025 22:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:24.025 22:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:24.025 22:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:24.025 22:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:24.025 22:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:24.025 22:41:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:24.592 22:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:24.592 "name": "pt2", 00:12:24.592 "aliases": [ 00:12:24.592 "00000000-0000-0000-0000-000000000002" 00:12:24.592 ], 00:12:24.592 "product_name": "passthru", 00:12:24.592 "block_size": 512, 00:12:24.592 "num_blocks": 65536, 00:12:24.592 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:24.592 "assigned_rate_limits": { 00:12:24.592 "rw_ios_per_sec": 0, 00:12:24.592 "rw_mbytes_per_sec": 0, 00:12:24.592 "r_mbytes_per_sec": 0, 00:12:24.592 "w_mbytes_per_sec": 0 00:12:24.592 }, 00:12:24.592 "claimed": true, 00:12:24.592 "claim_type": "exclusive_write", 00:12:24.592 "zoned": false, 00:12:24.592 "supported_io_types": { 00:12:24.592 "read": true, 00:12:24.592 "write": true, 00:12:24.592 
"unmap": true, 00:12:24.592 "flush": true, 00:12:24.593 "reset": true, 00:12:24.593 "nvme_admin": false, 00:12:24.593 "nvme_io": false, 00:12:24.593 "nvme_io_md": false, 00:12:24.593 "write_zeroes": true, 00:12:24.593 "zcopy": true, 00:12:24.593 "get_zone_info": false, 00:12:24.593 "zone_management": false, 00:12:24.593 "zone_append": false, 00:12:24.593 "compare": false, 00:12:24.593 "compare_and_write": false, 00:12:24.593 "abort": true, 00:12:24.593 "seek_hole": false, 00:12:24.593 "seek_data": false, 00:12:24.593 "copy": true, 00:12:24.593 "nvme_iov_md": false 00:12:24.593 }, 00:12:24.593 "memory_domains": [ 00:12:24.593 { 00:12:24.593 "dma_device_id": "system", 00:12:24.593 "dma_device_type": 1 00:12:24.593 }, 00:12:24.593 { 00:12:24.593 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:24.593 "dma_device_type": 2 00:12:24.593 } 00:12:24.593 ], 00:12:24.593 "driver_specific": { 00:12:24.593 "passthru": { 00:12:24.593 "name": "pt2", 00:12:24.593 "base_bdev_name": "malloc2" 00:12:24.593 } 00:12:24.593 } 00:12:24.593 }' 00:12:24.593 22:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:24.593 22:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:24.593 22:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:24.593 22:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:24.593 22:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:24.593 22:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:24.593 22:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:24.851 22:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:24.851 22:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:24.851 22:41:09 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:24.851 22:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:24.851 22:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:24.851 22:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:24.851 22:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:25.109 [2024-07-15 22:41:09.892712] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:25.109 22:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=2c4d4ccb-a1ed-4c42-91cf-632d98473003 00:12:25.109 22:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 2c4d4ccb-a1ed-4c42-91cf-632d98473003 ']' 00:12:25.109 22:41:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:25.367 [2024-07-15 22:41:10.189240] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:25.367 [2024-07-15 22:41:10.189262] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:25.367 [2024-07-15 22:41:10.189317] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:25.367 [2024-07-15 22:41:10.189361] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:25.367 [2024-07-15 22:41:10.189373] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1501270 name raid_bdev1, state offline 00:12:25.367 22:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:12:25.367 22:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:25.935 22:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:25.935 22:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:25.935 22:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:25.935 22:41:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:26.503 22:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:26.503 22:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:26.761 22:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:26.761 22:41:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:27.328 22:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:27.328 22:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:27.328 22:41:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:27.328 22:41:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:27.328 22:41:12 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:27.328 22:41:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:27.329 22:41:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:27.329 22:41:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:27.329 22:41:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:27.329 22:41:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:27.329 22:41:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:27.329 22:41:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:27.329 22:41:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:27.896 [2024-07-15 22:41:12.535321] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:27.896 [2024-07-15 22:41:12.536710] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:27.896 [2024-07-15 22:41:12.536768] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:27.896 [2024-07-15 22:41:12.536812] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:27.896 
[2024-07-15 22:41:12.536831] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:27.896 [2024-07-15 22:41:12.536841] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1500ff0 name raid_bdev1, state configuring 00:12:27.896 request: 00:12:27.896 { 00:12:27.896 "name": "raid_bdev1", 00:12:27.896 "raid_level": "concat", 00:12:27.896 "base_bdevs": [ 00:12:27.896 "malloc1", 00:12:27.896 "malloc2" 00:12:27.896 ], 00:12:27.896 "strip_size_kb": 64, 00:12:27.896 "superblock": false, 00:12:27.896 "method": "bdev_raid_create", 00:12:27.896 "req_id": 1 00:12:27.896 } 00:12:27.896 Got JSON-RPC error response 00:12:27.896 response: 00:12:27.896 { 00:12:27.896 "code": -17, 00:12:27.896 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:27.896 } 00:12:27.896 22:41:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:27.896 22:41:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:27.896 22:41:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:27.896 22:41:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:27.896 22:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:27.896 22:41:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:28.464 22:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:12:28.464 22:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:28.464 22:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:28.464 [2024-07-15 22:41:13.313292] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:28.464 [2024-07-15 22:41:13.313341] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:28.464 [2024-07-15 22:41:13.313363] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x135d7a0 00:12:28.464 [2024-07-15 22:41:13.313375] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:28.464 [2024-07-15 22:41:13.315033] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:28.464 [2024-07-15 22:41:13.315063] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:28.464 [2024-07-15 22:41:13.315134] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:28.464 [2024-07-15 22:41:13.315162] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:28.464 pt1 00:12:28.464 22:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:12:28.464 22:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:28.464 22:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:28.464 22:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:28.464 22:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:28.464 22:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:28.464 22:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:28.464 22:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:28.464 22:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:28.464 22:41:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:28.464 22:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:28.464 22:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:28.726 22:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:28.726 "name": "raid_bdev1", 00:12:28.726 "uuid": "2c4d4ccb-a1ed-4c42-91cf-632d98473003", 00:12:28.726 "strip_size_kb": 64, 00:12:28.726 "state": "configuring", 00:12:28.726 "raid_level": "concat", 00:12:28.726 "superblock": true, 00:12:28.726 "num_base_bdevs": 2, 00:12:28.726 "num_base_bdevs_discovered": 1, 00:12:28.726 "num_base_bdevs_operational": 2, 00:12:28.726 "base_bdevs_list": [ 00:12:28.726 { 00:12:28.726 "name": "pt1", 00:12:28.726 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:28.726 "is_configured": true, 00:12:28.726 "data_offset": 2048, 00:12:28.726 "data_size": 63488 00:12:28.726 }, 00:12:28.726 { 00:12:28.726 "name": null, 00:12:28.726 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:28.726 "is_configured": false, 00:12:28.726 "data_offset": 2048, 00:12:28.726 "data_size": 63488 00:12:28.726 } 00:12:28.726 ] 00:12:28.726 }' 00:12:28.726 22:41:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:28.726 22:41:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:29.661 22:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:12:29.661 22:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:29.661 22:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:29.661 22:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:29.920 [2024-07-15 22:41:14.584669] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:29.920 [2024-07-15 22:41:14.584715] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:29.920 [2024-07-15 22:41:14.584732] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14f7820 00:12:29.920 [2024-07-15 22:41:14.584745] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:29.920 [2024-07-15 22:41:14.585094] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:29.920 [2024-07-15 22:41:14.585112] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:29.920 [2024-07-15 22:41:14.585173] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:29.920 [2024-07-15 22:41:14.585193] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:29.920 [2024-07-15 22:41:14.585285] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1353ec0 00:12:29.920 [2024-07-15 22:41:14.585296] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:29.920 [2024-07-15 22:41:14.585462] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1354f00 00:12:29.920 [2024-07-15 22:41:14.585588] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1353ec0 00:12:29.920 [2024-07-15 22:41:14.585598] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1353ec0 00:12:29.920 [2024-07-15 22:41:14.585696] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:29.920 pt2 00:12:29.920 22:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 
00:12:29.920 22:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:29.920 22:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:29.920 22:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:29.920 22:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:29.920 22:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:29.920 22:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:29.920 22:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:29.920 22:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:29.920 22:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:29.920 22:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:29.920 22:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:29.920 22:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:29.920 22:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:30.179 22:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:30.179 "name": "raid_bdev1", 00:12:30.179 "uuid": "2c4d4ccb-a1ed-4c42-91cf-632d98473003", 00:12:30.179 "strip_size_kb": 64, 00:12:30.179 "state": "online", 00:12:30.179 "raid_level": "concat", 00:12:30.179 "superblock": true, 00:12:30.179 "num_base_bdevs": 2, 00:12:30.179 "num_base_bdevs_discovered": 2, 00:12:30.179 "num_base_bdevs_operational": 2, 
00:12:30.179 "base_bdevs_list": [ 00:12:30.179 { 00:12:30.179 "name": "pt1", 00:12:30.179 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:30.179 "is_configured": true, 00:12:30.179 "data_offset": 2048, 00:12:30.179 "data_size": 63488 00:12:30.179 }, 00:12:30.179 { 00:12:30.179 "name": "pt2", 00:12:30.179 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:30.179 "is_configured": true, 00:12:30.179 "data_offset": 2048, 00:12:30.179 "data_size": 63488 00:12:30.179 } 00:12:30.179 ] 00:12:30.179 }' 00:12:30.179 22:41:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:30.179 22:41:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:30.747 22:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:12:30.747 22:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:30.747 22:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:30.747 22:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:30.747 22:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:30.747 22:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:30.747 22:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:30.747 22:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:30.747 [2024-07-15 22:41:15.583564] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:30.747 22:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:30.747 "name": "raid_bdev1", 00:12:30.747 "aliases": [ 00:12:30.747 "2c4d4ccb-a1ed-4c42-91cf-632d98473003" 00:12:30.747 ], 
00:12:30.747 "product_name": "Raid Volume", 00:12:30.747 "block_size": 512, 00:12:30.747 "num_blocks": 126976, 00:12:30.747 "uuid": "2c4d4ccb-a1ed-4c42-91cf-632d98473003", 00:12:30.747 "assigned_rate_limits": { 00:12:30.747 "rw_ios_per_sec": 0, 00:12:30.747 "rw_mbytes_per_sec": 0, 00:12:30.747 "r_mbytes_per_sec": 0, 00:12:30.747 "w_mbytes_per_sec": 0 00:12:30.747 }, 00:12:30.747 "claimed": false, 00:12:30.747 "zoned": false, 00:12:30.747 "supported_io_types": { 00:12:30.747 "read": true, 00:12:30.747 "write": true, 00:12:30.747 "unmap": true, 00:12:30.747 "flush": true, 00:12:30.747 "reset": true, 00:12:30.747 "nvme_admin": false, 00:12:30.747 "nvme_io": false, 00:12:30.747 "nvme_io_md": false, 00:12:30.747 "write_zeroes": true, 00:12:30.747 "zcopy": false, 00:12:30.747 "get_zone_info": false, 00:12:30.747 "zone_management": false, 00:12:30.747 "zone_append": false, 00:12:30.747 "compare": false, 00:12:30.747 "compare_and_write": false, 00:12:30.747 "abort": false, 00:12:30.747 "seek_hole": false, 00:12:30.747 "seek_data": false, 00:12:30.747 "copy": false, 00:12:30.747 "nvme_iov_md": false 00:12:30.747 }, 00:12:30.747 "memory_domains": [ 00:12:30.747 { 00:12:30.747 "dma_device_id": "system", 00:12:30.747 "dma_device_type": 1 00:12:30.747 }, 00:12:30.747 { 00:12:30.747 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:30.747 "dma_device_type": 2 00:12:30.747 }, 00:12:30.747 { 00:12:30.747 "dma_device_id": "system", 00:12:30.747 "dma_device_type": 1 00:12:30.747 }, 00:12:30.747 { 00:12:30.747 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:30.747 "dma_device_type": 2 00:12:30.747 } 00:12:30.747 ], 00:12:30.747 "driver_specific": { 00:12:30.747 "raid": { 00:12:30.747 "uuid": "2c4d4ccb-a1ed-4c42-91cf-632d98473003", 00:12:30.747 "strip_size_kb": 64, 00:12:30.747 "state": "online", 00:12:30.747 "raid_level": "concat", 00:12:30.747 "superblock": true, 00:12:30.747 "num_base_bdevs": 2, 00:12:30.747 "num_base_bdevs_discovered": 2, 00:12:30.747 "num_base_bdevs_operational": 
2, 00:12:30.747 "base_bdevs_list": [ 00:12:30.747 { 00:12:30.747 "name": "pt1", 00:12:30.747 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:30.747 "is_configured": true, 00:12:30.747 "data_offset": 2048, 00:12:30.747 "data_size": 63488 00:12:30.747 }, 00:12:30.747 { 00:12:30.747 "name": "pt2", 00:12:30.747 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:30.747 "is_configured": true, 00:12:30.747 "data_offset": 2048, 00:12:30.747 "data_size": 63488 00:12:30.747 } 00:12:30.747 ] 00:12:30.747 } 00:12:30.747 } 00:12:30.747 }' 00:12:30.747 22:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:31.006 22:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:31.006 pt2' 00:12:31.006 22:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:31.006 22:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:31.006 22:41:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:31.574 22:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:31.574 "name": "pt1", 00:12:31.574 "aliases": [ 00:12:31.574 "00000000-0000-0000-0000-000000000001" 00:12:31.574 ], 00:12:31.574 "product_name": "passthru", 00:12:31.574 "block_size": 512, 00:12:31.574 "num_blocks": 65536, 00:12:31.574 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:31.574 "assigned_rate_limits": { 00:12:31.574 "rw_ios_per_sec": 0, 00:12:31.574 "rw_mbytes_per_sec": 0, 00:12:31.574 "r_mbytes_per_sec": 0, 00:12:31.574 "w_mbytes_per_sec": 0 00:12:31.574 }, 00:12:31.574 "claimed": true, 00:12:31.574 "claim_type": "exclusive_write", 00:12:31.574 "zoned": false, 00:12:31.574 "supported_io_types": { 00:12:31.574 "read": true, 
00:12:31.574 "write": true, 00:12:31.574 "unmap": true, 00:12:31.574 "flush": true, 00:12:31.574 "reset": true, 00:12:31.574 "nvme_admin": false, 00:12:31.574 "nvme_io": false, 00:12:31.574 "nvme_io_md": false, 00:12:31.574 "write_zeroes": true, 00:12:31.574 "zcopy": true, 00:12:31.574 "get_zone_info": false, 00:12:31.574 "zone_management": false, 00:12:31.574 "zone_append": false, 00:12:31.574 "compare": false, 00:12:31.574 "compare_and_write": false, 00:12:31.574 "abort": true, 00:12:31.574 "seek_hole": false, 00:12:31.574 "seek_data": false, 00:12:31.574 "copy": true, 00:12:31.574 "nvme_iov_md": false 00:12:31.574 }, 00:12:31.574 "memory_domains": [ 00:12:31.574 { 00:12:31.574 "dma_device_id": "system", 00:12:31.574 "dma_device_type": 1 00:12:31.574 }, 00:12:31.574 { 00:12:31.574 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:31.574 "dma_device_type": 2 00:12:31.574 } 00:12:31.574 ], 00:12:31.574 "driver_specific": { 00:12:31.574 "passthru": { 00:12:31.574 "name": "pt1", 00:12:31.574 "base_bdev_name": "malloc1" 00:12:31.574 } 00:12:31.574 } 00:12:31.574 }' 00:12:31.574 22:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:31.574 22:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:31.574 22:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:31.574 22:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:31.574 22:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:31.574 22:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:31.574 22:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:31.833 22:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:31.833 22:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:31.833 22:41:16 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:31.833 22:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:31.833 22:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:31.833 22:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:31.833 22:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:31.833 22:41:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:32.403 22:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:32.403 "name": "pt2", 00:12:32.403 "aliases": [ 00:12:32.403 "00000000-0000-0000-0000-000000000002" 00:12:32.403 ], 00:12:32.403 "product_name": "passthru", 00:12:32.403 "block_size": 512, 00:12:32.403 "num_blocks": 65536, 00:12:32.403 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:32.403 "assigned_rate_limits": { 00:12:32.403 "rw_ios_per_sec": 0, 00:12:32.403 "rw_mbytes_per_sec": 0, 00:12:32.403 "r_mbytes_per_sec": 0, 00:12:32.403 "w_mbytes_per_sec": 0 00:12:32.403 }, 00:12:32.403 "claimed": true, 00:12:32.403 "claim_type": "exclusive_write", 00:12:32.403 "zoned": false, 00:12:32.403 "supported_io_types": { 00:12:32.403 "read": true, 00:12:32.403 "write": true, 00:12:32.403 "unmap": true, 00:12:32.403 "flush": true, 00:12:32.403 "reset": true, 00:12:32.403 "nvme_admin": false, 00:12:32.403 "nvme_io": false, 00:12:32.403 "nvme_io_md": false, 00:12:32.403 "write_zeroes": true, 00:12:32.403 "zcopy": true, 00:12:32.403 "get_zone_info": false, 00:12:32.403 "zone_management": false, 00:12:32.403 "zone_append": false, 00:12:32.403 "compare": false, 00:12:32.403 "compare_and_write": false, 00:12:32.403 "abort": true, 00:12:32.403 "seek_hole": false, 00:12:32.403 "seek_data": false, 00:12:32.403 "copy": 
true, 00:12:32.403 "nvme_iov_md": false 00:12:32.403 }, 00:12:32.403 "memory_domains": [ 00:12:32.403 { 00:12:32.403 "dma_device_id": "system", 00:12:32.403 "dma_device_type": 1 00:12:32.403 }, 00:12:32.403 { 00:12:32.403 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:32.403 "dma_device_type": 2 00:12:32.403 } 00:12:32.403 ], 00:12:32.403 "driver_specific": { 00:12:32.403 "passthru": { 00:12:32.403 "name": "pt2", 00:12:32.403 "base_bdev_name": "malloc2" 00:12:32.403 } 00:12:32.403 } 00:12:32.403 }' 00:12:32.403 22:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:32.403 22:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:32.662 22:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:32.662 22:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:32.662 22:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:32.662 22:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:32.662 22:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:32.662 22:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:32.922 22:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:32.922 22:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:32.922 22:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:32.922 22:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:32.922 22:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:32.922 22:41:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 
00:12:33.511 [2024-07-15 22:41:18.230603] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:33.511 22:41:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 2c4d4ccb-a1ed-4c42-91cf-632d98473003 '!=' 2c4d4ccb-a1ed-4c42-91cf-632d98473003 ']' 00:12:33.511 22:41:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:12:33.511 22:41:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:33.511 22:41:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:33.511 22:41:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2707005 00:12:33.511 22:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2707005 ']' 00:12:33.511 22:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2707005 00:12:33.511 22:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:33.511 22:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:33.511 22:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2707005 00:12:33.511 22:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:33.511 22:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:33.511 22:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2707005' 00:12:33.511 killing process with pid 2707005 00:12:33.511 22:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2707005 00:12:33.511 [2024-07-15 22:41:18.315943] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:33.511 [2024-07-15 22:41:18.316001] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:33.511 [2024-07-15 
22:41:18.316044] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:33.511 [2024-07-15 22:41:18.316056] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1353ec0 name raid_bdev1, state offline 00:12:33.511 22:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2707005 00:12:33.511 [2024-07-15 22:41:18.332371] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:33.770 22:41:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:33.770 00:12:33.770 real 0m13.870s 00:12:33.770 user 0m25.227s 00:12:33.770 sys 0m2.241s 00:12:33.770 22:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:33.770 22:41:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:33.770 ************************************ 00:12:33.770 END TEST raid_superblock_test 00:12:33.770 ************************************ 00:12:33.770 22:41:18 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:33.770 22:41:18 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:12:33.770 22:41:18 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:33.770 22:41:18 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:33.770 22:41:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:33.770 ************************************ 00:12:33.770 START TEST raid_read_error_test 00:12:33.770 ************************************ 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 read 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:33.770 22:41:18 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.y2hMT3oL9U 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2709059 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2709059 /var/tmp/spdk-raid.sock 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2709059 ']' 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:33.770 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:33.770 22:41:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:34.028 [2024-07-15 22:41:18.690614] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:12:34.028 [2024-07-15 22:41:18.690683] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2709059 ] 00:12:34.028 [2024-07-15 22:41:18.820323] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:34.028 [2024-07-15 22:41:18.926743] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:34.287 [2024-07-15 22:41:18.994753] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:34.287 [2024-07-15 22:41:18.994793] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:34.855 22:41:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:34.855 22:41:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:34.855 22:41:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:34.855 22:41:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:35.421 BaseBdev1_malloc 00:12:35.421 22:41:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:35.679 true 00:12:35.679 22:41:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:36.245 [2024-07-15 22:41:20.963454] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:36.245 [2024-07-15 22:41:20.963503] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:12:36.245 [2024-07-15 22:41:20.963524] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c1b0d0 00:12:36.245 [2024-07-15 22:41:20.963536] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:36.245 [2024-07-15 22:41:20.965458] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:36.245 [2024-07-15 22:41:20.965487] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:36.245 BaseBdev1 00:12:36.245 22:41:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:36.245 22:41:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:36.502 BaseBdev2_malloc 00:12:36.502 22:41:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:37.068 true 00:12:37.068 22:41:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:37.326 [2024-07-15 22:41:22.079562] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:37.326 [2024-07-15 22:41:22.079612] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:37.326 [2024-07-15 22:41:22.079634] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c1f910 00:12:37.326 [2024-07-15 22:41:22.079646] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:37.326 [2024-07-15 22:41:22.081247] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:37.326 [2024-07-15 22:41:22.081275] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:37.326 BaseBdev2 00:12:37.326 22:41:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:37.892 [2024-07-15 22:41:22.580912] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:37.892 [2024-07-15 22:41:22.582310] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:37.892 [2024-07-15 22:41:22.582513] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c21320 00:12:37.892 [2024-07-15 22:41:22.582528] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:37.892 [2024-07-15 22:41:22.582732] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c22290 00:12:37.892 [2024-07-15 22:41:22.582881] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c21320 00:12:37.892 [2024-07-15 22:41:22.582892] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c21320 00:12:37.893 [2024-07-15 22:41:22.583010] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:37.893 22:41:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:37.893 22:41:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:37.893 22:41:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:37.893 22:41:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:37.893 22:41:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:37.893 22:41:22 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:37.893 22:41:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:37.893 22:41:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:37.893 22:41:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:37.893 22:41:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:37.893 22:41:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:37.893 22:41:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:38.460 22:41:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:38.460 "name": "raid_bdev1", 00:12:38.460 "uuid": "c13aec26-6047-4ce6-a384-b6bd17dd4ae6", 00:12:38.460 "strip_size_kb": 64, 00:12:38.460 "state": "online", 00:12:38.460 "raid_level": "concat", 00:12:38.460 "superblock": true, 00:12:38.460 "num_base_bdevs": 2, 00:12:38.460 "num_base_bdevs_discovered": 2, 00:12:38.460 "num_base_bdevs_operational": 2, 00:12:38.460 "base_bdevs_list": [ 00:12:38.460 { 00:12:38.460 "name": "BaseBdev1", 00:12:38.460 "uuid": "dd4260d0-d8e9-5ae8-8017-a62cef3cab3d", 00:12:38.460 "is_configured": true, 00:12:38.460 "data_offset": 2048, 00:12:38.460 "data_size": 63488 00:12:38.460 }, 00:12:38.460 { 00:12:38.460 "name": "BaseBdev2", 00:12:38.460 "uuid": "840d4793-9016-54e1-bff4-e96a159a45a0", 00:12:38.460 "is_configured": true, 00:12:38.460 "data_offset": 2048, 00:12:38.460 "data_size": 63488 00:12:38.460 } 00:12:38.460 ] 00:12:38.460 }' 00:12:38.460 22:41:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:38.460 22:41:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:39.028 22:41:23 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:39.028 22:41:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:39.028 [2024-07-15 22:41:23.800404] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c1c9b0 00:12:39.965 22:41:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:40.224 22:41:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:40.224 22:41:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:12:40.224 22:41:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:40.224 22:41:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:40.224 22:41:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:40.224 22:41:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:40.224 22:41:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:40.224 22:41:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:40.224 22:41:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:40.224 22:41:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:40.224 22:41:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:40.224 22:41:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:40.224 22:41:24 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:40.224 22:41:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:40.224 22:41:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:40.224 22:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:40.224 "name": "raid_bdev1", 00:12:40.224 "uuid": "c13aec26-6047-4ce6-a384-b6bd17dd4ae6", 00:12:40.224 "strip_size_kb": 64, 00:12:40.224 "state": "online", 00:12:40.224 "raid_level": "concat", 00:12:40.224 "superblock": true, 00:12:40.224 "num_base_bdevs": 2, 00:12:40.224 "num_base_bdevs_discovered": 2, 00:12:40.224 "num_base_bdevs_operational": 2, 00:12:40.224 "base_bdevs_list": [ 00:12:40.224 { 00:12:40.224 "name": "BaseBdev1", 00:12:40.224 "uuid": "dd4260d0-d8e9-5ae8-8017-a62cef3cab3d", 00:12:40.224 "is_configured": true, 00:12:40.224 "data_offset": 2048, 00:12:40.224 "data_size": 63488 00:12:40.224 }, 00:12:40.224 { 00:12:40.224 "name": "BaseBdev2", 00:12:40.224 "uuid": "840d4793-9016-54e1-bff4-e96a159a45a0", 00:12:40.224 "is_configured": true, 00:12:40.224 "data_offset": 2048, 00:12:40.224 "data_size": 63488 00:12:40.224 } 00:12:40.224 ] 00:12:40.224 }' 00:12:40.224 22:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:40.224 22:41:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:41.163 22:41:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:41.423 [2024-07-15 22:41:26.195167] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:41.423 [2024-07-15 22:41:26.195203] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing 
from online to offline 00:12:41.423 [2024-07-15 22:41:26.198381] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:41.423 [2024-07-15 22:41:26.198414] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:41.423 [2024-07-15 22:41:26.198443] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:41.423 [2024-07-15 22:41:26.198454] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c21320 name raid_bdev1, state offline 00:12:41.423 0 00:12:41.423 22:41:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2709059 00:12:41.423 22:41:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2709059 ']' 00:12:41.423 22:41:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2709059 00:12:41.423 22:41:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:12:41.423 22:41:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:41.423 22:41:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2709059 00:12:41.423 22:41:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:41.423 22:41:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:41.423 22:41:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2709059' 00:12:41.423 killing process with pid 2709059 00:12:41.423 22:41:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2709059 00:12:41.423 [2024-07-15 22:41:26.279621] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:41.423 22:41:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2709059 00:12:41.423 [2024-07-15 22:41:26.290469] 
bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:41.683 22:41:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.y2hMT3oL9U 00:12:41.683 22:41:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:41.683 22:41:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:41.683 22:41:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.42 00:12:41.683 22:41:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:12:41.683 22:41:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:41.683 22:41:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:41.683 22:41:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.42 != \0\.\0\0 ]] 00:12:41.683 00:12:41.683 real 0m7.909s 00:12:41.683 user 0m12.842s 00:12:41.683 sys 0m1.317s 00:12:41.683 22:41:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:41.683 22:41:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:41.683 ************************************ 00:12:41.683 END TEST raid_read_error_test 00:12:41.683 ************************************ 00:12:41.683 22:41:26 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:41.683 22:41:26 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:12:41.683 22:41:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:41.683 22:41:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:41.683 22:41:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:41.950 ************************************ 00:12:41.950 START TEST raid_write_error_test 00:12:41.950 ************************************ 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 write 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:41.950 22:41:26 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.pl7WLKrKAh 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2710130 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2710130 /var/tmp/spdk-raid.sock 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2710130 ']' 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:41.950 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:41.950 22:41:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:41.950 [2024-07-15 22:41:26.681799] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:12:41.951 [2024-07-15 22:41:26.681869] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2710130 ] 00:12:41.951 [2024-07-15 22:41:26.804404] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:42.209 [2024-07-15 22:41:26.904184] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:42.209 [2024-07-15 22:41:26.972688] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:42.209 [2024-07-15 22:41:26.972728] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:42.777 22:41:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:42.777 22:41:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:42.777 22:41:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:42.777 22:41:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:43.037 BaseBdev1_malloc 00:12:43.037 22:41:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:43.296 true 00:12:43.296 22:41:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:43.555 [2024-07-15 22:41:28.308160] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:43.555 [2024-07-15 22:41:28.308205] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:12:43.555 [2024-07-15 22:41:28.308228] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ff40d0 00:12:43.555 [2024-07-15 22:41:28.308240] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:43.555 [2024-07-15 22:41:28.310134] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:43.555 [2024-07-15 22:41:28.310170] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:43.555 BaseBdev1 00:12:43.556 22:41:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:43.556 22:41:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:43.813 BaseBdev2_malloc 00:12:43.813 22:41:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:44.071 true 00:12:44.071 22:41:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:44.328 [2024-07-15 22:41:29.043910] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:44.328 [2024-07-15 22:41:29.043957] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:44.328 [2024-07-15 22:41:29.043978] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ff8910 00:12:44.328 [2024-07-15 22:41:29.043991] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:44.328 [2024-07-15 22:41:29.045578] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:44.328 [2024-07-15 22:41:29.045606] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:44.328 BaseBdev2 00:12:44.328 22:41:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:44.587 [2024-07-15 22:41:29.272536] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:44.587 [2024-07-15 22:41:29.273832] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:44.587 [2024-07-15 22:41:29.274030] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ffa320 00:12:44.587 [2024-07-15 22:41:29.274044] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:44.587 [2024-07-15 22:41:29.274238] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ffb290 00:12:44.587 [2024-07-15 22:41:29.274383] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ffa320 00:12:44.587 [2024-07-15 22:41:29.274393] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ffa320 00:12:44.587 [2024-07-15 22:41:29.274495] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:44.587 22:41:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:44.587 22:41:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:44.587 22:41:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:44.587 22:41:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:44.587 22:41:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:44.587 22:41:29 bdev_raid.raid_write_error_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:44.587 22:41:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:44.587 22:41:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:44.587 22:41:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:44.587 22:41:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:44.587 22:41:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:44.587 22:41:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:44.846 22:41:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:44.846 "name": "raid_bdev1", 00:12:44.846 "uuid": "d22d62ba-97d6-408d-9c06-1389f0406233", 00:12:44.846 "strip_size_kb": 64, 00:12:44.846 "state": "online", 00:12:44.846 "raid_level": "concat", 00:12:44.846 "superblock": true, 00:12:44.846 "num_base_bdevs": 2, 00:12:44.846 "num_base_bdevs_discovered": 2, 00:12:44.846 "num_base_bdevs_operational": 2, 00:12:44.846 "base_bdevs_list": [ 00:12:44.846 { 00:12:44.846 "name": "BaseBdev1", 00:12:44.846 "uuid": "b38d3f97-5d59-5330-b3ad-da1c7fe51853", 00:12:44.846 "is_configured": true, 00:12:44.846 "data_offset": 2048, 00:12:44.846 "data_size": 63488 00:12:44.846 }, 00:12:44.846 { 00:12:44.846 "name": "BaseBdev2", 00:12:44.846 "uuid": "80c4e866-b896-5fdc-97d4-080f5981286a", 00:12:44.846 "is_configured": true, 00:12:44.846 "data_offset": 2048, 00:12:44.846 "data_size": 63488 00:12:44.846 } 00:12:44.846 ] 00:12:44.846 }' 00:12:44.846 22:41:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:44.846 22:41:29 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:45.413 
22:41:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:45.413 22:41:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:45.413 [2024-07-15 22:41:30.199266] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ff59b0 00:12:46.350 22:41:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:12:46.608 22:41:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:46.608 22:41:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:12:46.608 22:41:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:46.608 22:41:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:46.608 22:41:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:46.608 22:41:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:46.608 22:41:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:46.608 22:41:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:46.608 22:41:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:46.608 22:41:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:46.608 22:41:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:46.608 22:41:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:12:46.608 22:41:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:46.608 22:41:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:46.608 22:41:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:46.867 22:41:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:46.867 "name": "raid_bdev1", 00:12:46.867 "uuid": "d22d62ba-97d6-408d-9c06-1389f0406233", 00:12:46.867 "strip_size_kb": 64, 00:12:46.867 "state": "online", 00:12:46.867 "raid_level": "concat", 00:12:46.867 "superblock": true, 00:12:46.867 "num_base_bdevs": 2, 00:12:46.867 "num_base_bdevs_discovered": 2, 00:12:46.867 "num_base_bdevs_operational": 2, 00:12:46.867 "base_bdevs_list": [ 00:12:46.867 { 00:12:46.867 "name": "BaseBdev1", 00:12:46.867 "uuid": "b38d3f97-5d59-5330-b3ad-da1c7fe51853", 00:12:46.867 "is_configured": true, 00:12:46.867 "data_offset": 2048, 00:12:46.867 "data_size": 63488 00:12:46.867 }, 00:12:46.867 { 00:12:46.867 "name": "BaseBdev2", 00:12:46.867 "uuid": "80c4e866-b896-5fdc-97d4-080f5981286a", 00:12:46.867 "is_configured": true, 00:12:46.867 "data_offset": 2048, 00:12:46.867 "data_size": 63488 00:12:46.867 } 00:12:46.867 ] 00:12:46.867 }' 00:12:46.867 22:41:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:46.867 22:41:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:47.435 22:41:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:48.039 [2024-07-15 22:41:32.711674] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:48.039 [2024-07-15 22:41:32.711714] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:12:48.039 [2024-07-15 22:41:32.714876] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:48.039 [2024-07-15 22:41:32.714906] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:48.039 [2024-07-15 22:41:32.714939] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:48.039 [2024-07-15 22:41:32.714951] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ffa320 name raid_bdev1, state offline 00:12:48.039 0 00:12:48.039 22:41:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2710130 00:12:48.039 22:41:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2710130 ']' 00:12:48.039 22:41:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2710130 00:12:48.039 22:41:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:12:48.039 22:41:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:48.039 22:41:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2710130 00:12:48.039 22:41:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:48.039 22:41:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:48.039 22:41:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2710130' 00:12:48.039 killing process with pid 2710130 00:12:48.039 22:41:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2710130 00:12:48.039 [2024-07-15 22:41:32.792515] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:48.039 22:41:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2710130 
00:12:48.039 [2024-07-15 22:41:32.802882] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:48.322 22:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.pl7WLKrKAh 00:12:48.322 22:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:48.322 22:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:48.322 22:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.40 00:12:48.322 22:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:12:48.322 22:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:48.322 22:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:48.322 22:41:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.40 != \0\.\0\0 ]] 00:12:48.322 00:12:48.322 real 0m6.425s 00:12:48.322 user 0m10.074s 00:12:48.322 sys 0m1.106s 00:12:48.322 22:41:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:48.322 22:41:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:48.322 ************************************ 00:12:48.322 END TEST raid_write_error_test 00:12:48.322 ************************************ 00:12:48.322 22:41:33 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:48.322 22:41:33 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:12:48.322 22:41:33 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:12:48.322 22:41:33 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:48.322 22:41:33 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:48.322 22:41:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:48.322 ************************************ 00:12:48.322 START TEST 
raid_state_function_test 00:12:48.322 ************************************ 00:12:48.322 22:41:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 false 00:12:48.322 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:12:48.322 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:48.322 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:48.322 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:48.322 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:48.322 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:48.322 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:48.322 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:48.322 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:48.322 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:48.322 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:48.322 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:48.322 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:48.322 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:48.322 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:48.323 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:48.323 22:41:33 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:48.323 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:48.323 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:12:48.323 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:12:48.323 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:48.323 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:48.323 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2711114 00:12:48.323 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2711114' 00:12:48.323 Process raid pid: 2711114 00:12:48.323 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:48.323 22:41:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2711114 /var/tmp/spdk-raid.sock 00:12:48.323 22:41:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2711114 ']' 00:12:48.323 22:41:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:48.323 22:41:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:48.323 22:41:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:48.323 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:12:48.323 22:41:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:48.323 22:41:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:48.323 [2024-07-15 22:41:33.191539] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:12:48.323 [2024-07-15 22:41:33.191608] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:48.581 [2024-07-15 22:41:33.313642] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:48.581 [2024-07-15 22:41:33.420189] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:48.581 [2024-07-15 22:41:33.487156] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:48.581 [2024-07-15 22:41:33.487192] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:49.516 22:41:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:49.516 22:41:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:12:49.516 22:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:49.516 [2024-07-15 22:41:34.391188] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:49.516 [2024-07-15 22:41:34.391233] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:49.516 [2024-07-15 22:41:34.391243] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:49.516 [2024-07-15 22:41:34.391256] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base 
bdev BaseBdev2 doesn't exist now 00:12:49.516 22:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:49.516 22:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:49.516 22:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:49.516 22:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:49.516 22:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:49.516 22:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:49.516 22:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:49.516 22:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:49.516 22:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:49.516 22:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:49.516 22:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:49.516 22:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:49.775 22:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:49.775 "name": "Existed_Raid", 00:12:49.775 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:49.775 "strip_size_kb": 0, 00:12:49.775 "state": "configuring", 00:12:49.775 "raid_level": "raid1", 00:12:49.775 "superblock": false, 00:12:49.775 "num_base_bdevs": 2, 00:12:49.775 "num_base_bdevs_discovered": 0, 00:12:49.775 "num_base_bdevs_operational": 2, 
00:12:49.775 "base_bdevs_list": [ 00:12:49.775 { 00:12:49.775 "name": "BaseBdev1", 00:12:49.775 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:49.775 "is_configured": false, 00:12:49.775 "data_offset": 0, 00:12:49.775 "data_size": 0 00:12:49.775 }, 00:12:49.775 { 00:12:49.775 "name": "BaseBdev2", 00:12:49.775 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:49.775 "is_configured": false, 00:12:49.775 "data_offset": 0, 00:12:49.775 "data_size": 0 00:12:49.775 } 00:12:49.775 ] 00:12:49.775 }' 00:12:49.775 22:41:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:49.775 22:41:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:50.712 22:41:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:50.713 [2024-07-15 22:41:35.417790] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:50.713 [2024-07-15 22:41:35.417820] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x170da80 name Existed_Raid, state configuring 00:12:50.713 22:41:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:50.713 [2024-07-15 22:41:35.586246] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:50.713 [2024-07-15 22:41:35.586277] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:50.713 [2024-07-15 22:41:35.586286] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:50.713 [2024-07-15 22:41:35.586298] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:50.713 22:41:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:50.972 [2024-07-15 22:41:35.772705] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:50.972 BaseBdev1 00:12:50.972 22:41:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:50.972 22:41:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:50.972 22:41:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:50.972 22:41:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:50.972 22:41:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:50.972 22:41:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:50.972 22:41:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:51.231 22:41:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:51.231 [ 00:12:51.231 { 00:12:51.231 "name": "BaseBdev1", 00:12:51.231 "aliases": [ 00:12:51.231 "936c279f-3f2d-4813-91e1-9d84cebbdee2" 00:12:51.231 ], 00:12:51.231 "product_name": "Malloc disk", 00:12:51.231 "block_size": 512, 00:12:51.231 "num_blocks": 65536, 00:12:51.231 "uuid": "936c279f-3f2d-4813-91e1-9d84cebbdee2", 00:12:51.231 "assigned_rate_limits": { 00:12:51.231 "rw_ios_per_sec": 0, 00:12:51.231 "rw_mbytes_per_sec": 0, 00:12:51.231 "r_mbytes_per_sec": 0, 00:12:51.231 "w_mbytes_per_sec": 0 00:12:51.231 }, 00:12:51.231 "claimed": true, 
00:12:51.231 "claim_type": "exclusive_write", 00:12:51.231 "zoned": false, 00:12:51.231 "supported_io_types": { 00:12:51.231 "read": true, 00:12:51.231 "write": true, 00:12:51.231 "unmap": true, 00:12:51.231 "flush": true, 00:12:51.231 "reset": true, 00:12:51.231 "nvme_admin": false, 00:12:51.231 "nvme_io": false, 00:12:51.231 "nvme_io_md": false, 00:12:51.231 "write_zeroes": true, 00:12:51.231 "zcopy": true, 00:12:51.231 "get_zone_info": false, 00:12:51.231 "zone_management": false, 00:12:51.231 "zone_append": false, 00:12:51.231 "compare": false, 00:12:51.231 "compare_and_write": false, 00:12:51.231 "abort": true, 00:12:51.231 "seek_hole": false, 00:12:51.231 "seek_data": false, 00:12:51.232 "copy": true, 00:12:51.232 "nvme_iov_md": false 00:12:51.232 }, 00:12:51.232 "memory_domains": [ 00:12:51.232 { 00:12:51.232 "dma_device_id": "system", 00:12:51.232 "dma_device_type": 1 00:12:51.232 }, 00:12:51.232 { 00:12:51.232 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:51.232 "dma_device_type": 2 00:12:51.232 } 00:12:51.232 ], 00:12:51.232 "driver_specific": {} 00:12:51.232 } 00:12:51.232 ] 00:12:51.232 22:41:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:51.232 22:41:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:51.232 22:41:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:51.232 22:41:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:51.232 22:41:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:51.232 22:41:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:51.232 22:41:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:51.232 22:41:36 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:51.232 22:41:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:51.232 22:41:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:51.232 22:41:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:51.491 22:41:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:51.491 22:41:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:51.491 22:41:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:51.491 "name": "Existed_Raid", 00:12:51.491 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:51.491 "strip_size_kb": 0, 00:12:51.491 "state": "configuring", 00:12:51.491 "raid_level": "raid1", 00:12:51.491 "superblock": false, 00:12:51.491 "num_base_bdevs": 2, 00:12:51.491 "num_base_bdevs_discovered": 1, 00:12:51.491 "num_base_bdevs_operational": 2, 00:12:51.491 "base_bdevs_list": [ 00:12:51.491 { 00:12:51.491 "name": "BaseBdev1", 00:12:51.491 "uuid": "936c279f-3f2d-4813-91e1-9d84cebbdee2", 00:12:51.491 "is_configured": true, 00:12:51.491 "data_offset": 0, 00:12:51.491 "data_size": 65536 00:12:51.491 }, 00:12:51.491 { 00:12:51.491 "name": "BaseBdev2", 00:12:51.491 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:51.491 "is_configured": false, 00:12:51.491 "data_offset": 0, 00:12:51.491 "data_size": 0 00:12:51.491 } 00:12:51.491 ] 00:12:51.491 }' 00:12:51.491 22:41:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:51.491 22:41:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:52.060 22:41:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:52.319 [2024-07-15 22:41:37.080160] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:52.319 [2024-07-15 22:41:37.080196] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x170d350 name Existed_Raid, state configuring 00:12:52.319 22:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:52.578 [2024-07-15 22:41:37.248638] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:52.578 [2024-07-15 22:41:37.250139] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:52.578 [2024-07-15 22:41:37.250174] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:52.578 22:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:52.578 22:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:52.578 22:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:52.578 22:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:52.578 22:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:52.578 22:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:52.578 22:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:52.578 22:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:52.578 22:41:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:52.578 22:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:52.578 22:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:52.578 22:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:52.579 22:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:52.579 22:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:52.579 22:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:52.579 "name": "Existed_Raid", 00:12:52.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:52.579 "strip_size_kb": 0, 00:12:52.579 "state": "configuring", 00:12:52.579 "raid_level": "raid1", 00:12:52.579 "superblock": false, 00:12:52.579 "num_base_bdevs": 2, 00:12:52.579 "num_base_bdevs_discovered": 1, 00:12:52.579 "num_base_bdevs_operational": 2, 00:12:52.579 "base_bdevs_list": [ 00:12:52.579 { 00:12:52.579 "name": "BaseBdev1", 00:12:52.579 "uuid": "936c279f-3f2d-4813-91e1-9d84cebbdee2", 00:12:52.579 "is_configured": true, 00:12:52.579 "data_offset": 0, 00:12:52.579 "data_size": 65536 00:12:52.579 }, 00:12:52.579 { 00:12:52.579 "name": "BaseBdev2", 00:12:52.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:52.579 "is_configured": false, 00:12:52.579 "data_offset": 0, 00:12:52.579 "data_size": 0 00:12:52.579 } 00:12:52.579 ] 00:12:52.579 }' 00:12:52.579 22:41:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:52.579 22:41:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:53.514 22:41:38 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:53.514 [2024-07-15 22:41:38.286708] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:53.514 [2024-07-15 22:41:38.286746] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x170e000 00:12:53.514 [2024-07-15 22:41:38.286755] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:12:53.514 [2024-07-15 22:41:38.286967] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16280c0 00:12:53.514 [2024-07-15 22:41:38.287090] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x170e000 00:12:53.514 [2024-07-15 22:41:38.287100] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x170e000 00:12:53.514 [2024-07-15 22:41:38.287268] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:53.514 BaseBdev2 00:12:53.514 22:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:53.514 22:41:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:53.514 22:41:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:53.514 22:41:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:53.514 22:41:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:53.514 22:41:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:53.514 22:41:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:53.773 22:41:38 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:54.031 [ 00:12:54.031 { 00:12:54.031 "name": "BaseBdev2", 00:12:54.031 "aliases": [ 00:12:54.031 "e33bdffc-ec8c-421a-962f-0befa3305dc4" 00:12:54.031 ], 00:12:54.031 "product_name": "Malloc disk", 00:12:54.031 "block_size": 512, 00:12:54.031 "num_blocks": 65536, 00:12:54.031 "uuid": "e33bdffc-ec8c-421a-962f-0befa3305dc4", 00:12:54.031 "assigned_rate_limits": { 00:12:54.031 "rw_ios_per_sec": 0, 00:12:54.031 "rw_mbytes_per_sec": 0, 00:12:54.031 "r_mbytes_per_sec": 0, 00:12:54.031 "w_mbytes_per_sec": 0 00:12:54.031 }, 00:12:54.031 "claimed": true, 00:12:54.031 "claim_type": "exclusive_write", 00:12:54.031 "zoned": false, 00:12:54.031 "supported_io_types": { 00:12:54.031 "read": true, 00:12:54.031 "write": true, 00:12:54.031 "unmap": true, 00:12:54.031 "flush": true, 00:12:54.031 "reset": true, 00:12:54.031 "nvme_admin": false, 00:12:54.031 "nvme_io": false, 00:12:54.031 "nvme_io_md": false, 00:12:54.031 "write_zeroes": true, 00:12:54.031 "zcopy": true, 00:12:54.031 "get_zone_info": false, 00:12:54.031 "zone_management": false, 00:12:54.031 "zone_append": false, 00:12:54.031 "compare": false, 00:12:54.031 "compare_and_write": false, 00:12:54.031 "abort": true, 00:12:54.031 "seek_hole": false, 00:12:54.031 "seek_data": false, 00:12:54.031 "copy": true, 00:12:54.031 "nvme_iov_md": false 00:12:54.031 }, 00:12:54.031 "memory_domains": [ 00:12:54.031 { 00:12:54.031 "dma_device_id": "system", 00:12:54.031 "dma_device_type": 1 00:12:54.031 }, 00:12:54.031 { 00:12:54.031 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:54.031 "dma_device_type": 2 00:12:54.031 } 00:12:54.031 ], 00:12:54.031 "driver_specific": {} 00:12:54.031 } 00:12:54.031 ] 00:12:54.031 22:41:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:54.031 22:41:38 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:54.031 22:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:54.031 22:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:12:54.031 22:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:54.031 22:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:54.031 22:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:54.031 22:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:54.031 22:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:54.031 22:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:54.031 22:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:54.031 22:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:54.032 22:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:54.032 22:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.032 22:41:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:54.290 22:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:54.290 "name": "Existed_Raid", 00:12:54.290 "uuid": "b13f3b7b-91c7-436f-b150-f52bbf5d8e36", 00:12:54.290 "strip_size_kb": 0, 00:12:54.290 "state": "online", 00:12:54.290 "raid_level": "raid1", 00:12:54.290 "superblock": false, 00:12:54.290 "num_base_bdevs": 
2, 00:12:54.290 "num_base_bdevs_discovered": 2, 00:12:54.290 "num_base_bdevs_operational": 2, 00:12:54.290 "base_bdevs_list": [ 00:12:54.290 { 00:12:54.290 "name": "BaseBdev1", 00:12:54.291 "uuid": "936c279f-3f2d-4813-91e1-9d84cebbdee2", 00:12:54.291 "is_configured": true, 00:12:54.291 "data_offset": 0, 00:12:54.291 "data_size": 65536 00:12:54.291 }, 00:12:54.291 { 00:12:54.291 "name": "BaseBdev2", 00:12:54.291 "uuid": "e33bdffc-ec8c-421a-962f-0befa3305dc4", 00:12:54.291 "is_configured": true, 00:12:54.291 "data_offset": 0, 00:12:54.291 "data_size": 65536 00:12:54.291 } 00:12:54.291 ] 00:12:54.291 }' 00:12:54.291 22:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:54.291 22:41:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:54.857 22:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:54.857 22:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:54.857 22:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:54.857 22:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:54.857 22:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:54.857 22:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:54.857 22:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:54.857 22:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:55.114 [2024-07-15 22:41:39.891238] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:55.114 22:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # 
raid_bdev_info='{ 00:12:55.114 "name": "Existed_Raid", 00:12:55.114 "aliases": [ 00:12:55.114 "b13f3b7b-91c7-436f-b150-f52bbf5d8e36" 00:12:55.114 ], 00:12:55.114 "product_name": "Raid Volume", 00:12:55.114 "block_size": 512, 00:12:55.114 "num_blocks": 65536, 00:12:55.114 "uuid": "b13f3b7b-91c7-436f-b150-f52bbf5d8e36", 00:12:55.114 "assigned_rate_limits": { 00:12:55.114 "rw_ios_per_sec": 0, 00:12:55.114 "rw_mbytes_per_sec": 0, 00:12:55.114 "r_mbytes_per_sec": 0, 00:12:55.114 "w_mbytes_per_sec": 0 00:12:55.114 }, 00:12:55.114 "claimed": false, 00:12:55.114 "zoned": false, 00:12:55.115 "supported_io_types": { 00:12:55.115 "read": true, 00:12:55.115 "write": true, 00:12:55.115 "unmap": false, 00:12:55.115 "flush": false, 00:12:55.115 "reset": true, 00:12:55.115 "nvme_admin": false, 00:12:55.115 "nvme_io": false, 00:12:55.115 "nvme_io_md": false, 00:12:55.115 "write_zeroes": true, 00:12:55.115 "zcopy": false, 00:12:55.115 "get_zone_info": false, 00:12:55.115 "zone_management": false, 00:12:55.115 "zone_append": false, 00:12:55.115 "compare": false, 00:12:55.115 "compare_and_write": false, 00:12:55.115 "abort": false, 00:12:55.115 "seek_hole": false, 00:12:55.115 "seek_data": false, 00:12:55.115 "copy": false, 00:12:55.115 "nvme_iov_md": false 00:12:55.115 }, 00:12:55.115 "memory_domains": [ 00:12:55.115 { 00:12:55.115 "dma_device_id": "system", 00:12:55.115 "dma_device_type": 1 00:12:55.115 }, 00:12:55.115 { 00:12:55.115 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:55.115 "dma_device_type": 2 00:12:55.115 }, 00:12:55.115 { 00:12:55.115 "dma_device_id": "system", 00:12:55.115 "dma_device_type": 1 00:12:55.115 }, 00:12:55.115 { 00:12:55.115 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:55.115 "dma_device_type": 2 00:12:55.115 } 00:12:55.115 ], 00:12:55.115 "driver_specific": { 00:12:55.115 "raid": { 00:12:55.115 "uuid": "b13f3b7b-91c7-436f-b150-f52bbf5d8e36", 00:12:55.115 "strip_size_kb": 0, 00:12:55.115 "state": "online", 00:12:55.115 "raid_level": "raid1", 
00:12:55.115 "superblock": false, 00:12:55.115 "num_base_bdevs": 2, 00:12:55.115 "num_base_bdevs_discovered": 2, 00:12:55.115 "num_base_bdevs_operational": 2, 00:12:55.115 "base_bdevs_list": [ 00:12:55.115 { 00:12:55.115 "name": "BaseBdev1", 00:12:55.115 "uuid": "936c279f-3f2d-4813-91e1-9d84cebbdee2", 00:12:55.115 "is_configured": true, 00:12:55.115 "data_offset": 0, 00:12:55.115 "data_size": 65536 00:12:55.115 }, 00:12:55.115 { 00:12:55.115 "name": "BaseBdev2", 00:12:55.115 "uuid": "e33bdffc-ec8c-421a-962f-0befa3305dc4", 00:12:55.115 "is_configured": true, 00:12:55.115 "data_offset": 0, 00:12:55.115 "data_size": 65536 00:12:55.115 } 00:12:55.115 ] 00:12:55.115 } 00:12:55.115 } 00:12:55.115 }' 00:12:55.115 22:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:55.115 22:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:55.115 BaseBdev2' 00:12:55.115 22:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:55.115 22:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:55.115 22:41:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:55.372 22:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:55.372 "name": "BaseBdev1", 00:12:55.372 "aliases": [ 00:12:55.372 "936c279f-3f2d-4813-91e1-9d84cebbdee2" 00:12:55.372 ], 00:12:55.372 "product_name": "Malloc disk", 00:12:55.372 "block_size": 512, 00:12:55.372 "num_blocks": 65536, 00:12:55.372 "uuid": "936c279f-3f2d-4813-91e1-9d84cebbdee2", 00:12:55.372 "assigned_rate_limits": { 00:12:55.372 "rw_ios_per_sec": 0, 00:12:55.372 "rw_mbytes_per_sec": 0, 00:12:55.372 "r_mbytes_per_sec": 0, 00:12:55.372 
"w_mbytes_per_sec": 0 00:12:55.372 }, 00:12:55.372 "claimed": true, 00:12:55.372 "claim_type": "exclusive_write", 00:12:55.372 "zoned": false, 00:12:55.372 "supported_io_types": { 00:12:55.372 "read": true, 00:12:55.372 "write": true, 00:12:55.372 "unmap": true, 00:12:55.372 "flush": true, 00:12:55.372 "reset": true, 00:12:55.372 "nvme_admin": false, 00:12:55.372 "nvme_io": false, 00:12:55.372 "nvme_io_md": false, 00:12:55.372 "write_zeroes": true, 00:12:55.372 "zcopy": true, 00:12:55.372 "get_zone_info": false, 00:12:55.372 "zone_management": false, 00:12:55.372 "zone_append": false, 00:12:55.372 "compare": false, 00:12:55.372 "compare_and_write": false, 00:12:55.372 "abort": true, 00:12:55.372 "seek_hole": false, 00:12:55.372 "seek_data": false, 00:12:55.372 "copy": true, 00:12:55.372 "nvme_iov_md": false 00:12:55.372 }, 00:12:55.372 "memory_domains": [ 00:12:55.372 { 00:12:55.372 "dma_device_id": "system", 00:12:55.372 "dma_device_type": 1 00:12:55.372 }, 00:12:55.372 { 00:12:55.372 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:55.372 "dma_device_type": 2 00:12:55.372 } 00:12:55.372 ], 00:12:55.372 "driver_specific": {} 00:12:55.372 }' 00:12:55.372 22:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:55.372 22:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:55.630 22:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:55.630 22:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:55.630 22:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:55.630 22:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:55.630 22:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:55.630 22:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:55.630 
22:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:55.630 22:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:55.630 22:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:55.888 22:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:55.888 22:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:55.888 22:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:55.888 22:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:56.146 22:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:56.146 "name": "BaseBdev2", 00:12:56.146 "aliases": [ 00:12:56.146 "e33bdffc-ec8c-421a-962f-0befa3305dc4" 00:12:56.146 ], 00:12:56.146 "product_name": "Malloc disk", 00:12:56.146 "block_size": 512, 00:12:56.146 "num_blocks": 65536, 00:12:56.146 "uuid": "e33bdffc-ec8c-421a-962f-0befa3305dc4", 00:12:56.146 "assigned_rate_limits": { 00:12:56.146 "rw_ios_per_sec": 0, 00:12:56.146 "rw_mbytes_per_sec": 0, 00:12:56.146 "r_mbytes_per_sec": 0, 00:12:56.146 "w_mbytes_per_sec": 0 00:12:56.146 }, 00:12:56.146 "claimed": true, 00:12:56.146 "claim_type": "exclusive_write", 00:12:56.146 "zoned": false, 00:12:56.146 "supported_io_types": { 00:12:56.146 "read": true, 00:12:56.146 "write": true, 00:12:56.146 "unmap": true, 00:12:56.146 "flush": true, 00:12:56.146 "reset": true, 00:12:56.146 "nvme_admin": false, 00:12:56.146 "nvme_io": false, 00:12:56.146 "nvme_io_md": false, 00:12:56.146 "write_zeroes": true, 00:12:56.146 "zcopy": true, 00:12:56.146 "get_zone_info": false, 00:12:56.146 "zone_management": false, 00:12:56.146 "zone_append": false, 00:12:56.146 "compare": 
false, 00:12:56.146 "compare_and_write": false, 00:12:56.146 "abort": true, 00:12:56.146 "seek_hole": false, 00:12:56.146 "seek_data": false, 00:12:56.146 "copy": true, 00:12:56.146 "nvme_iov_md": false 00:12:56.146 }, 00:12:56.146 "memory_domains": [ 00:12:56.146 { 00:12:56.146 "dma_device_id": "system", 00:12:56.146 "dma_device_type": 1 00:12:56.146 }, 00:12:56.146 { 00:12:56.146 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:56.146 "dma_device_type": 2 00:12:56.146 } 00:12:56.146 ], 00:12:56.146 "driver_specific": {} 00:12:56.146 }' 00:12:56.146 22:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:56.146 22:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:56.146 22:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:56.146 22:41:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:56.146 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:56.146 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:56.146 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:56.405 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:56.405 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:56.405 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:56.405 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:56.664 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:56.664 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:56.923 
[2024-07-15 22:41:41.611606] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:56.923 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:56.923 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:12:56.923 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:56.923 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:56.923 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:12:56.923 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:12:56.923 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:56.923 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:56.923 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:56.923 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:56.923 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:56.923 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:56.923 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:56.923 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:56.923 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:56.923 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:56.923 22:41:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:57.182 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:57.182 "name": "Existed_Raid", 00:12:57.182 "uuid": "b13f3b7b-91c7-436f-b150-f52bbf5d8e36", 00:12:57.182 "strip_size_kb": 0, 00:12:57.182 "state": "online", 00:12:57.182 "raid_level": "raid1", 00:12:57.182 "superblock": false, 00:12:57.182 "num_base_bdevs": 2, 00:12:57.182 "num_base_bdevs_discovered": 1, 00:12:57.182 "num_base_bdevs_operational": 1, 00:12:57.182 "base_bdevs_list": [ 00:12:57.182 { 00:12:57.182 "name": null, 00:12:57.182 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:57.182 "is_configured": false, 00:12:57.182 "data_offset": 0, 00:12:57.182 "data_size": 65536 00:12:57.182 }, 00:12:57.182 { 00:12:57.182 "name": "BaseBdev2", 00:12:57.182 "uuid": "e33bdffc-ec8c-421a-962f-0befa3305dc4", 00:12:57.182 "is_configured": true, 00:12:57.182 "data_offset": 0, 00:12:57.182 "data_size": 65536 00:12:57.182 } 00:12:57.182 ] 00:12:57.182 }' 00:12:57.182 22:41:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:57.182 22:41:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:58.120 22:41:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:58.120 22:41:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:58.120 22:41:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:58.120 22:41:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:58.120 22:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:58.120 22:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- 
# '[' Existed_Raid '!=' Existed_Raid ']' 00:12:58.120 22:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:58.379 [2024-07-15 22:41:43.237866] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:58.379 [2024-07-15 22:41:43.237956] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:58.379 [2024-07-15 22:41:43.248867] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:58.379 [2024-07-15 22:41:43.248899] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:58.379 [2024-07-15 22:41:43.248910] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x170e000 name Existed_Raid, state offline 00:12:58.379 22:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:58.379 22:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:58.379 22:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:58.379 22:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:58.637 22:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:58.637 22:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:58.637 22:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:58.637 22:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2711114 00:12:58.637 22:41:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2711114 ']' 00:12:58.637 22:41:43 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2711114 00:12:58.637 22:41:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:12:58.637 22:41:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:58.637 22:41:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2711114 00:12:58.637 22:41:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:58.637 22:41:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:58.637 22:41:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2711114' 00:12:58.637 killing process with pid 2711114 00:12:58.637 22:41:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2711114 00:12:58.637 [2024-07-15 22:41:43.505403] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:58.637 22:41:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2711114 00:12:58.637 [2024-07-15 22:41:43.506277] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:58.896 00:12:58.896 real 0m10.589s 00:12:58.896 user 0m18.820s 00:12:58.896 sys 0m1.994s 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:58.896 ************************************ 00:12:58.896 END TEST raid_state_function_test 00:12:58.896 ************************************ 00:12:58.896 22:41:43 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:58.896 22:41:43 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test 
raid_state_function_test_sb raid_state_function_test raid1 2 true 00:12:58.896 22:41:43 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:58.896 22:41:43 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:58.896 22:41:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:58.896 ************************************ 00:12:58.896 START TEST raid_state_function_test_sb 00:12:58.896 ************************************ 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # 
base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:12:58.896 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:58.897 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:58.897 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2712749 00:12:58.897 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:58.897 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2712749' 00:12:58.897 Process raid pid: 2712749 00:12:58.897 22:41:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2712749 /var/tmp/spdk-raid.sock 00:12:58.897 22:41:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2712749 ']' 00:12:58.897 22:41:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:58.897 22:41:43 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:12:58.897 22:41:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:58.897 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:59.156 22:41:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:59.156 22:41:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:59.156 [2024-07-15 22:41:43.858488] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:12:59.156 [2024-07-15 22:41:43.858552] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:59.156 [2024-07-15 22:41:43.988961] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:59.415 [2024-07-15 22:41:44.092267] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:59.415 [2024-07-15 22:41:44.158950] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:59.415 [2024-07-15 22:41:44.158984] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:59.983 22:41:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:59.983 22:41:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:12:59.983 22:41:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:00.241 [2024-07-15 22:41:45.018783] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev1 00:13:00.241 [2024-07-15 22:41:45.018826] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:00.241 [2024-07-15 22:41:45.018838] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:00.241 [2024-07-15 22:41:45.018851] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:00.241 22:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:00.241 22:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:00.241 22:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:00.241 22:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:00.241 22:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:00.241 22:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:00.242 22:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:00.242 22:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:00.242 22:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:00.242 22:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:00.242 22:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:00.242 22:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:00.501 22:41:45 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:00.501 "name": "Existed_Raid", 00:13:00.501 "uuid": "c6461e15-d41a-481a-8904-2eaa02686312", 00:13:00.501 "strip_size_kb": 0, 00:13:00.501 "state": "configuring", 00:13:00.501 "raid_level": "raid1", 00:13:00.501 "superblock": true, 00:13:00.501 "num_base_bdevs": 2, 00:13:00.501 "num_base_bdevs_discovered": 0, 00:13:00.501 "num_base_bdevs_operational": 2, 00:13:00.501 "base_bdevs_list": [ 00:13:00.501 { 00:13:00.501 "name": "BaseBdev1", 00:13:00.501 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:00.501 "is_configured": false, 00:13:00.501 "data_offset": 0, 00:13:00.501 "data_size": 0 00:13:00.501 }, 00:13:00.501 { 00:13:00.501 "name": "BaseBdev2", 00:13:00.501 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:00.501 "is_configured": false, 00:13:00.501 "data_offset": 0, 00:13:00.501 "data_size": 0 00:13:00.501 } 00:13:00.501 ] 00:13:00.501 }' 00:13:00.501 22:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:00.501 22:41:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:01.069 22:41:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:01.328 [2024-07-15 22:41:46.077462] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:01.328 [2024-07-15 22:41:46.077494] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2613a80 name Existed_Raid, state configuring 00:13:01.328 22:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:01.587 [2024-07-15 22:41:46.322128] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:01.587 
[2024-07-15 22:41:46.322160] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:01.587 [2024-07-15 22:41:46.322169] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:01.587 [2024-07-15 22:41:46.322181] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:01.587 22:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:01.845 [2024-07-15 22:41:46.573890] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:01.845 BaseBdev1 00:13:01.845 22:41:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:01.845 22:41:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:01.845 22:41:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:01.845 22:41:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:01.845 22:41:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:01.845 22:41:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:01.845 22:41:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:02.104 22:41:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:02.363 [ 00:13:02.363 { 00:13:02.363 "name": "BaseBdev1", 00:13:02.363 "aliases": [ 00:13:02.363 
"73c94f43-72bb-4922-bf85-ababe1e0723b" 00:13:02.363 ], 00:13:02.363 "product_name": "Malloc disk", 00:13:02.363 "block_size": 512, 00:13:02.363 "num_blocks": 65536, 00:13:02.363 "uuid": "73c94f43-72bb-4922-bf85-ababe1e0723b", 00:13:02.363 "assigned_rate_limits": { 00:13:02.363 "rw_ios_per_sec": 0, 00:13:02.363 "rw_mbytes_per_sec": 0, 00:13:02.363 "r_mbytes_per_sec": 0, 00:13:02.363 "w_mbytes_per_sec": 0 00:13:02.363 }, 00:13:02.363 "claimed": true, 00:13:02.363 "claim_type": "exclusive_write", 00:13:02.363 "zoned": false, 00:13:02.363 "supported_io_types": { 00:13:02.363 "read": true, 00:13:02.363 "write": true, 00:13:02.363 "unmap": true, 00:13:02.363 "flush": true, 00:13:02.363 "reset": true, 00:13:02.363 "nvme_admin": false, 00:13:02.363 "nvme_io": false, 00:13:02.363 "nvme_io_md": false, 00:13:02.363 "write_zeroes": true, 00:13:02.363 "zcopy": true, 00:13:02.363 "get_zone_info": false, 00:13:02.363 "zone_management": false, 00:13:02.363 "zone_append": false, 00:13:02.363 "compare": false, 00:13:02.363 "compare_and_write": false, 00:13:02.363 "abort": true, 00:13:02.363 "seek_hole": false, 00:13:02.363 "seek_data": false, 00:13:02.363 "copy": true, 00:13:02.363 "nvme_iov_md": false 00:13:02.363 }, 00:13:02.363 "memory_domains": [ 00:13:02.363 { 00:13:02.363 "dma_device_id": "system", 00:13:02.363 "dma_device_type": 1 00:13:02.363 }, 00:13:02.363 { 00:13:02.363 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:02.363 "dma_device_type": 2 00:13:02.363 } 00:13:02.363 ], 00:13:02.363 "driver_specific": {} 00:13:02.363 } 00:13:02.363 ] 00:13:02.363 22:41:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:02.363 22:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:02.363 22:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:02.363 22:41:47 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:02.363 22:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:02.363 22:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:02.363 22:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:02.363 22:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:02.363 22:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:02.363 22:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:02.363 22:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:02.363 22:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:02.363 22:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:02.662 22:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:02.662 "name": "Existed_Raid", 00:13:02.662 "uuid": "cf5c1ea2-0eab-40e6-b427-c1eb32876bd4", 00:13:02.662 "strip_size_kb": 0, 00:13:02.662 "state": "configuring", 00:13:02.662 "raid_level": "raid1", 00:13:02.662 "superblock": true, 00:13:02.662 "num_base_bdevs": 2, 00:13:02.662 "num_base_bdevs_discovered": 1, 00:13:02.662 "num_base_bdevs_operational": 2, 00:13:02.662 "base_bdevs_list": [ 00:13:02.662 { 00:13:02.662 "name": "BaseBdev1", 00:13:02.662 "uuid": "73c94f43-72bb-4922-bf85-ababe1e0723b", 00:13:02.662 "is_configured": true, 00:13:02.662 "data_offset": 2048, 00:13:02.662 "data_size": 63488 00:13:02.662 }, 00:13:02.662 { 00:13:02.662 "name": "BaseBdev2", 00:13:02.662 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:13:02.662 "is_configured": false, 00:13:02.662 "data_offset": 0, 00:13:02.662 "data_size": 0 00:13:02.662 } 00:13:02.662 ] 00:13:02.662 }' 00:13:02.662 22:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:02.662 22:41:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:03.261 22:41:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:03.261 [2024-07-15 22:41:48.057807] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:03.261 [2024-07-15 22:41:48.057847] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2613350 name Existed_Raid, state configuring 00:13:03.261 22:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:03.520 [2024-07-15 22:41:48.238329] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:03.520 [2024-07-15 22:41:48.239810] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:03.520 [2024-07-15 22:41:48.239846] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:03.520 22:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:03.520 22:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:03.520 22:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:03.520 22:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:13:03.520 22:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:03.521 22:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:03.521 22:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:03.521 22:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:03.521 22:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:03.521 22:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:03.521 22:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:03.521 22:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:03.521 22:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:03.521 22:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:03.780 22:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:03.780 "name": "Existed_Raid", 00:13:03.780 "uuid": "258ef3d6-8fcc-445d-98ee-67be77e84256", 00:13:03.780 "strip_size_kb": 0, 00:13:03.780 "state": "configuring", 00:13:03.780 "raid_level": "raid1", 00:13:03.780 "superblock": true, 00:13:03.780 "num_base_bdevs": 2, 00:13:03.780 "num_base_bdevs_discovered": 1, 00:13:03.780 "num_base_bdevs_operational": 2, 00:13:03.780 "base_bdevs_list": [ 00:13:03.780 { 00:13:03.780 "name": "BaseBdev1", 00:13:03.780 "uuid": "73c94f43-72bb-4922-bf85-ababe1e0723b", 00:13:03.780 "is_configured": true, 00:13:03.780 "data_offset": 2048, 00:13:03.780 "data_size": 63488 00:13:03.780 }, 00:13:03.780 
{ 00:13:03.780 "name": "BaseBdev2", 00:13:03.780 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:03.780 "is_configured": false, 00:13:03.780 "data_offset": 0, 00:13:03.780 "data_size": 0 00:13:03.780 } 00:13:03.780 ] 00:13:03.780 }' 00:13:03.780 22:41:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:03.780 22:41:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:04.347 22:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:04.606 [2024-07-15 22:41:49.288504] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:04.606 [2024-07-15 22:41:49.288649] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2614000 00:13:04.606 [2024-07-15 22:41:49.288663] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:04.606 [2024-07-15 22:41:49.288832] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x252e0c0 00:13:04.606 [2024-07-15 22:41:49.288960] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2614000 00:13:04.606 [2024-07-15 22:41:49.288971] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2614000 00:13:04.606 [2024-07-15 22:41:49.289064] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:04.606 BaseBdev2 00:13:04.606 22:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:04.606 22:41:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:04.606 22:41:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:04.606 22:41:49 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:13:04.606 22:41:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:04.606 22:41:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:04.606 22:41:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:04.864 22:41:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:05.121 [ 00:13:05.121 { 00:13:05.121 "name": "BaseBdev2", 00:13:05.121 "aliases": [ 00:13:05.121 "4ff4219f-7a80-4c3c-9a31-7c69c62bcaec" 00:13:05.121 ], 00:13:05.121 "product_name": "Malloc disk", 00:13:05.121 "block_size": 512, 00:13:05.121 "num_blocks": 65536, 00:13:05.121 "uuid": "4ff4219f-7a80-4c3c-9a31-7c69c62bcaec", 00:13:05.121 "assigned_rate_limits": { 00:13:05.121 "rw_ios_per_sec": 0, 00:13:05.121 "rw_mbytes_per_sec": 0, 00:13:05.121 "r_mbytes_per_sec": 0, 00:13:05.121 "w_mbytes_per_sec": 0 00:13:05.121 }, 00:13:05.121 "claimed": true, 00:13:05.121 "claim_type": "exclusive_write", 00:13:05.121 "zoned": false, 00:13:05.121 "supported_io_types": { 00:13:05.121 "read": true, 00:13:05.121 "write": true, 00:13:05.121 "unmap": true, 00:13:05.121 "flush": true, 00:13:05.121 "reset": true, 00:13:05.121 "nvme_admin": false, 00:13:05.121 "nvme_io": false, 00:13:05.121 "nvme_io_md": false, 00:13:05.121 "write_zeroes": true, 00:13:05.121 "zcopy": true, 00:13:05.121 "get_zone_info": false, 00:13:05.121 "zone_management": false, 00:13:05.121 "zone_append": false, 00:13:05.121 "compare": false, 00:13:05.121 "compare_and_write": false, 00:13:05.121 "abort": true, 00:13:05.121 "seek_hole": false, 00:13:05.121 "seek_data": false, 00:13:05.121 "copy": true, 00:13:05.121 
"nvme_iov_md": false 00:13:05.121 }, 00:13:05.121 "memory_domains": [ 00:13:05.121 { 00:13:05.121 "dma_device_id": "system", 00:13:05.121 "dma_device_type": 1 00:13:05.121 }, 00:13:05.121 { 00:13:05.121 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:05.121 "dma_device_type": 2 00:13:05.121 } 00:13:05.121 ], 00:13:05.121 "driver_specific": {} 00:13:05.121 } 00:13:05.121 ] 00:13:05.121 22:41:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:05.121 22:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:05.121 22:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:05.121 22:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:13:05.121 22:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:05.121 22:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:05.121 22:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:05.121 22:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:05.121 22:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:05.121 22:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:05.121 22:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:05.121 22:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:05.121 22:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:05.121 22:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:05.121 22:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:05.121 22:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:05.121 "name": "Existed_Raid", 00:13:05.121 "uuid": "258ef3d6-8fcc-445d-98ee-67be77e84256", 00:13:05.121 "strip_size_kb": 0, 00:13:05.121 "state": "online", 00:13:05.121 "raid_level": "raid1", 00:13:05.121 "superblock": true, 00:13:05.121 "num_base_bdevs": 2, 00:13:05.121 "num_base_bdevs_discovered": 2, 00:13:05.121 "num_base_bdevs_operational": 2, 00:13:05.121 "base_bdevs_list": [ 00:13:05.121 { 00:13:05.121 "name": "BaseBdev1", 00:13:05.121 "uuid": "73c94f43-72bb-4922-bf85-ababe1e0723b", 00:13:05.121 "is_configured": true, 00:13:05.121 "data_offset": 2048, 00:13:05.121 "data_size": 63488 00:13:05.121 }, 00:13:05.121 { 00:13:05.121 "name": "BaseBdev2", 00:13:05.121 "uuid": "4ff4219f-7a80-4c3c-9a31-7c69c62bcaec", 00:13:05.122 "is_configured": true, 00:13:05.122 "data_offset": 2048, 00:13:05.122 "data_size": 63488 00:13:05.122 } 00:13:05.122 ] 00:13:05.122 }' 00:13:05.122 22:41:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:05.122 22:41:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:06.055 22:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:06.055 22:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:06.055 22:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:06.055 22:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:06.055 22:41:50 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:06.055 22:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:06.055 22:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:06.055 22:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:06.055 [2024-07-15 22:41:50.752651] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:06.055 22:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:06.055 "name": "Existed_Raid", 00:13:06.056 "aliases": [ 00:13:06.056 "258ef3d6-8fcc-445d-98ee-67be77e84256" 00:13:06.056 ], 00:13:06.056 "product_name": "Raid Volume", 00:13:06.056 "block_size": 512, 00:13:06.056 "num_blocks": 63488, 00:13:06.056 "uuid": "258ef3d6-8fcc-445d-98ee-67be77e84256", 00:13:06.056 "assigned_rate_limits": { 00:13:06.056 "rw_ios_per_sec": 0, 00:13:06.056 "rw_mbytes_per_sec": 0, 00:13:06.056 "r_mbytes_per_sec": 0, 00:13:06.056 "w_mbytes_per_sec": 0 00:13:06.056 }, 00:13:06.056 "claimed": false, 00:13:06.056 "zoned": false, 00:13:06.056 "supported_io_types": { 00:13:06.056 "read": true, 00:13:06.056 "write": true, 00:13:06.056 "unmap": false, 00:13:06.056 "flush": false, 00:13:06.056 "reset": true, 00:13:06.056 "nvme_admin": false, 00:13:06.056 "nvme_io": false, 00:13:06.056 "nvme_io_md": false, 00:13:06.056 "write_zeroes": true, 00:13:06.056 "zcopy": false, 00:13:06.056 "get_zone_info": false, 00:13:06.056 "zone_management": false, 00:13:06.056 "zone_append": false, 00:13:06.056 "compare": false, 00:13:06.056 "compare_and_write": false, 00:13:06.056 "abort": false, 00:13:06.056 "seek_hole": false, 00:13:06.056 "seek_data": false, 00:13:06.056 "copy": false, 00:13:06.056 "nvme_iov_md": false 00:13:06.056 }, 00:13:06.056 "memory_domains": [ 00:13:06.056 { 
00:13:06.056 "dma_device_id": "system", 00:13:06.056 "dma_device_type": 1 00:13:06.056 }, 00:13:06.056 { 00:13:06.056 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:06.056 "dma_device_type": 2 00:13:06.056 }, 00:13:06.056 { 00:13:06.056 "dma_device_id": "system", 00:13:06.056 "dma_device_type": 1 00:13:06.056 }, 00:13:06.056 { 00:13:06.056 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:06.056 "dma_device_type": 2 00:13:06.056 } 00:13:06.056 ], 00:13:06.056 "driver_specific": { 00:13:06.056 "raid": { 00:13:06.056 "uuid": "258ef3d6-8fcc-445d-98ee-67be77e84256", 00:13:06.056 "strip_size_kb": 0, 00:13:06.056 "state": "online", 00:13:06.056 "raid_level": "raid1", 00:13:06.056 "superblock": true, 00:13:06.056 "num_base_bdevs": 2, 00:13:06.056 "num_base_bdevs_discovered": 2, 00:13:06.056 "num_base_bdevs_operational": 2, 00:13:06.056 "base_bdevs_list": [ 00:13:06.056 { 00:13:06.056 "name": "BaseBdev1", 00:13:06.056 "uuid": "73c94f43-72bb-4922-bf85-ababe1e0723b", 00:13:06.056 "is_configured": true, 00:13:06.056 "data_offset": 2048, 00:13:06.056 "data_size": 63488 00:13:06.056 }, 00:13:06.056 { 00:13:06.056 "name": "BaseBdev2", 00:13:06.056 "uuid": "4ff4219f-7a80-4c3c-9a31-7c69c62bcaec", 00:13:06.056 "is_configured": true, 00:13:06.056 "data_offset": 2048, 00:13:06.056 "data_size": 63488 00:13:06.056 } 00:13:06.056 ] 00:13:06.056 } 00:13:06.056 } 00:13:06.056 }' 00:13:06.056 22:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:06.056 22:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:06.056 BaseBdev2' 00:13:06.056 22:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:06.056 22:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev1 00:13:06.056 22:41:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:06.315 22:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:06.315 "name": "BaseBdev1", 00:13:06.315 "aliases": [ 00:13:06.315 "73c94f43-72bb-4922-bf85-ababe1e0723b" 00:13:06.315 ], 00:13:06.315 "product_name": "Malloc disk", 00:13:06.315 "block_size": 512, 00:13:06.315 "num_blocks": 65536, 00:13:06.315 "uuid": "73c94f43-72bb-4922-bf85-ababe1e0723b", 00:13:06.315 "assigned_rate_limits": { 00:13:06.315 "rw_ios_per_sec": 0, 00:13:06.315 "rw_mbytes_per_sec": 0, 00:13:06.315 "r_mbytes_per_sec": 0, 00:13:06.315 "w_mbytes_per_sec": 0 00:13:06.315 }, 00:13:06.315 "claimed": true, 00:13:06.315 "claim_type": "exclusive_write", 00:13:06.315 "zoned": false, 00:13:06.315 "supported_io_types": { 00:13:06.315 "read": true, 00:13:06.315 "write": true, 00:13:06.315 "unmap": true, 00:13:06.315 "flush": true, 00:13:06.315 "reset": true, 00:13:06.315 "nvme_admin": false, 00:13:06.315 "nvme_io": false, 00:13:06.315 "nvme_io_md": false, 00:13:06.315 "write_zeroes": true, 00:13:06.315 "zcopy": true, 00:13:06.315 "get_zone_info": false, 00:13:06.315 "zone_management": false, 00:13:06.315 "zone_append": false, 00:13:06.315 "compare": false, 00:13:06.315 "compare_and_write": false, 00:13:06.315 "abort": true, 00:13:06.315 "seek_hole": false, 00:13:06.315 "seek_data": false, 00:13:06.315 "copy": true, 00:13:06.315 "nvme_iov_md": false 00:13:06.315 }, 00:13:06.315 "memory_domains": [ 00:13:06.315 { 00:13:06.315 "dma_device_id": "system", 00:13:06.315 "dma_device_type": 1 00:13:06.315 }, 00:13:06.315 { 00:13:06.315 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:06.315 "dma_device_type": 2 00:13:06.315 } 00:13:06.315 ], 00:13:06.315 "driver_specific": {} 00:13:06.315 }' 00:13:06.315 22:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:06.315 22:41:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:06.315 22:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:06.315 22:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:06.574 22:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:06.574 22:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:06.574 22:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:06.574 22:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:06.574 22:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:06.574 22:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:06.574 22:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:06.833 22:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:06.833 22:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:06.833 22:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:06.833 22:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:07.092 22:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:07.092 "name": "BaseBdev2", 00:13:07.092 "aliases": [ 00:13:07.092 "4ff4219f-7a80-4c3c-9a31-7c69c62bcaec" 00:13:07.092 ], 00:13:07.092 "product_name": "Malloc disk", 00:13:07.092 "block_size": 512, 00:13:07.092 "num_blocks": 65536, 00:13:07.092 "uuid": "4ff4219f-7a80-4c3c-9a31-7c69c62bcaec", 00:13:07.092 
"assigned_rate_limits": { 00:13:07.092 "rw_ios_per_sec": 0, 00:13:07.092 "rw_mbytes_per_sec": 0, 00:13:07.092 "r_mbytes_per_sec": 0, 00:13:07.092 "w_mbytes_per_sec": 0 00:13:07.092 }, 00:13:07.092 "claimed": true, 00:13:07.092 "claim_type": "exclusive_write", 00:13:07.092 "zoned": false, 00:13:07.092 "supported_io_types": { 00:13:07.092 "read": true, 00:13:07.092 "write": true, 00:13:07.092 "unmap": true, 00:13:07.092 "flush": true, 00:13:07.092 "reset": true, 00:13:07.092 "nvme_admin": false, 00:13:07.092 "nvme_io": false, 00:13:07.092 "nvme_io_md": false, 00:13:07.092 "write_zeroes": true, 00:13:07.092 "zcopy": true, 00:13:07.092 "get_zone_info": false, 00:13:07.092 "zone_management": false, 00:13:07.092 "zone_append": false, 00:13:07.092 "compare": false, 00:13:07.092 "compare_and_write": false, 00:13:07.092 "abort": true, 00:13:07.092 "seek_hole": false, 00:13:07.092 "seek_data": false, 00:13:07.092 "copy": true, 00:13:07.092 "nvme_iov_md": false 00:13:07.092 }, 00:13:07.092 "memory_domains": [ 00:13:07.092 { 00:13:07.092 "dma_device_id": "system", 00:13:07.092 "dma_device_type": 1 00:13:07.092 }, 00:13:07.092 { 00:13:07.092 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:07.092 "dma_device_type": 2 00:13:07.092 } 00:13:07.092 ], 00:13:07.092 "driver_specific": {} 00:13:07.092 }' 00:13:07.092 22:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:07.092 22:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:07.092 22:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:07.092 22:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:07.092 22:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:07.092 22:41:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:07.092 22:41:51 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:07.351 22:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:07.351 22:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:07.351 22:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:07.351 22:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:07.352 22:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:07.352 22:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:07.611 [2024-07-15 22:41:52.344677] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:07.611 22:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:07.611 22:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:13:07.611 22:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:07.611 22:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:13:07.611 22:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:13:07.611 22:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:13:07.611 22:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:07.611 22:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:07.611 22:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:07.611 22:41:52 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:07.611 22:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:07.611 22:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:07.611 22:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:07.611 22:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:07.611 22:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:07.611 22:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:07.611 22:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:07.869 22:41:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:07.869 "name": "Existed_Raid", 00:13:07.869 "uuid": "258ef3d6-8fcc-445d-98ee-67be77e84256", 00:13:07.869 "strip_size_kb": 0, 00:13:07.869 "state": "online", 00:13:07.869 "raid_level": "raid1", 00:13:07.869 "superblock": true, 00:13:07.869 "num_base_bdevs": 2, 00:13:07.869 "num_base_bdevs_discovered": 1, 00:13:07.869 "num_base_bdevs_operational": 1, 00:13:07.869 "base_bdevs_list": [ 00:13:07.869 { 00:13:07.869 "name": null, 00:13:07.869 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:07.869 "is_configured": false, 00:13:07.869 "data_offset": 2048, 00:13:07.869 "data_size": 63488 00:13:07.869 }, 00:13:07.869 { 00:13:07.869 "name": "BaseBdev2", 00:13:07.869 "uuid": "4ff4219f-7a80-4c3c-9a31-7c69c62bcaec", 00:13:07.869 "is_configured": true, 00:13:07.869 "data_offset": 2048, 00:13:07.869 "data_size": 63488 00:13:07.869 } 00:13:07.869 ] 00:13:07.869 }' 00:13:07.869 22:41:52 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:07.869 22:41:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:08.437 22:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:08.437 22:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:08.437 22:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:08.437 22:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:08.695 22:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:08.695 22:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:08.695 22:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:08.954 [2024-07-15 22:41:53.686164] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:08.954 [2024-07-15 22:41:53.686248] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:08.954 [2024-07-15 22:41:53.697208] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:08.954 [2024-07-15 22:41:53.697245] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:08.954 [2024-07-15 22:41:53.697257] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2614000 name Existed_Raid, state offline 00:13:08.954 22:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:08.954 22:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < 
num_base_bdevs )) 00:13:08.954 22:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:08.954 22:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:09.214 22:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:09.214 22:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:09.214 22:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:13:09.214 22:41:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2712749 00:13:09.214 22:41:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2712749 ']' 00:13:09.214 22:41:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2712749 00:13:09.214 22:41:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:13:09.214 22:41:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:09.214 22:41:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2712749 00:13:09.214 22:41:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:09.214 22:41:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:09.214 22:41:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2712749' 00:13:09.214 killing process with pid 2712749 00:13:09.214 22:41:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2712749 00:13:09.214 [2024-07-15 22:41:54.018290] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: 
raid_bdev_fini_start 00:13:09.214 22:41:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2712749 00:13:09.214 [2024-07-15 22:41:54.019272] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:09.473 22:41:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:13:09.473 00:13:09.473 real 0m10.448s 00:13:09.473 user 0m18.628s 00:13:09.473 sys 0m1.900s 00:13:09.473 22:41:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:09.473 22:41:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:09.473 ************************************ 00:13:09.473 END TEST raid_state_function_test_sb 00:13:09.473 ************************************ 00:13:09.473 22:41:54 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:09.473 22:41:54 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:13:09.473 22:41:54 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:13:09.473 22:41:54 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:09.473 22:41:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:09.473 ************************************ 00:13:09.473 START TEST raid_superblock_test 00:13:09.473 ************************************ 00:13:09.473 22:41:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:13:09.473 22:41:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:13:09.473 22:41:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:13:09.473 22:41:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:13:09.473 22:41:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:13:09.473 22:41:54 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:13:09.473 22:41:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:13:09.473 22:41:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:13:09.473 22:41:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:13:09.473 22:41:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:13:09.473 22:41:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:13:09.473 22:41:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:13:09.473 22:41:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:13:09.473 22:41:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:13:09.473 22:41:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:13:09.473 22:41:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:13:09.473 22:41:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2714377 00:13:09.473 22:41:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2714377 /var/tmp/spdk-raid.sock 00:13:09.473 22:41:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:13:09.473 22:41:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2714377 ']' 00:13:09.474 22:41:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:09.474 22:41:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:09.474 22:41:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start 
up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:09.474 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:09.474 22:41:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:09.474 22:41:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:09.733 [2024-07-15 22:41:54.384232] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:13:09.733 [2024-07-15 22:41:54.384297] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2714377 ] 00:13:09.733 [2024-07-15 22:41:54.511436] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:09.733 [2024-07-15 22:41:54.617197] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:09.993 [2024-07-15 22:41:54.680356] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:09.993 [2024-07-15 22:41:54.680388] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:10.561 22:41:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:10.561 22:41:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:13:10.561 22:41:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:13:10.561 22:41:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:10.561 22:41:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:13:10.561 22:41:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:13:10.561 22:41:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:13:10.561 
22:41:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:10.561 22:41:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:10.561 22:41:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:10.561 22:41:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:13:10.820 malloc1 00:13:10.820 22:41:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:11.079 [2024-07-15 22:41:55.743231] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:11.079 [2024-07-15 22:41:55.743280] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:11.079 [2024-07-15 22:41:55.743301] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x151a570 00:13:11.079 [2024-07-15 22:41:55.743313] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:11.079 [2024-07-15 22:41:55.745052] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:11.079 [2024-07-15 22:41:55.745080] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:11.079 pt1 00:13:11.079 22:41:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:11.079 22:41:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:11.079 22:41:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:13:11.079 22:41:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:13:11.079 22:41:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:13:11.079 22:41:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:11.079 22:41:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:11.079 22:41:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:11.079 22:41:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:13:11.339 malloc2 00:13:11.339 22:41:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:11.339 [2024-07-15 22:41:56.230444] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:11.339 [2024-07-15 22:41:56.230488] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:11.339 [2024-07-15 22:41:56.230505] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x151b970 00:13:11.339 [2024-07-15 22:41:56.230518] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:11.339 [2024-07-15 22:41:56.232137] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:11.339 [2024-07-15 22:41:56.232165] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:11.339 pt2 00:13:11.622 22:41:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:11.622 22:41:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:11.622 22:41:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:13:11.622 [2024-07-15 22:41:56.463075] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:11.622 [2024-07-15 22:41:56.464553] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:11.622 [2024-07-15 22:41:56.464702] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16be270 00:13:11.622 [2024-07-15 22:41:56.464715] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:11.622 [2024-07-15 22:41:56.464917] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15120e0 00:13:11.622 [2024-07-15 22:41:56.465071] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16be270 00:13:11.622 [2024-07-15 22:41:56.465081] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16be270 00:13:11.622 [2024-07-15 22:41:56.465180] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:11.622 22:41:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:11.622 22:41:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:11.622 22:41:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:11.623 22:41:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:11.623 22:41:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:11.623 22:41:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:11.623 22:41:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:11.623 22:41:56 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:11.623 22:41:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:11.623 22:41:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:11.623 22:41:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:11.623 22:41:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:11.882 22:41:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:11.882 "name": "raid_bdev1", 00:13:11.882 "uuid": "f4eba17e-6341-4c86-aa69-996befa6749d", 00:13:11.882 "strip_size_kb": 0, 00:13:11.882 "state": "online", 00:13:11.882 "raid_level": "raid1", 00:13:11.882 "superblock": true, 00:13:11.882 "num_base_bdevs": 2, 00:13:11.882 "num_base_bdevs_discovered": 2, 00:13:11.882 "num_base_bdevs_operational": 2, 00:13:11.882 "base_bdevs_list": [ 00:13:11.882 { 00:13:11.882 "name": "pt1", 00:13:11.882 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:11.882 "is_configured": true, 00:13:11.882 "data_offset": 2048, 00:13:11.882 "data_size": 63488 00:13:11.882 }, 00:13:11.882 { 00:13:11.882 "name": "pt2", 00:13:11.882 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:11.882 "is_configured": true, 00:13:11.882 "data_offset": 2048, 00:13:11.882 "data_size": 63488 00:13:11.882 } 00:13:11.882 ] 00:13:11.882 }' 00:13:11.882 22:41:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:11.882 22:41:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:12.450 22:41:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:13:12.450 22:41:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:12.450 22:41:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:12.450 22:41:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:12.450 22:41:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:12.450 22:41:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:12.450 22:41:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:12.450 22:41:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:12.710 [2024-07-15 22:41:57.489989] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:12.710 22:41:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:12.710 "name": "raid_bdev1", 00:13:12.710 "aliases": [ 00:13:12.710 "f4eba17e-6341-4c86-aa69-996befa6749d" 00:13:12.710 ], 00:13:12.710 "product_name": "Raid Volume", 00:13:12.710 "block_size": 512, 00:13:12.710 "num_blocks": 63488, 00:13:12.710 "uuid": "f4eba17e-6341-4c86-aa69-996befa6749d", 00:13:12.710 "assigned_rate_limits": { 00:13:12.710 "rw_ios_per_sec": 0, 00:13:12.710 "rw_mbytes_per_sec": 0, 00:13:12.710 "r_mbytes_per_sec": 0, 00:13:12.710 "w_mbytes_per_sec": 0 00:13:12.710 }, 00:13:12.710 "claimed": false, 00:13:12.710 "zoned": false, 00:13:12.710 "supported_io_types": { 00:13:12.710 "read": true, 00:13:12.710 "write": true, 00:13:12.710 "unmap": false, 00:13:12.710 "flush": false, 00:13:12.710 "reset": true, 00:13:12.710 "nvme_admin": false, 00:13:12.710 "nvme_io": false, 00:13:12.710 "nvme_io_md": false, 00:13:12.710 "write_zeroes": true, 00:13:12.710 "zcopy": false, 00:13:12.710 "get_zone_info": false, 00:13:12.710 "zone_management": false, 00:13:12.710 "zone_append": false, 00:13:12.710 "compare": false, 00:13:12.710 "compare_and_write": false, 00:13:12.710 
"abort": false, 00:13:12.710 "seek_hole": false, 00:13:12.710 "seek_data": false, 00:13:12.710 "copy": false, 00:13:12.710 "nvme_iov_md": false 00:13:12.710 }, 00:13:12.710 "memory_domains": [ 00:13:12.710 { 00:13:12.710 "dma_device_id": "system", 00:13:12.710 "dma_device_type": 1 00:13:12.710 }, 00:13:12.710 { 00:13:12.710 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:12.710 "dma_device_type": 2 00:13:12.710 }, 00:13:12.710 { 00:13:12.710 "dma_device_id": "system", 00:13:12.710 "dma_device_type": 1 00:13:12.710 }, 00:13:12.710 { 00:13:12.710 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:12.710 "dma_device_type": 2 00:13:12.710 } 00:13:12.710 ], 00:13:12.710 "driver_specific": { 00:13:12.710 "raid": { 00:13:12.710 "uuid": "f4eba17e-6341-4c86-aa69-996befa6749d", 00:13:12.710 "strip_size_kb": 0, 00:13:12.710 "state": "online", 00:13:12.710 "raid_level": "raid1", 00:13:12.710 "superblock": true, 00:13:12.710 "num_base_bdevs": 2, 00:13:12.710 "num_base_bdevs_discovered": 2, 00:13:12.710 "num_base_bdevs_operational": 2, 00:13:12.710 "base_bdevs_list": [ 00:13:12.710 { 00:13:12.710 "name": "pt1", 00:13:12.710 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:12.710 "is_configured": true, 00:13:12.710 "data_offset": 2048, 00:13:12.710 "data_size": 63488 00:13:12.710 }, 00:13:12.710 { 00:13:12.710 "name": "pt2", 00:13:12.710 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:12.710 "is_configured": true, 00:13:12.710 "data_offset": 2048, 00:13:12.710 "data_size": 63488 00:13:12.710 } 00:13:12.710 ] 00:13:12.710 } 00:13:12.710 } 00:13:12.710 }' 00:13:12.710 22:41:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:12.710 22:41:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:12.710 pt2' 00:13:12.710 22:41:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:12.710 22:41:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:12.710 22:41:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:12.970 22:41:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:12.970 "name": "pt1", 00:13:12.970 "aliases": [ 00:13:12.970 "00000000-0000-0000-0000-000000000001" 00:13:12.970 ], 00:13:12.970 "product_name": "passthru", 00:13:12.970 "block_size": 512, 00:13:12.970 "num_blocks": 65536, 00:13:12.970 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:12.970 "assigned_rate_limits": { 00:13:12.970 "rw_ios_per_sec": 0, 00:13:12.970 "rw_mbytes_per_sec": 0, 00:13:12.970 "r_mbytes_per_sec": 0, 00:13:12.970 "w_mbytes_per_sec": 0 00:13:12.970 }, 00:13:12.970 "claimed": true, 00:13:12.970 "claim_type": "exclusive_write", 00:13:12.970 "zoned": false, 00:13:12.970 "supported_io_types": { 00:13:12.970 "read": true, 00:13:12.970 "write": true, 00:13:12.970 "unmap": true, 00:13:12.970 "flush": true, 00:13:12.970 "reset": true, 00:13:12.970 "nvme_admin": false, 00:13:12.970 "nvme_io": false, 00:13:12.970 "nvme_io_md": false, 00:13:12.970 "write_zeroes": true, 00:13:12.970 "zcopy": true, 00:13:12.970 "get_zone_info": false, 00:13:12.970 "zone_management": false, 00:13:12.970 "zone_append": false, 00:13:12.970 "compare": false, 00:13:12.970 "compare_and_write": false, 00:13:12.970 "abort": true, 00:13:12.970 "seek_hole": false, 00:13:12.970 "seek_data": false, 00:13:12.970 "copy": true, 00:13:12.970 "nvme_iov_md": false 00:13:12.970 }, 00:13:12.970 "memory_domains": [ 00:13:12.970 { 00:13:12.970 "dma_device_id": "system", 00:13:12.970 "dma_device_type": 1 00:13:12.970 }, 00:13:12.970 { 00:13:12.970 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:12.970 "dma_device_type": 2 00:13:12.970 } 00:13:12.970 ], 00:13:12.970 "driver_specific": { 00:13:12.970 "passthru": { 00:13:12.970 
"name": "pt1", 00:13:12.970 "base_bdev_name": "malloc1" 00:13:12.970 } 00:13:12.970 } 00:13:12.970 }' 00:13:12.970 22:41:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:12.970 22:41:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:13.230 22:41:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:13.230 22:41:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:13.230 22:41:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:13.230 22:41:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:13.230 22:41:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:13.230 22:41:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:13.230 22:41:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:13.230 22:41:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:13.490 22:41:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:13.490 22:41:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:13.490 22:41:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:13.490 22:41:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:13.490 22:41:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:13.749 22:41:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:13.749 "name": "pt2", 00:13:13.749 "aliases": [ 00:13:13.749 "00000000-0000-0000-0000-000000000002" 00:13:13.749 ], 00:13:13.749 "product_name": "passthru", 00:13:13.749 "block_size": 512, 00:13:13.749 
"num_blocks": 65536, 00:13:13.749 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:13.749 "assigned_rate_limits": { 00:13:13.749 "rw_ios_per_sec": 0, 00:13:13.749 "rw_mbytes_per_sec": 0, 00:13:13.749 "r_mbytes_per_sec": 0, 00:13:13.749 "w_mbytes_per_sec": 0 00:13:13.749 }, 00:13:13.749 "claimed": true, 00:13:13.749 "claim_type": "exclusive_write", 00:13:13.749 "zoned": false, 00:13:13.749 "supported_io_types": { 00:13:13.749 "read": true, 00:13:13.749 "write": true, 00:13:13.749 "unmap": true, 00:13:13.749 "flush": true, 00:13:13.749 "reset": true, 00:13:13.749 "nvme_admin": false, 00:13:13.749 "nvme_io": false, 00:13:13.749 "nvme_io_md": false, 00:13:13.749 "write_zeroes": true, 00:13:13.749 "zcopy": true, 00:13:13.749 "get_zone_info": false, 00:13:13.749 "zone_management": false, 00:13:13.749 "zone_append": false, 00:13:13.749 "compare": false, 00:13:13.749 "compare_and_write": false, 00:13:13.749 "abort": true, 00:13:13.749 "seek_hole": false, 00:13:13.749 "seek_data": false, 00:13:13.749 "copy": true, 00:13:13.749 "nvme_iov_md": false 00:13:13.749 }, 00:13:13.749 "memory_domains": [ 00:13:13.749 { 00:13:13.749 "dma_device_id": "system", 00:13:13.749 "dma_device_type": 1 00:13:13.749 }, 00:13:13.749 { 00:13:13.749 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:13.749 "dma_device_type": 2 00:13:13.749 } 00:13:13.749 ], 00:13:13.750 "driver_specific": { 00:13:13.750 "passthru": { 00:13:13.750 "name": "pt2", 00:13:13.750 "base_bdev_name": "malloc2" 00:13:13.750 } 00:13:13.750 } 00:13:13.750 }' 00:13:13.750 22:41:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:13.750 22:41:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:13.750 22:41:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:13.750 22:41:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:13.750 22:41:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # 
jq .md_size 00:13:13.750 22:41:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:14.009 22:41:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:14.009 22:41:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:14.009 22:41:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:14.009 22:41:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:14.009 22:41:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:14.009 22:41:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:14.009 22:41:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:13:14.009 22:41:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:14.268 [2024-07-15 22:41:58.985942] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:14.268 22:41:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=f4eba17e-6341-4c86-aa69-996befa6749d 00:13:14.268 22:41:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z f4eba17e-6341-4c86-aa69-996befa6749d ']' 00:13:14.268 22:41:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:14.527 [2024-07-15 22:41:59.238368] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:14.527 [2024-07-15 22:41:59.238388] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:14.527 [2024-07-15 22:41:59.238442] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:14.527 [2024-07-15 
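Editorial note: the `@205`-`@208` checks above (`[[ 512 == 512 ]]`, `[[ null == null ]]`) work because `jq` prints the literal string `null` for keys absent from the passthru bdev object — `md_size`, `md_interleave`, and `dif_type` are not reported for these bdevs. A minimal standalone sketch of that pattern (abbreviated sample object; requires `jq`):

```shell
# jq emits "null" for missing keys, which is what the
# [[ null == null ]] comparisons in bdev_raid.sh@206-@208 rely on.
info='{ "name": "pt1", "block_size": 512 }'
block_size=$(printf '%s' "$info" | jq .block_size)
md_size=$(printf '%s' "$info" | jq .md_size)
[ "$block_size" = 512 ] && [ "$md_size" = null ] && echo "checks pass"
```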
22:41:59.238498] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:14.527 [2024-07-15 22:41:59.238511] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16be270 name raid_bdev1, state offline 00:13:14.527 22:41:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:14.527 22:41:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:13:14.787 22:41:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:13:14.787 22:41:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:13:14.787 22:41:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:14.787 22:41:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:15.047 22:41:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:15.047 22:41:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:15.306 22:41:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:13:15.306 22:41:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:13:15.566 22:42:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:13:15.566 22:42:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:13:15.566 22:42:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:13:15.566 22:42:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:13:15.566 22:42:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:15.566 22:42:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:15.566 22:42:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:15.566 22:42:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:15.566 22:42:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:15.566 22:42:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:15.566 22:42:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:15.566 22:42:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:13:15.566 22:42:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:13:15.825 [2024-07-15 22:42:00.726252] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:13:15.825 [2024-07-15 22:42:00.727648] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:13:15.825 [2024-07-15 22:42:00.727709] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:13:15.825 [2024-07-15 22:42:00.727751] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:13:15.825 [2024-07-15 22:42:00.727770] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:15.825 [2024-07-15 22:42:00.727781] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16bdff0 name raid_bdev1, state configuring 00:13:15.825 request: 00:13:15.825 { 00:13:15.825 "name": "raid_bdev1", 00:13:15.825 "raid_level": "raid1", 00:13:15.825 "base_bdevs": [ 00:13:15.825 "malloc1", 00:13:15.825 "malloc2" 00:13:15.825 ], 00:13:15.825 "superblock": false, 00:13:15.825 "method": "bdev_raid_create", 00:13:15.825 "req_id": 1 00:13:15.825 } 00:13:15.825 Got JSON-RPC error response 00:13:15.825 response: 00:13:15.825 { 00:13:15.825 "code": -17, 00:13:15.825 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:13:15.825 } 00:13:16.085 22:42:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:13:16.085 22:42:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:13:16.085 22:42:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:13:16.085 22:42:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:13:16.085 22:42:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.085 22:42:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:13:16.343 22:42:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 
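Editorial note: the `NOT` wrapper exercised above (`autotest_common.sh@648`-`@675`) runs a command that is *expected* to fail — here `bdev_raid_create` correctly returns `-17` / "File exists" because stale superblocks remain on malloc1/malloc2 — and the test passes only when the command's exit status is nonzero. A minimal standalone sketch of that idiom (hypothetical simplified helper; the real `NOT()` also inspects `es > 128` to distinguish signal deaths, as visible in the trace):

```shell
# Simplified sketch of the NOT() expected-failure pattern:
# succeed only when the wrapped command fails.
NOT() {
    if "$@"; then
        return 1   # command unexpectedly succeeded
    fi
    return 0       # command failed, as expected
}

NOT false && echo "expected failure observed"
```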
00:13:16.343 22:42:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:13:16.343 22:42:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:16.602 [2024-07-15 22:42:01.279632] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:16.602 [2024-07-15 22:42:01.279670] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:16.602 [2024-07-15 22:42:01.279690] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x151a7a0 00:13:16.602 [2024-07-15 22:42:01.279702] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:16.602 [2024-07-15 22:42:01.281311] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:16.602 [2024-07-15 22:42:01.281338] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:16.602 [2024-07-15 22:42:01.281401] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:16.602 [2024-07-15 22:42:01.281427] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:16.602 pt1 00:13:16.602 22:42:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:13:16.602 22:42:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:16.602 22:42:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:16.602 22:42:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:16.602 22:42:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:16.602 22:42:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:13:16.602 22:42:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:16.602 22:42:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:16.602 22:42:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:16.602 22:42:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:16.602 22:42:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.602 22:42:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:16.921 22:42:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:16.921 "name": "raid_bdev1", 00:13:16.921 "uuid": "f4eba17e-6341-4c86-aa69-996befa6749d", 00:13:16.921 "strip_size_kb": 0, 00:13:16.921 "state": "configuring", 00:13:16.921 "raid_level": "raid1", 00:13:16.921 "superblock": true, 00:13:16.921 "num_base_bdevs": 2, 00:13:16.921 "num_base_bdevs_discovered": 1, 00:13:16.921 "num_base_bdevs_operational": 2, 00:13:16.921 "base_bdevs_list": [ 00:13:16.921 { 00:13:16.921 "name": "pt1", 00:13:16.921 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:16.921 "is_configured": true, 00:13:16.921 "data_offset": 2048, 00:13:16.921 "data_size": 63488 00:13:16.921 }, 00:13:16.921 { 00:13:16.921 "name": null, 00:13:16.921 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:16.921 "is_configured": false, 00:13:16.921 "data_offset": 2048, 00:13:16.921 "data_size": 63488 00:13:16.921 } 00:13:16.921 ] 00:13:16.921 }' 00:13:16.921 22:42:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:16.921 22:42:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:17.509 22:42:02 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:13:17.509 22:42:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:13:17.509 22:42:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:17.509 22:42:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:17.509 [2024-07-15 22:42:02.390583] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:17.509 [2024-07-15 22:42:02.390633] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:17.509 [2024-07-15 22:42:02.390652] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16b26f0 00:13:17.509 [2024-07-15 22:42:02.390665] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:17.509 [2024-07-15 22:42:02.391039] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:17.509 [2024-07-15 22:42:02.391058] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:17.509 [2024-07-15 22:42:02.391125] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:17.509 [2024-07-15 22:42:02.391145] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:17.509 [2024-07-15 22:42:02.391246] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16b3590 00:13:17.509 [2024-07-15 22:42:02.391256] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:17.509 [2024-07-15 22:42:02.391421] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1514540 00:13:17.509 [2024-07-15 22:42:02.391546] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16b3590 00:13:17.509 [2024-07-15 22:42:02.391556] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16b3590 00:13:17.509 [2024-07-15 22:42:02.391651] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:17.509 pt2 00:13:17.509 22:42:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:17.509 22:42:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:17.509 22:42:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:17.509 22:42:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:17.509 22:42:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:17.509 22:42:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:17.509 22:42:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:17.509 22:42:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:17.509 22:42:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:17.509 22:42:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:17.509 22:42:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:17.509 22:42:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:17.509 22:42:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:17.509 22:42:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:17.767 22:42:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:17.767 "name": 
"raid_bdev1", 00:13:17.767 "uuid": "f4eba17e-6341-4c86-aa69-996befa6749d", 00:13:17.767 "strip_size_kb": 0, 00:13:17.767 "state": "online", 00:13:17.767 "raid_level": "raid1", 00:13:17.767 "superblock": true, 00:13:17.767 "num_base_bdevs": 2, 00:13:17.767 "num_base_bdevs_discovered": 2, 00:13:17.767 "num_base_bdevs_operational": 2, 00:13:17.767 "base_bdevs_list": [ 00:13:17.767 { 00:13:17.767 "name": "pt1", 00:13:17.767 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:17.767 "is_configured": true, 00:13:17.767 "data_offset": 2048, 00:13:17.767 "data_size": 63488 00:13:17.767 }, 00:13:17.767 { 00:13:17.767 "name": "pt2", 00:13:17.767 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:17.767 "is_configured": true, 00:13:17.767 "data_offset": 2048, 00:13:17.767 "data_size": 63488 00:13:17.767 } 00:13:17.767 ] 00:13:17.767 }' 00:13:17.767 22:42:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:17.767 22:42:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:18.700 22:42:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:13:18.700 22:42:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:18.700 22:42:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:18.700 22:42:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:18.700 22:42:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:18.700 22:42:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:18.700 22:42:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:18.700 22:42:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:18.700 [2024-07-15 
22:42:03.461678] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:18.700 22:42:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:18.700 "name": "raid_bdev1", 00:13:18.700 "aliases": [ 00:13:18.700 "f4eba17e-6341-4c86-aa69-996befa6749d" 00:13:18.700 ], 00:13:18.700 "product_name": "Raid Volume", 00:13:18.700 "block_size": 512, 00:13:18.700 "num_blocks": 63488, 00:13:18.700 "uuid": "f4eba17e-6341-4c86-aa69-996befa6749d", 00:13:18.700 "assigned_rate_limits": { 00:13:18.700 "rw_ios_per_sec": 0, 00:13:18.700 "rw_mbytes_per_sec": 0, 00:13:18.700 "r_mbytes_per_sec": 0, 00:13:18.700 "w_mbytes_per_sec": 0 00:13:18.700 }, 00:13:18.700 "claimed": false, 00:13:18.700 "zoned": false, 00:13:18.700 "supported_io_types": { 00:13:18.700 "read": true, 00:13:18.700 "write": true, 00:13:18.700 "unmap": false, 00:13:18.700 "flush": false, 00:13:18.700 "reset": true, 00:13:18.700 "nvme_admin": false, 00:13:18.700 "nvme_io": false, 00:13:18.700 "nvme_io_md": false, 00:13:18.700 "write_zeroes": true, 00:13:18.701 "zcopy": false, 00:13:18.701 "get_zone_info": false, 00:13:18.701 "zone_management": false, 00:13:18.701 "zone_append": false, 00:13:18.701 "compare": false, 00:13:18.701 "compare_and_write": false, 00:13:18.701 "abort": false, 00:13:18.701 "seek_hole": false, 00:13:18.701 "seek_data": false, 00:13:18.701 "copy": false, 00:13:18.701 "nvme_iov_md": false 00:13:18.701 }, 00:13:18.701 "memory_domains": [ 00:13:18.701 { 00:13:18.701 "dma_device_id": "system", 00:13:18.701 "dma_device_type": 1 00:13:18.701 }, 00:13:18.701 { 00:13:18.701 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:18.701 "dma_device_type": 2 00:13:18.701 }, 00:13:18.701 { 00:13:18.701 "dma_device_id": "system", 00:13:18.701 "dma_device_type": 1 00:13:18.701 }, 00:13:18.701 { 00:13:18.701 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:18.701 "dma_device_type": 2 00:13:18.701 } 00:13:18.701 ], 00:13:18.701 "driver_specific": { 00:13:18.701 
"raid": { 00:13:18.701 "uuid": "f4eba17e-6341-4c86-aa69-996befa6749d", 00:13:18.701 "strip_size_kb": 0, 00:13:18.701 "state": "online", 00:13:18.701 "raid_level": "raid1", 00:13:18.701 "superblock": true, 00:13:18.701 "num_base_bdevs": 2, 00:13:18.701 "num_base_bdevs_discovered": 2, 00:13:18.701 "num_base_bdevs_operational": 2, 00:13:18.701 "base_bdevs_list": [ 00:13:18.701 { 00:13:18.701 "name": "pt1", 00:13:18.701 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:18.701 "is_configured": true, 00:13:18.701 "data_offset": 2048, 00:13:18.701 "data_size": 63488 00:13:18.701 }, 00:13:18.701 { 00:13:18.701 "name": "pt2", 00:13:18.701 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:18.701 "is_configured": true, 00:13:18.701 "data_offset": 2048, 00:13:18.701 "data_size": 63488 00:13:18.701 } 00:13:18.701 ] 00:13:18.701 } 00:13:18.701 } 00:13:18.701 }' 00:13:18.701 22:42:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:18.701 22:42:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:18.701 pt2' 00:13:18.701 22:42:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:18.701 22:42:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:18.701 22:42:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:18.959 22:42:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:18.959 "name": "pt1", 00:13:18.959 "aliases": [ 00:13:18.959 "00000000-0000-0000-0000-000000000001" 00:13:18.959 ], 00:13:18.959 "product_name": "passthru", 00:13:18.959 "block_size": 512, 00:13:18.959 "num_blocks": 65536, 00:13:18.959 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:18.959 "assigned_rate_limits": { 
00:13:18.959 "rw_ios_per_sec": 0, 00:13:18.959 "rw_mbytes_per_sec": 0, 00:13:18.959 "r_mbytes_per_sec": 0, 00:13:18.959 "w_mbytes_per_sec": 0 00:13:18.959 }, 00:13:18.959 "claimed": true, 00:13:18.959 "claim_type": "exclusive_write", 00:13:18.959 "zoned": false, 00:13:18.959 "supported_io_types": { 00:13:18.959 "read": true, 00:13:18.959 "write": true, 00:13:18.959 "unmap": true, 00:13:18.959 "flush": true, 00:13:18.959 "reset": true, 00:13:18.959 "nvme_admin": false, 00:13:18.959 "nvme_io": false, 00:13:18.959 "nvme_io_md": false, 00:13:18.959 "write_zeroes": true, 00:13:18.959 "zcopy": true, 00:13:18.959 "get_zone_info": false, 00:13:18.959 "zone_management": false, 00:13:18.959 "zone_append": false, 00:13:18.959 "compare": false, 00:13:18.959 "compare_and_write": false, 00:13:18.959 "abort": true, 00:13:18.959 "seek_hole": false, 00:13:18.959 "seek_data": false, 00:13:18.959 "copy": true, 00:13:18.959 "nvme_iov_md": false 00:13:18.959 }, 00:13:18.959 "memory_domains": [ 00:13:18.959 { 00:13:18.959 "dma_device_id": "system", 00:13:18.959 "dma_device_type": 1 00:13:18.959 }, 00:13:18.959 { 00:13:18.959 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:18.959 "dma_device_type": 2 00:13:18.959 } 00:13:18.959 ], 00:13:18.959 "driver_specific": { 00:13:18.959 "passthru": { 00:13:18.959 "name": "pt1", 00:13:18.959 "base_bdev_name": "malloc1" 00:13:18.959 } 00:13:18.959 } 00:13:18.959 }' 00:13:18.959 22:42:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:18.959 22:42:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:19.216 22:42:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:19.216 22:42:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:19.216 22:42:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:19.216 22:42:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:13:19.216 22:42:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:19.216 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:19.216 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:19.216 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:19.216 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:19.474 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:19.474 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:19.474 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:19.474 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:19.474 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:19.474 "name": "pt2", 00:13:19.474 "aliases": [ 00:13:19.474 "00000000-0000-0000-0000-000000000002" 00:13:19.474 ], 00:13:19.474 "product_name": "passthru", 00:13:19.474 "block_size": 512, 00:13:19.474 "num_blocks": 65536, 00:13:19.474 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:19.474 "assigned_rate_limits": { 00:13:19.474 "rw_ios_per_sec": 0, 00:13:19.474 "rw_mbytes_per_sec": 0, 00:13:19.474 "r_mbytes_per_sec": 0, 00:13:19.474 "w_mbytes_per_sec": 0 00:13:19.474 }, 00:13:19.474 "claimed": true, 00:13:19.474 "claim_type": "exclusive_write", 00:13:19.474 "zoned": false, 00:13:19.474 "supported_io_types": { 00:13:19.474 "read": true, 00:13:19.474 "write": true, 00:13:19.474 "unmap": true, 00:13:19.474 "flush": true, 00:13:19.474 "reset": true, 00:13:19.474 "nvme_admin": false, 00:13:19.474 "nvme_io": false, 00:13:19.474 "nvme_io_md": false, 00:13:19.474 "write_zeroes": true, 
00:13:19.474 "zcopy": true, 00:13:19.474 "get_zone_info": false, 00:13:19.474 "zone_management": false, 00:13:19.474 "zone_append": false, 00:13:19.474 "compare": false, 00:13:19.474 "compare_and_write": false, 00:13:19.474 "abort": true, 00:13:19.474 "seek_hole": false, 00:13:19.474 "seek_data": false, 00:13:19.474 "copy": true, 00:13:19.474 "nvme_iov_md": false 00:13:19.474 }, 00:13:19.474 "memory_domains": [ 00:13:19.474 { 00:13:19.474 "dma_device_id": "system", 00:13:19.474 "dma_device_type": 1 00:13:19.474 }, 00:13:19.474 { 00:13:19.474 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:19.474 "dma_device_type": 2 00:13:19.474 } 00:13:19.474 ], 00:13:19.474 "driver_specific": { 00:13:19.474 "passthru": { 00:13:19.474 "name": "pt2", 00:13:19.474 "base_bdev_name": "malloc2" 00:13:19.474 } 00:13:19.474 } 00:13:19.474 }' 00:13:19.474 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:19.474 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:19.732 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:19.732 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:19.732 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:19.732 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:19.732 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:19.732 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:19.732 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:19.732 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:19.732 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:20.006 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # 
[[ null == null ]] 00:13:20.006 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:13:20.006 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:20.006 [2024-07-15 22:42:04.825379] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:20.006 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' f4eba17e-6341-4c86-aa69-996befa6749d '!=' f4eba17e-6341-4c86-aa69-996befa6749d ']' 00:13:20.006 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:13:20.006 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:20.006 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:20.006 22:42:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:20.264 [2024-07-15 22:42:05.077818] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:13:20.264 22:42:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:20.264 22:42:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:20.264 22:42:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:20.264 22:42:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:20.264 22:42:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:20.264 22:42:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:20.264 22:42:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:13:20.264 22:42:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:20.264 22:42:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:20.264 22:42:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:20.264 22:42:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:20.264 22:42:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:20.831 22:42:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:20.831 "name": "raid_bdev1", 00:13:20.831 "uuid": "f4eba17e-6341-4c86-aa69-996befa6749d", 00:13:20.831 "strip_size_kb": 0, 00:13:20.831 "state": "online", 00:13:20.831 "raid_level": "raid1", 00:13:20.831 "superblock": true, 00:13:20.831 "num_base_bdevs": 2, 00:13:20.831 "num_base_bdevs_discovered": 1, 00:13:20.831 "num_base_bdevs_operational": 1, 00:13:20.831 "base_bdevs_list": [ 00:13:20.831 { 00:13:20.831 "name": null, 00:13:20.831 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:20.831 "is_configured": false, 00:13:20.831 "data_offset": 2048, 00:13:20.831 "data_size": 63488 00:13:20.831 }, 00:13:20.831 { 00:13:20.831 "name": "pt2", 00:13:20.831 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:20.831 "is_configured": true, 00:13:20.831 "data_offset": 2048, 00:13:20.831 "data_size": 63488 00:13:20.831 } 00:13:20.831 ] 00:13:20.831 }' 00:13:20.831 22:42:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:20.831 22:42:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:21.398 22:42:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 
00:13:21.656 [2024-07-15 22:42:06.473498] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:21.656 [2024-07-15 22:42:06.473524] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:21.656 [2024-07-15 22:42:06.473576] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:21.656 [2024-07-15 22:42:06.473616] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:21.656 [2024-07-15 22:42:06.473628] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16b3590 name raid_bdev1, state offline 00:13:21.656 22:42:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:13:21.656 22:42:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:21.915 22:42:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:13:21.915 22:42:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:13:21.915 22:42:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:13:21.915 22:42:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:13:21.915 22:42:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:22.174 22:42:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:13:22.174 22:42:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:13:22.174 22:42:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:13:22.174 22:42:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:13:22.174 22:42:06 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:13:22.174 22:42:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:22.432 [2024-07-15 22:42:07.219436] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:22.432 [2024-07-15 22:42:07.219474] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:22.432 [2024-07-15 22:42:07.219490] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x151b160 00:13:22.432 [2024-07-15 22:42:07.219502] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:22.432 [2024-07-15 22:42:07.221096] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:22.432 [2024-07-15 22:42:07.221123] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:22.432 [2024-07-15 22:42:07.221187] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:22.432 [2024-07-15 22:42:07.221219] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:22.432 [2024-07-15 22:42:07.221298] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1511380 00:13:22.432 [2024-07-15 22:42:07.221309] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:22.432 [2024-07-15 22:42:07.221477] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1512a80 00:13:22.432 [2024-07-15 22:42:07.221596] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1511380 00:13:22.432 [2024-07-15 22:42:07.221606] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1511380 00:13:22.432 [2024-07-15 22:42:07.221696] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:22.432 pt2 00:13:22.432 22:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:22.432 22:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:22.432 22:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:22.432 22:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:22.432 22:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:22.432 22:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:22.432 22:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:22.432 22:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:22.433 22:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:22.433 22:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:22.433 22:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.433 22:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:22.691 22:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:22.691 "name": "raid_bdev1", 00:13:22.691 "uuid": "f4eba17e-6341-4c86-aa69-996befa6749d", 00:13:22.691 "strip_size_kb": 0, 00:13:22.691 "state": "online", 00:13:22.691 "raid_level": "raid1", 00:13:22.691 "superblock": true, 00:13:22.691 "num_base_bdevs": 2, 00:13:22.691 "num_base_bdevs_discovered": 1, 00:13:22.691 "num_base_bdevs_operational": 1, 00:13:22.691 "base_bdevs_list": [ 
00:13:22.691 { 00:13:22.691 "name": null, 00:13:22.691 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:22.691 "is_configured": false, 00:13:22.691 "data_offset": 2048, 00:13:22.691 "data_size": 63488 00:13:22.691 }, 00:13:22.691 { 00:13:22.691 "name": "pt2", 00:13:22.691 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:22.691 "is_configured": true, 00:13:22.691 "data_offset": 2048, 00:13:22.691 "data_size": 63488 00:13:22.691 } 00:13:22.691 ] 00:13:22.691 }' 00:13:22.691 22:42:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:22.691 22:42:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:23.259 22:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:23.517 [2024-07-15 22:42:08.318410] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:23.517 [2024-07-15 22:42:08.318432] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:23.517 [2024-07-15 22:42:08.318483] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:23.517 [2024-07-15 22:42:08.318526] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:23.517 [2024-07-15 22:42:08.318537] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1511380 name raid_bdev1, state offline 00:13:23.517 22:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:23.517 22:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:13:23.775 22:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:13:23.775 22:42:08 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:13:23.775 22:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:13:23.775 22:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:23.775 [2024-07-15 22:42:08.683362] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:23.775 [2024-07-15 22:42:08.683406] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:23.775 [2024-07-15 22:42:08.683423] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16bd520 00:13:23.775 [2024-07-15 22:42:08.683436] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:24.034 [2024-07-15 22:42:08.685020] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:24.034 [2024-07-15 22:42:08.685046] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:24.034 [2024-07-15 22:42:08.685109] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:24.034 [2024-07-15 22:42:08.685135] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:24.034 [2024-07-15 22:42:08.685226] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:13:24.034 [2024-07-15 22:42:08.685239] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:24.034 [2024-07-15 22:42:08.685252] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15123f0 name raid_bdev1, state configuring 00:13:24.034 [2024-07-15 22:42:08.685273] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:24.034 [2024-07-15 22:42:08.685328] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15142b0 00:13:24.034 [2024-07-15 22:42:08.685339] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:24.035 [2024-07-15 22:42:08.685494] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1511350 00:13:24.035 [2024-07-15 22:42:08.685613] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15142b0 00:13:24.035 [2024-07-15 22:42:08.685623] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x15142b0 00:13:24.035 [2024-07-15 22:42:08.685718] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:24.035 pt1 00:13:24.035 22:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:13:24.035 22:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:24.035 22:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:24.035 22:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:24.035 22:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:24.035 22:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:24.035 22:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:24.035 22:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:24.035 22:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:24.035 22:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:24.035 22:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:24.035 22:42:08 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.035 22:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:24.035 22:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:24.035 "name": "raid_bdev1", 00:13:24.035 "uuid": "f4eba17e-6341-4c86-aa69-996befa6749d", 00:13:24.035 "strip_size_kb": 0, 00:13:24.035 "state": "online", 00:13:24.035 "raid_level": "raid1", 00:13:24.035 "superblock": true, 00:13:24.035 "num_base_bdevs": 2, 00:13:24.035 "num_base_bdevs_discovered": 1, 00:13:24.035 "num_base_bdevs_operational": 1, 00:13:24.035 "base_bdevs_list": [ 00:13:24.035 { 00:13:24.035 "name": null, 00:13:24.035 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:24.035 "is_configured": false, 00:13:24.035 "data_offset": 2048, 00:13:24.035 "data_size": 63488 00:13:24.035 }, 00:13:24.035 { 00:13:24.035 "name": "pt2", 00:13:24.035 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:24.035 "is_configured": true, 00:13:24.035 "data_offset": 2048, 00:13:24.035 "data_size": 63488 00:13:24.035 } 00:13:24.035 ] 00:13:24.035 }' 00:13:24.035 22:42:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:24.035 22:42:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:24.971 22:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:13:24.971 22:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:13:24.971 22:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:13:24.971 22:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:24.971 22:42:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:13:25.230 [2024-07-15 22:42:10.087323] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:25.230 22:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' f4eba17e-6341-4c86-aa69-996befa6749d '!=' f4eba17e-6341-4c86-aa69-996befa6749d ']' 00:13:25.230 22:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2714377 00:13:25.230 22:42:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2714377 ']' 00:13:25.230 22:42:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2714377 00:13:25.230 22:42:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:13:25.230 22:42:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:25.230 22:42:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2714377 00:13:25.490 22:42:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:25.490 22:42:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:25.490 22:42:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2714377' 00:13:25.490 killing process with pid 2714377 00:13:25.490 22:42:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2714377 00:13:25.490 [2024-07-15 22:42:10.178051] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:25.490 [2024-07-15 22:42:10.178104] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:25.490 [2024-07-15 22:42:10.178146] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: 
raid bdev base bdevs is 0, going to free all in destruct 00:13:25.490 [2024-07-15 22:42:10.178158] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15142b0 name raid_bdev1, state offline 00:13:25.490 22:42:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2714377 00:13:25.490 [2024-07-15 22:42:10.196455] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:25.749 22:42:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:13:25.749 00:13:25.749 real 0m16.097s 00:13:25.749 user 0m29.217s 00:13:25.749 sys 0m2.937s 00:13:25.749 22:42:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:25.749 22:42:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:25.749 ************************************ 00:13:25.749 END TEST raid_superblock_test 00:13:25.749 ************************************ 00:13:25.749 22:42:10 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:25.749 22:42:10 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:13:25.749 22:42:10 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:25.749 22:42:10 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:25.749 22:42:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:25.749 ************************************ 00:13:25.749 START TEST raid_read_error_test 00:13:25.749 ************************************ 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 read 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:13:25.749 
22:42:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.I73HXbB9ji 00:13:25.749 22:42:10 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2717313 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2717313 /var/tmp/spdk-raid.sock 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2717313 ']' 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:25.749 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:25.749 22:42:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:25.749 [2024-07-15 22:42:10.569958] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:13:25.749 [2024-07-15 22:42:10.570021] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2717313 ] 00:13:26.008 [2024-07-15 22:42:10.697988] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:26.008 [2024-07-15 22:42:10.802198] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:26.008 [2024-07-15 22:42:10.859932] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:26.008 [2024-07-15 22:42:10.859969] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:26.267 22:42:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:26.267 22:42:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:26.267 22:42:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:26.267 22:42:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:26.526 BaseBdev1_malloc 00:13:26.526 22:42:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:27.095 true 00:13:27.095 22:42:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:27.354 [2024-07-15 22:42:12.119588] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:27.354 [2024-07-15 22:42:12.119634] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:13:27.354 [2024-07-15 22:42:12.119655] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x98e0d0 00:13:27.354 [2024-07-15 22:42:12.119668] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:27.354 [2024-07-15 22:42:12.121556] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:27.354 [2024-07-15 22:42:12.121585] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:27.354 BaseBdev1 00:13:27.354 22:42:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:27.354 22:42:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:27.922 BaseBdev2_malloc 00:13:27.922 22:42:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:28.181 true 00:13:28.181 22:42:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:28.440 [2024-07-15 22:42:13.231194] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:28.440 [2024-07-15 22:42:13.231236] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:28.440 [2024-07-15 22:42:13.231257] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x992910 00:13:28.440 [2024-07-15 22:42:13.231270] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:28.440 [2024-07-15 22:42:13.232839] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:28.440 [2024-07-15 22:42:13.232865] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:28.440 BaseBdev2 00:13:28.440 22:42:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:28.698 [2024-07-15 22:42:13.528003] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:28.698 [2024-07-15 22:42:13.529374] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:28.698 [2024-07-15 22:42:13.529569] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x994320 00:13:28.698 [2024-07-15 22:42:13.529582] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:28.698 [2024-07-15 22:42:13.529773] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7fbd00 00:13:28.698 [2024-07-15 22:42:13.529924] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x994320 00:13:28.698 [2024-07-15 22:42:13.529950] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x994320 00:13:28.698 [2024-07-15 22:42:13.530065] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:28.698 22:42:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:28.698 22:42:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:28.698 22:42:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:28.698 22:42:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:28.698 22:42:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:28.698 22:42:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 
-- # local num_base_bdevs_operational=2 00:13:28.698 22:42:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:28.698 22:42:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:28.698 22:42:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:28.698 22:42:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:28.698 22:42:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.698 22:42:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:29.264 22:42:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:29.264 "name": "raid_bdev1", 00:13:29.264 "uuid": "41932895-ec6d-42f1-b1f1-f8b67282886c", 00:13:29.264 "strip_size_kb": 0, 00:13:29.264 "state": "online", 00:13:29.264 "raid_level": "raid1", 00:13:29.264 "superblock": true, 00:13:29.264 "num_base_bdevs": 2, 00:13:29.264 "num_base_bdevs_discovered": 2, 00:13:29.264 "num_base_bdevs_operational": 2, 00:13:29.264 "base_bdevs_list": [ 00:13:29.264 { 00:13:29.264 "name": "BaseBdev1", 00:13:29.264 "uuid": "2917d61d-868d-5a68-857f-aae03df95c91", 00:13:29.264 "is_configured": true, 00:13:29.264 "data_offset": 2048, 00:13:29.264 "data_size": 63488 00:13:29.264 }, 00:13:29.264 { 00:13:29.264 "name": "BaseBdev2", 00:13:29.264 "uuid": "6206a947-032d-57ec-b30e-fd842c919d63", 00:13:29.264 "is_configured": true, 00:13:29.264 "data_offset": 2048, 00:13:29.264 "data_size": 63488 00:13:29.264 } 00:13:29.264 ] 00:13:29.264 }' 00:13:29.264 22:42:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:29.264 22:42:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:30.201 22:42:14 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:30.201 22:42:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:30.201 [2024-07-15 22:42:15.044288] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x98fc70 00:13:31.162 22:42:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:13:31.464 22:42:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:31.464 22:42:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:13:31.464 22:42:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:13:31.464 22:42:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:13:31.464 22:42:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:31.464 22:42:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:31.464 22:42:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:31.464 22:42:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:31.464 22:42:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:31.464 22:42:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:31.464 22:42:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:31.464 22:42:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:31.464 22:42:16 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:31.464 22:42:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:31.464 22:42:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:31.464 22:42:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:31.724 22:42:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:31.724 "name": "raid_bdev1", 00:13:31.724 "uuid": "41932895-ec6d-42f1-b1f1-f8b67282886c", 00:13:31.724 "strip_size_kb": 0, 00:13:31.724 "state": "online", 00:13:31.724 "raid_level": "raid1", 00:13:31.724 "superblock": true, 00:13:31.724 "num_base_bdevs": 2, 00:13:31.724 "num_base_bdevs_discovered": 2, 00:13:31.724 "num_base_bdevs_operational": 2, 00:13:31.724 "base_bdevs_list": [ 00:13:31.724 { 00:13:31.724 "name": "BaseBdev1", 00:13:31.724 "uuid": "2917d61d-868d-5a68-857f-aae03df95c91", 00:13:31.724 "is_configured": true, 00:13:31.724 "data_offset": 2048, 00:13:31.724 "data_size": 63488 00:13:31.724 }, 00:13:31.724 { 00:13:31.724 "name": "BaseBdev2", 00:13:31.724 "uuid": "6206a947-032d-57ec-b30e-fd842c919d63", 00:13:31.724 "is_configured": true, 00:13:31.724 "data_offset": 2048, 00:13:31.724 "data_size": 63488 00:13:31.724 } 00:13:31.724 ] 00:13:31.724 }' 00:13:31.724 22:42:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:31.724 22:42:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:32.663 22:42:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:32.663 [2024-07-15 22:42:17.486769] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 
00:13:32.663 [2024-07-15 22:42:17.486810] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:32.663 [2024-07-15 22:42:17.489995] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:32.663 [2024-07-15 22:42:17.490026] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:32.663 [2024-07-15 22:42:17.490110] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:32.663 [2024-07-15 22:42:17.490123] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x994320 name raid_bdev1, state offline 00:13:32.663 0 00:13:32.663 22:42:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2717313 00:13:32.663 22:42:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2717313 ']' 00:13:32.663 22:42:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2717313 00:13:32.663 22:42:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:13:32.663 22:42:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:32.663 22:42:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2717313 00:13:32.663 22:42:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:32.663 22:42:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:32.663 22:42:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2717313' 00:13:32.663 killing process with pid 2717313 00:13:32.663 22:42:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2717313 00:13:32.663 [2024-07-15 22:42:17.571627] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:32.663 22:42:17 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2717313 00:13:32.922 [2024-07-15 22:42:17.582740] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:32.923 22:42:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.I73HXbB9ji 00:13:32.923 22:42:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:32.923 22:42:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:32.923 22:42:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:13:32.923 22:42:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:13:32.923 22:42:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:32.923 22:42:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:32.923 22:42:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:13:32.923 00:13:32.923 real 0m7.329s 00:13:32.923 user 0m12.280s 00:13:32.923 sys 0m1.218s 00:13:32.923 22:42:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:32.923 22:42:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:32.923 ************************************ 00:13:32.923 END TEST raid_read_error_test 00:13:32.923 ************************************ 00:13:33.182 22:42:17 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:33.182 22:42:17 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:13:33.182 22:42:17 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:33.182 22:42:17 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:33.182 22:42:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:33.182 ************************************ 00:13:33.182 START TEST raid_write_error_test 00:13:33.182 
************************************ 00:13:33.182 22:42:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 write 00:13:33.182 22:42:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:13:33.182 22:42:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:13:33.182 22:42:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:13:33.182 22:42:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:33.182 22:42:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:33.182 22:42:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:33.182 22:42:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:33.183 22:42:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:33.183 22:42:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:33.183 22:42:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:33.183 22:42:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:33.183 22:42:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:33.183 22:42:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:33.183 22:42:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:33.183 22:42:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:33.183 22:42:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:33.183 22:42:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:33.183 22:42:17 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:33.183 22:42:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:13:33.183 22:42:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:13:33.183 22:42:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:33.183 22:42:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.PBjaagaS07 00:13:33.183 22:42:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2718295 00:13:33.183 22:42:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2718295 /var/tmp/spdk-raid.sock 00:13:33.183 22:42:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:33.183 22:42:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2718295 ']' 00:13:33.183 22:42:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:33.183 22:42:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:33.183 22:42:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:33.183 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:33.183 22:42:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:33.183 22:42:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:33.183 [2024-07-15 22:42:17.978658] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:13:33.183 [2024-07-15 22:42:17.978724] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2718295 ] 00:13:33.442 [2024-07-15 22:42:18.106907] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:33.442 [2024-07-15 22:42:18.216357] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:33.442 [2024-07-15 22:42:18.288040] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:33.442 [2024-07-15 22:42:18.288099] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:34.010 22:42:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:34.010 22:42:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:34.010 22:42:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:34.010 22:42:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:34.269 BaseBdev1_malloc 00:13:34.269 22:42:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:34.528 true 00:13:34.528 22:42:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:34.788 [2024-07-15 22:42:19.619831] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:34.788 [2024-07-15 22:42:19.619878] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:13:34.788 [2024-07-15 22:42:19.619899] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcd70d0 00:13:34.788 [2024-07-15 22:42:19.619912] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:34.788 [2024-07-15 22:42:19.621851] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:34.788 [2024-07-15 22:42:19.621880] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:34.788 BaseBdev1 00:13:34.788 22:42:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:34.788 22:42:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:35.047 BaseBdev2_malloc 00:13:35.047 22:42:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:35.307 true 00:13:35.307 22:42:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:35.566 [2024-07-15 22:42:20.351400] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:35.567 [2024-07-15 22:42:20.351445] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:35.567 [2024-07-15 22:42:20.351463] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcdb910 00:13:35.567 [2024-07-15 22:42:20.351476] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:35.567 [2024-07-15 22:42:20.353070] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:35.567 [2024-07-15 22:42:20.353104] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:35.567 BaseBdev2 00:13:35.567 22:42:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:35.826 [2024-07-15 22:42:20.592058] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:35.826 [2024-07-15 22:42:20.593393] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:35.826 [2024-07-15 22:42:20.593587] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcdd320 00:13:35.826 [2024-07-15 22:42:20.593600] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:35.826 [2024-07-15 22:42:20.593792] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb44d00 00:13:35.826 [2024-07-15 22:42:20.593950] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcdd320 00:13:35.826 [2024-07-15 22:42:20.593961] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xcdd320 00:13:35.826 [2024-07-15 22:42:20.594073] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:35.826 22:42:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:35.826 22:42:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:35.826 22:42:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:35.826 22:42:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:35.826 22:42:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:35.826 22:42:20 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:35.826 22:42:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:35.826 22:42:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:35.826 22:42:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:35.826 22:42:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:35.826 22:42:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:35.826 22:42:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:36.086 22:42:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:36.086 "name": "raid_bdev1", 00:13:36.086 "uuid": "c6bd548b-a040-486e-a1aa-eee0903f1523", 00:13:36.086 "strip_size_kb": 0, 00:13:36.086 "state": "online", 00:13:36.086 "raid_level": "raid1", 00:13:36.086 "superblock": true, 00:13:36.086 "num_base_bdevs": 2, 00:13:36.086 "num_base_bdevs_discovered": 2, 00:13:36.086 "num_base_bdevs_operational": 2, 00:13:36.086 "base_bdevs_list": [ 00:13:36.086 { 00:13:36.086 "name": "BaseBdev1", 00:13:36.086 "uuid": "a3fc0f7d-d877-555b-96db-0e2b02116be7", 00:13:36.086 "is_configured": true, 00:13:36.086 "data_offset": 2048, 00:13:36.086 "data_size": 63488 00:13:36.086 }, 00:13:36.086 { 00:13:36.086 "name": "BaseBdev2", 00:13:36.086 "uuid": "4d994014-6b96-54fa-909d-64b034167399", 00:13:36.086 "is_configured": true, 00:13:36.086 "data_offset": 2048, 00:13:36.086 "data_size": 63488 00:13:36.086 } 00:13:36.086 ] 00:13:36.086 }' 00:13:36.086 22:42:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:36.086 22:42:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:36.655 
22:42:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:36.655 22:42:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:36.655 [2024-07-15 22:42:21.546874] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcd8c70 00:13:37.593 22:42:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:13:37.853 [2024-07-15 22:42:22.642084] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:13:37.853 [2024-07-15 22:42:22.642152] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:37.853 [2024-07-15 22:42:22.642328] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xcd8c70 00:13:37.853 22:42:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:37.853 22:42:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:13:37.853 22:42:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:13:37.853 22:42:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:13:37.853 22:42:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:37.853 22:42:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:37.853 22:42:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:37.853 22:42:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:37.853 22:42:22 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:37.853 22:42:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:37.853 22:42:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:37.853 22:42:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:37.853 22:42:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:37.853 22:42:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:37.853 22:42:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:37.853 22:42:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:38.112 22:42:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:38.112 "name": "raid_bdev1", 00:13:38.112 "uuid": "c6bd548b-a040-486e-a1aa-eee0903f1523", 00:13:38.112 "strip_size_kb": 0, 00:13:38.112 "state": "online", 00:13:38.112 "raid_level": "raid1", 00:13:38.112 "superblock": true, 00:13:38.112 "num_base_bdevs": 2, 00:13:38.112 "num_base_bdevs_discovered": 1, 00:13:38.112 "num_base_bdevs_operational": 1, 00:13:38.112 "base_bdevs_list": [ 00:13:38.112 { 00:13:38.112 "name": null, 00:13:38.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:38.112 "is_configured": false, 00:13:38.112 "data_offset": 2048, 00:13:38.112 "data_size": 63488 00:13:38.112 }, 00:13:38.112 { 00:13:38.112 "name": "BaseBdev2", 00:13:38.112 "uuid": "4d994014-6b96-54fa-909d-64b034167399", 00:13:38.112 "is_configured": true, 00:13:38.112 "data_offset": 2048, 00:13:38.112 "data_size": 63488 00:13:38.112 } 00:13:38.112 ] 00:13:38.112 }' 00:13:38.112 22:42:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- 
# xtrace_disable 00:13:38.112 22:42:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:38.680 22:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:38.939 [2024-07-15 22:42:23.762326] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:38.939 [2024-07-15 22:42:23.762360] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:38.939 [2024-07-15 22:42:23.765485] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:38.939 [2024-07-15 22:42:23.765512] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:38.940 [2024-07-15 22:42:23.765564] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:38.940 [2024-07-15 22:42:23.765575] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcdd320 name raid_bdev1, state offline 00:13:38.940 0 00:13:38.940 22:42:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2718295 00:13:38.940 22:42:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2718295 ']' 00:13:38.940 22:42:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2718295 00:13:38.940 22:42:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:13:38.940 22:42:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:38.940 22:42:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2718295 00:13:38.940 22:42:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:38.940 22:42:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo 
']' 00:13:38.940 22:42:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2718295' 00:13:38.940 killing process with pid 2718295 00:13:38.940 22:42:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2718295 00:13:38.940 [2024-07-15 22:42:23.845934] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:38.940 22:42:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2718295 00:13:39.198 [2024-07-15 22:42:23.856580] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:39.198 22:42:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.PBjaagaS07 00:13:39.198 22:42:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:39.198 22:42:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:39.198 22:42:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:13:39.198 22:42:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:13:39.198 22:42:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:39.198 22:42:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:39.198 22:42:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:13:39.198 00:13:39.198 real 0m6.189s 00:13:39.198 user 0m9.648s 00:13:39.198 sys 0m1.081s 00:13:39.198 22:42:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:39.198 22:42:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:39.198 ************************************ 00:13:39.198 END TEST raid_write_error_test 00:13:39.198 ************************************ 00:13:39.457 22:42:24 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:39.457 22:42:24 bdev_raid -- 
bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:13:39.457 22:42:24 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:13:39.457 22:42:24 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:13:39.457 22:42:24 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:39.457 22:42:24 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:39.457 22:42:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:39.457 ************************************ 00:13:39.457 START TEST raid_state_function_test 00:13:39.457 ************************************ 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 false 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:39.457 22:42:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2719261 00:13:39.457 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2719261' 00:13:39.458 Process raid pid: 2719261 00:13:39.458 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:39.458 22:42:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2719261 /var/tmp/spdk-raid.sock 00:13:39.458 22:42:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2719261 ']' 00:13:39.458 22:42:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:39.458 22:42:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:39.458 22:42:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:39.458 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:39.458 22:42:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:39.458 22:42:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:39.458 [2024-07-15 22:42:24.248009] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:13:39.458 [2024-07-15 22:42:24.248064] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:39.458 [2024-07-15 22:42:24.352657] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:39.716 [2024-07-15 22:42:24.459757] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:39.716 [2024-07-15 22:42:24.521621] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:39.716 [2024-07-15 22:42:24.521655] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:40.652 22:42:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:40.652 22:42:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:13:40.652 22:42:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:40.652 [2024-07-15 22:42:25.444452] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:40.652 [2024-07-15 22:42:25.444496] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:40.652 [2024-07-15 22:42:25.444507] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:40.652 [2024-07-15 22:42:25.444523] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:40.652 [2024-07-15 22:42:25.444532] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:40.652 [2024-07-15 22:42:25.444543] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:40.652 22:42:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:40.652 22:42:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:40.652 22:42:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:40.652 22:42:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:40.652 22:42:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:40.652 22:42:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:40.652 22:42:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:40.652 22:42:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:40.652 22:42:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:40.652 22:42:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:40.652 22:42:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:40.652 22:42:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:40.910 22:42:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:40.910 "name": "Existed_Raid", 00:13:40.910 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:40.910 "strip_size_kb": 64, 00:13:40.910 "state": "configuring", 00:13:40.910 "raid_level": "raid0", 00:13:40.910 "superblock": false, 00:13:40.910 "num_base_bdevs": 3, 00:13:40.910 "num_base_bdevs_discovered": 0, 00:13:40.910 "num_base_bdevs_operational": 3, 00:13:40.910 "base_bdevs_list": [ 00:13:40.910 { 
00:13:40.910 "name": "BaseBdev1", 00:13:40.910 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:40.910 "is_configured": false, 00:13:40.910 "data_offset": 0, 00:13:40.910 "data_size": 0 00:13:40.910 }, 00:13:40.910 { 00:13:40.910 "name": "BaseBdev2", 00:13:40.910 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:40.910 "is_configured": false, 00:13:40.910 "data_offset": 0, 00:13:40.910 "data_size": 0 00:13:40.910 }, 00:13:40.910 { 00:13:40.910 "name": "BaseBdev3", 00:13:40.910 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:40.910 "is_configured": false, 00:13:40.910 "data_offset": 0, 00:13:40.910 "data_size": 0 00:13:40.910 } 00:13:40.910 ] 00:13:40.910 }' 00:13:40.910 22:42:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:40.910 22:42:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:41.477 22:42:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:41.735 [2024-07-15 22:42:26.483073] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:41.735 [2024-07-15 22:42:26.483104] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f0da80 name Existed_Raid, state configuring 00:13:41.735 22:42:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:41.993 [2024-07-15 22:42:26.731744] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:41.993 [2024-07-15 22:42:26.731771] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:41.993 [2024-07-15 22:42:26.731781] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:13:41.993 [2024-07-15 22:42:26.731792] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:41.993 [2024-07-15 22:42:26.731801] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:41.993 [2024-07-15 22:42:26.731820] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:41.993 22:42:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:42.252 [2024-07-15 22:42:26.986161] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:42.252 BaseBdev1 00:13:42.252 22:42:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:42.252 22:42:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:42.252 22:42:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:42.252 22:42:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:42.252 22:42:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:42.252 22:42:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:42.252 22:42:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:42.510 22:42:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:42.769 [ 00:13:42.769 { 00:13:42.769 "name": "BaseBdev1", 00:13:42.769 "aliases": [ 00:13:42.769 
"466a3148-dafd-4006-ba86-9046bdbf6a04" 00:13:42.769 ], 00:13:42.769 "product_name": "Malloc disk", 00:13:42.769 "block_size": 512, 00:13:42.769 "num_blocks": 65536, 00:13:42.769 "uuid": "466a3148-dafd-4006-ba86-9046bdbf6a04", 00:13:42.769 "assigned_rate_limits": { 00:13:42.769 "rw_ios_per_sec": 0, 00:13:42.769 "rw_mbytes_per_sec": 0, 00:13:42.769 "r_mbytes_per_sec": 0, 00:13:42.769 "w_mbytes_per_sec": 0 00:13:42.769 }, 00:13:42.769 "claimed": true, 00:13:42.769 "claim_type": "exclusive_write", 00:13:42.769 "zoned": false, 00:13:42.769 "supported_io_types": { 00:13:42.769 "read": true, 00:13:42.769 "write": true, 00:13:42.769 "unmap": true, 00:13:42.769 "flush": true, 00:13:42.769 "reset": true, 00:13:42.769 "nvme_admin": false, 00:13:42.769 "nvme_io": false, 00:13:42.769 "nvme_io_md": false, 00:13:42.769 "write_zeroes": true, 00:13:42.769 "zcopy": true, 00:13:42.769 "get_zone_info": false, 00:13:42.769 "zone_management": false, 00:13:42.769 "zone_append": false, 00:13:42.769 "compare": false, 00:13:42.769 "compare_and_write": false, 00:13:42.769 "abort": true, 00:13:42.769 "seek_hole": false, 00:13:42.769 "seek_data": false, 00:13:42.769 "copy": true, 00:13:42.769 "nvme_iov_md": false 00:13:42.769 }, 00:13:42.769 "memory_domains": [ 00:13:42.769 { 00:13:42.769 "dma_device_id": "system", 00:13:42.769 "dma_device_type": 1 00:13:42.769 }, 00:13:42.769 { 00:13:42.769 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:42.769 "dma_device_type": 2 00:13:42.769 } 00:13:42.769 ], 00:13:42.769 "driver_specific": {} 00:13:42.769 } 00:13:42.769 ] 00:13:42.769 22:42:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:42.769 22:42:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:42.769 22:42:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:42.769 22:42:27 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:42.769 22:42:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:42.769 22:42:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:42.769 22:42:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:42.769 22:42:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:42.769 22:42:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:42.769 22:42:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:42.769 22:42:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:42.769 22:42:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:42.769 22:42:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:43.028 22:42:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:43.028 "name": "Existed_Raid", 00:13:43.028 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:43.028 "strip_size_kb": 64, 00:13:43.028 "state": "configuring", 00:13:43.028 "raid_level": "raid0", 00:13:43.028 "superblock": false, 00:13:43.028 "num_base_bdevs": 3, 00:13:43.028 "num_base_bdevs_discovered": 1, 00:13:43.028 "num_base_bdevs_operational": 3, 00:13:43.028 "base_bdevs_list": [ 00:13:43.028 { 00:13:43.028 "name": "BaseBdev1", 00:13:43.028 "uuid": "466a3148-dafd-4006-ba86-9046bdbf6a04", 00:13:43.028 "is_configured": true, 00:13:43.028 "data_offset": 0, 00:13:43.028 "data_size": 65536 00:13:43.028 }, 00:13:43.028 { 00:13:43.028 "name": "BaseBdev2", 00:13:43.028 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:13:43.028 "is_configured": false, 00:13:43.028 "data_offset": 0, 00:13:43.028 "data_size": 0 00:13:43.028 }, 00:13:43.028 { 00:13:43.028 "name": "BaseBdev3", 00:13:43.028 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:43.028 "is_configured": false, 00:13:43.028 "data_offset": 0, 00:13:43.028 "data_size": 0 00:13:43.028 } 00:13:43.028 ] 00:13:43.028 }' 00:13:43.028 22:42:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:43.028 22:42:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:43.595 22:42:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:43.853 [2024-07-15 22:42:28.550336] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:43.853 [2024-07-15 22:42:28.550377] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f0d310 name Existed_Raid, state configuring 00:13:43.853 22:42:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:44.111 [2024-07-15 22:42:28.803036] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:44.111 [2024-07-15 22:42:28.804467] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:44.111 [2024-07-15 22:42:28.804499] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:44.111 [2024-07-15 22:42:28.804509] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:44.111 [2024-07-15 22:42:28.804521] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:13:44.111 22:42:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:44.111 22:42:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:44.111 22:42:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:44.111 22:42:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:44.111 22:42:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:44.111 22:42:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:44.111 22:42:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:44.111 22:42:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:44.111 22:42:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:44.111 22:42:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:44.111 22:42:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:44.111 22:42:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:44.111 22:42:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:44.111 22:42:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:44.371 22:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:44.371 "name": "Existed_Raid", 00:13:44.371 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:44.371 "strip_size_kb": 64, 00:13:44.371 "state": "configuring", 00:13:44.371 
"raid_level": "raid0", 00:13:44.371 "superblock": false, 00:13:44.371 "num_base_bdevs": 3, 00:13:44.371 "num_base_bdevs_discovered": 1, 00:13:44.371 "num_base_bdevs_operational": 3, 00:13:44.371 "base_bdevs_list": [ 00:13:44.371 { 00:13:44.371 "name": "BaseBdev1", 00:13:44.371 "uuid": "466a3148-dafd-4006-ba86-9046bdbf6a04", 00:13:44.371 "is_configured": true, 00:13:44.371 "data_offset": 0, 00:13:44.371 "data_size": 65536 00:13:44.371 }, 00:13:44.371 { 00:13:44.371 "name": "BaseBdev2", 00:13:44.371 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:44.371 "is_configured": false, 00:13:44.371 "data_offset": 0, 00:13:44.371 "data_size": 0 00:13:44.371 }, 00:13:44.371 { 00:13:44.371 "name": "BaseBdev3", 00:13:44.371 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:44.371 "is_configured": false, 00:13:44.371 "data_offset": 0, 00:13:44.371 "data_size": 0 00:13:44.371 } 00:13:44.371 ] 00:13:44.371 }' 00:13:44.371 22:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:44.371 22:42:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:44.938 22:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:45.197 [2024-07-15 22:42:29.901328] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:45.197 BaseBdev2 00:13:45.197 22:42:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:45.197 22:42:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:45.197 22:42:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:45.197 22:42:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:45.197 22:42:29 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:45.197 22:42:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:45.197 22:42:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:45.457 22:42:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:45.718 [ 00:13:45.718 { 00:13:45.718 "name": "BaseBdev2", 00:13:45.718 "aliases": [ 00:13:45.718 "dd020d2c-d59c-4f9f-84cf-33ab02302b35" 00:13:45.718 ], 00:13:45.718 "product_name": "Malloc disk", 00:13:45.718 "block_size": 512, 00:13:45.718 "num_blocks": 65536, 00:13:45.718 "uuid": "dd020d2c-d59c-4f9f-84cf-33ab02302b35", 00:13:45.718 "assigned_rate_limits": { 00:13:45.718 "rw_ios_per_sec": 0, 00:13:45.718 "rw_mbytes_per_sec": 0, 00:13:45.718 "r_mbytes_per_sec": 0, 00:13:45.718 "w_mbytes_per_sec": 0 00:13:45.718 }, 00:13:45.718 "claimed": true, 00:13:45.718 "claim_type": "exclusive_write", 00:13:45.718 "zoned": false, 00:13:45.718 "supported_io_types": { 00:13:45.718 "read": true, 00:13:45.718 "write": true, 00:13:45.718 "unmap": true, 00:13:45.718 "flush": true, 00:13:45.718 "reset": true, 00:13:45.718 "nvme_admin": false, 00:13:45.718 "nvme_io": false, 00:13:45.718 "nvme_io_md": false, 00:13:45.718 "write_zeroes": true, 00:13:45.718 "zcopy": true, 00:13:45.718 "get_zone_info": false, 00:13:45.718 "zone_management": false, 00:13:45.718 "zone_append": false, 00:13:45.718 "compare": false, 00:13:45.718 "compare_and_write": false, 00:13:45.718 "abort": true, 00:13:45.718 "seek_hole": false, 00:13:45.718 "seek_data": false, 00:13:45.718 "copy": true, 00:13:45.718 "nvme_iov_md": false 00:13:45.718 }, 00:13:45.718 "memory_domains": [ 00:13:45.718 { 00:13:45.718 "dma_device_id": "system", 
00:13:45.718 "dma_device_type": 1 00:13:45.718 }, 00:13:45.718 { 00:13:45.718 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:45.718 "dma_device_type": 2 00:13:45.718 } 00:13:45.718 ], 00:13:45.718 "driver_specific": {} 00:13:45.718 } 00:13:45.718 ] 00:13:45.718 22:42:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:45.718 22:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:45.718 22:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:45.718 22:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:45.718 22:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:45.718 22:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:45.718 22:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:45.718 22:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:45.718 22:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:45.718 22:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:45.718 22:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:45.718 22:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:45.718 22:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:45.718 22:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:45.718 22:42:30 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:46.319 22:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:46.319 "name": "Existed_Raid", 00:13:46.319 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:46.319 "strip_size_kb": 64, 00:13:46.319 "state": "configuring", 00:13:46.319 "raid_level": "raid0", 00:13:46.319 "superblock": false, 00:13:46.319 "num_base_bdevs": 3, 00:13:46.319 "num_base_bdevs_discovered": 2, 00:13:46.319 "num_base_bdevs_operational": 3, 00:13:46.319 "base_bdevs_list": [ 00:13:46.319 { 00:13:46.319 "name": "BaseBdev1", 00:13:46.319 "uuid": "466a3148-dafd-4006-ba86-9046bdbf6a04", 00:13:46.319 "is_configured": true, 00:13:46.319 "data_offset": 0, 00:13:46.319 "data_size": 65536 00:13:46.319 }, 00:13:46.319 { 00:13:46.319 "name": "BaseBdev2", 00:13:46.319 "uuid": "dd020d2c-d59c-4f9f-84cf-33ab02302b35", 00:13:46.319 "is_configured": true, 00:13:46.319 "data_offset": 0, 00:13:46.319 "data_size": 65536 00:13:46.319 }, 00:13:46.319 { 00:13:46.319 "name": "BaseBdev3", 00:13:46.319 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:46.319 "is_configured": false, 00:13:46.319 "data_offset": 0, 00:13:46.319 "data_size": 0 00:13:46.319 } 00:13:46.319 ] 00:13:46.319 }' 00:13:46.319 22:42:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:46.319 22:42:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:46.888 22:42:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:46.888 [2024-07-15 22:42:31.777796] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:46.888 [2024-07-15 22:42:31.777833] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f0e400 00:13:46.888 [2024-07-15 22:42:31.777842] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:46.888 [2024-07-15 22:42:31.778098] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f0def0 00:13:46.888 [2024-07-15 22:42:31.778216] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f0e400 00:13:46.888 [2024-07-15 22:42:31.778227] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f0e400 00:13:46.888 [2024-07-15 22:42:31.778391] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:46.888 BaseBdev3 00:13:47.147 22:42:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:47.147 22:42:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:47.147 22:42:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:47.147 22:42:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:47.147 22:42:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:47.147 22:42:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:47.147 22:42:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:47.147 22:42:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:47.406 [ 00:13:47.406 { 00:13:47.406 "name": "BaseBdev3", 00:13:47.406 "aliases": [ 00:13:47.406 "6c5b7d81-228f-4c4e-b179-094ea09d2839" 00:13:47.406 ], 00:13:47.406 "product_name": "Malloc disk", 00:13:47.406 "block_size": 512, 00:13:47.406 "num_blocks": 65536, 00:13:47.406 
"uuid": "6c5b7d81-228f-4c4e-b179-094ea09d2839", 00:13:47.406 "assigned_rate_limits": { 00:13:47.406 "rw_ios_per_sec": 0, 00:13:47.406 "rw_mbytes_per_sec": 0, 00:13:47.406 "r_mbytes_per_sec": 0, 00:13:47.406 "w_mbytes_per_sec": 0 00:13:47.406 }, 00:13:47.406 "claimed": true, 00:13:47.406 "claim_type": "exclusive_write", 00:13:47.406 "zoned": false, 00:13:47.406 "supported_io_types": { 00:13:47.406 "read": true, 00:13:47.406 "write": true, 00:13:47.406 "unmap": true, 00:13:47.406 "flush": true, 00:13:47.406 "reset": true, 00:13:47.406 "nvme_admin": false, 00:13:47.406 "nvme_io": false, 00:13:47.406 "nvme_io_md": false, 00:13:47.406 "write_zeroes": true, 00:13:47.406 "zcopy": true, 00:13:47.406 "get_zone_info": false, 00:13:47.406 "zone_management": false, 00:13:47.406 "zone_append": false, 00:13:47.406 "compare": false, 00:13:47.406 "compare_and_write": false, 00:13:47.406 "abort": true, 00:13:47.406 "seek_hole": false, 00:13:47.406 "seek_data": false, 00:13:47.406 "copy": true, 00:13:47.406 "nvme_iov_md": false 00:13:47.406 }, 00:13:47.406 "memory_domains": [ 00:13:47.406 { 00:13:47.406 "dma_device_id": "system", 00:13:47.406 "dma_device_type": 1 00:13:47.406 }, 00:13:47.406 { 00:13:47.406 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:47.406 "dma_device_type": 2 00:13:47.406 } 00:13:47.406 ], 00:13:47.406 "driver_specific": {} 00:13:47.406 } 00:13:47.406 ] 00:13:47.406 22:42:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:47.406 22:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:47.406 22:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:47.406 22:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:47.406 22:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:47.406 22:42:32 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:47.406 22:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:47.406 22:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:47.406 22:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:47.406 22:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:47.406 22:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:47.406 22:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:47.406 22:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:47.406 22:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:47.406 22:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:47.666 22:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:47.666 "name": "Existed_Raid", 00:13:47.666 "uuid": "f8d34f00-3c4f-4f9b-8580-c4cbd6284267", 00:13:47.666 "strip_size_kb": 64, 00:13:47.666 "state": "online", 00:13:47.666 "raid_level": "raid0", 00:13:47.666 "superblock": false, 00:13:47.666 "num_base_bdevs": 3, 00:13:47.666 "num_base_bdevs_discovered": 3, 00:13:47.666 "num_base_bdevs_operational": 3, 00:13:47.666 "base_bdevs_list": [ 00:13:47.666 { 00:13:47.666 "name": "BaseBdev1", 00:13:47.666 "uuid": "466a3148-dafd-4006-ba86-9046bdbf6a04", 00:13:47.666 "is_configured": true, 00:13:47.666 "data_offset": 0, 00:13:47.666 "data_size": 65536 00:13:47.666 }, 00:13:47.666 { 00:13:47.666 "name": "BaseBdev2", 00:13:47.666 "uuid": 
"dd020d2c-d59c-4f9f-84cf-33ab02302b35", 00:13:47.666 "is_configured": true, 00:13:47.666 "data_offset": 0, 00:13:47.666 "data_size": 65536 00:13:47.666 }, 00:13:47.666 { 00:13:47.666 "name": "BaseBdev3", 00:13:47.666 "uuid": "6c5b7d81-228f-4c4e-b179-094ea09d2839", 00:13:47.666 "is_configured": true, 00:13:47.666 "data_offset": 0, 00:13:47.666 "data_size": 65536 00:13:47.666 } 00:13:47.666 ] 00:13:47.666 }' 00:13:47.666 22:42:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:47.666 22:42:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:48.604 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:48.604 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:48.604 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:48.604 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:48.604 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:48.604 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:48.604 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:48.604 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:48.604 [2024-07-15 22:42:33.378350] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:48.604 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:48.604 "name": "Existed_Raid", 00:13:48.604 "aliases": [ 00:13:48.604 "f8d34f00-3c4f-4f9b-8580-c4cbd6284267" 00:13:48.604 ], 00:13:48.604 "product_name": "Raid Volume", 
00:13:48.604 "block_size": 512, 00:13:48.604 "num_blocks": 196608, 00:13:48.604 "uuid": "f8d34f00-3c4f-4f9b-8580-c4cbd6284267", 00:13:48.604 "assigned_rate_limits": { 00:13:48.604 "rw_ios_per_sec": 0, 00:13:48.604 "rw_mbytes_per_sec": 0, 00:13:48.604 "r_mbytes_per_sec": 0, 00:13:48.604 "w_mbytes_per_sec": 0 00:13:48.604 }, 00:13:48.604 "claimed": false, 00:13:48.604 "zoned": false, 00:13:48.604 "supported_io_types": { 00:13:48.604 "read": true, 00:13:48.604 "write": true, 00:13:48.604 "unmap": true, 00:13:48.604 "flush": true, 00:13:48.604 "reset": true, 00:13:48.604 "nvme_admin": false, 00:13:48.604 "nvme_io": false, 00:13:48.604 "nvme_io_md": false, 00:13:48.604 "write_zeroes": true, 00:13:48.604 "zcopy": false, 00:13:48.604 "get_zone_info": false, 00:13:48.604 "zone_management": false, 00:13:48.604 "zone_append": false, 00:13:48.604 "compare": false, 00:13:48.604 "compare_and_write": false, 00:13:48.604 "abort": false, 00:13:48.604 "seek_hole": false, 00:13:48.604 "seek_data": false, 00:13:48.604 "copy": false, 00:13:48.604 "nvme_iov_md": false 00:13:48.604 }, 00:13:48.604 "memory_domains": [ 00:13:48.604 { 00:13:48.604 "dma_device_id": "system", 00:13:48.604 "dma_device_type": 1 00:13:48.604 }, 00:13:48.604 { 00:13:48.604 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:48.604 "dma_device_type": 2 00:13:48.604 }, 00:13:48.604 { 00:13:48.604 "dma_device_id": "system", 00:13:48.604 "dma_device_type": 1 00:13:48.604 }, 00:13:48.604 { 00:13:48.604 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:48.604 "dma_device_type": 2 00:13:48.604 }, 00:13:48.604 { 00:13:48.604 "dma_device_id": "system", 00:13:48.604 "dma_device_type": 1 00:13:48.604 }, 00:13:48.604 { 00:13:48.604 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:48.604 "dma_device_type": 2 00:13:48.604 } 00:13:48.604 ], 00:13:48.604 "driver_specific": { 00:13:48.604 "raid": { 00:13:48.604 "uuid": "f8d34f00-3c4f-4f9b-8580-c4cbd6284267", 00:13:48.604 "strip_size_kb": 64, 00:13:48.604 "state": "online", 00:13:48.604 
"raid_level": "raid0", 00:13:48.604 "superblock": false, 00:13:48.604 "num_base_bdevs": 3, 00:13:48.604 "num_base_bdevs_discovered": 3, 00:13:48.604 "num_base_bdevs_operational": 3, 00:13:48.604 "base_bdevs_list": [ 00:13:48.604 { 00:13:48.604 "name": "BaseBdev1", 00:13:48.604 "uuid": "466a3148-dafd-4006-ba86-9046bdbf6a04", 00:13:48.604 "is_configured": true, 00:13:48.604 "data_offset": 0, 00:13:48.604 "data_size": 65536 00:13:48.604 }, 00:13:48.604 { 00:13:48.604 "name": "BaseBdev2", 00:13:48.604 "uuid": "dd020d2c-d59c-4f9f-84cf-33ab02302b35", 00:13:48.604 "is_configured": true, 00:13:48.604 "data_offset": 0, 00:13:48.604 "data_size": 65536 00:13:48.604 }, 00:13:48.604 { 00:13:48.604 "name": "BaseBdev3", 00:13:48.604 "uuid": "6c5b7d81-228f-4c4e-b179-094ea09d2839", 00:13:48.604 "is_configured": true, 00:13:48.604 "data_offset": 0, 00:13:48.604 "data_size": 65536 00:13:48.604 } 00:13:48.604 ] 00:13:48.604 } 00:13:48.604 } 00:13:48.604 }' 00:13:48.604 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:48.604 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:48.604 BaseBdev2 00:13:48.604 BaseBdev3' 00:13:48.604 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:48.604 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:48.604 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:48.863 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:48.863 "name": "BaseBdev1", 00:13:48.863 "aliases": [ 00:13:48.863 "466a3148-dafd-4006-ba86-9046bdbf6a04" 00:13:48.863 ], 00:13:48.863 "product_name": "Malloc disk", 00:13:48.863 
"block_size": 512, 00:13:48.863 "num_blocks": 65536, 00:13:48.863 "uuid": "466a3148-dafd-4006-ba86-9046bdbf6a04", 00:13:48.863 "assigned_rate_limits": { 00:13:48.863 "rw_ios_per_sec": 0, 00:13:48.863 "rw_mbytes_per_sec": 0, 00:13:48.863 "r_mbytes_per_sec": 0, 00:13:48.863 "w_mbytes_per_sec": 0 00:13:48.863 }, 00:13:48.863 "claimed": true, 00:13:48.863 "claim_type": "exclusive_write", 00:13:48.863 "zoned": false, 00:13:48.863 "supported_io_types": { 00:13:48.863 "read": true, 00:13:48.863 "write": true, 00:13:48.863 "unmap": true, 00:13:48.863 "flush": true, 00:13:48.863 "reset": true, 00:13:48.863 "nvme_admin": false, 00:13:48.863 "nvme_io": false, 00:13:48.863 "nvme_io_md": false, 00:13:48.863 "write_zeroes": true, 00:13:48.863 "zcopy": true, 00:13:48.863 "get_zone_info": false, 00:13:48.863 "zone_management": false, 00:13:48.863 "zone_append": false, 00:13:48.863 "compare": false, 00:13:48.863 "compare_and_write": false, 00:13:48.863 "abort": true, 00:13:48.863 "seek_hole": false, 00:13:48.863 "seek_data": false, 00:13:48.863 "copy": true, 00:13:48.863 "nvme_iov_md": false 00:13:48.863 }, 00:13:48.863 "memory_domains": [ 00:13:48.863 { 00:13:48.863 "dma_device_id": "system", 00:13:48.863 "dma_device_type": 1 00:13:48.863 }, 00:13:48.863 { 00:13:48.863 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:48.863 "dma_device_type": 2 00:13:48.863 } 00:13:48.863 ], 00:13:48.863 "driver_specific": {} 00:13:48.863 }' 00:13:48.863 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:48.863 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:48.863 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:48.863 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:49.122 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:49.122 22:42:33 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:49.122 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:49.122 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:49.122 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:49.122 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:49.122 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:49.122 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:49.122 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:49.122 22:42:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:49.122 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:49.382 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:49.382 "name": "BaseBdev2", 00:13:49.382 "aliases": [ 00:13:49.382 "dd020d2c-d59c-4f9f-84cf-33ab02302b35" 00:13:49.382 ], 00:13:49.382 "product_name": "Malloc disk", 00:13:49.382 "block_size": 512, 00:13:49.382 "num_blocks": 65536, 00:13:49.382 "uuid": "dd020d2c-d59c-4f9f-84cf-33ab02302b35", 00:13:49.382 "assigned_rate_limits": { 00:13:49.382 "rw_ios_per_sec": 0, 00:13:49.382 "rw_mbytes_per_sec": 0, 00:13:49.382 "r_mbytes_per_sec": 0, 00:13:49.382 "w_mbytes_per_sec": 0 00:13:49.382 }, 00:13:49.382 "claimed": true, 00:13:49.382 "claim_type": "exclusive_write", 00:13:49.382 "zoned": false, 00:13:49.382 "supported_io_types": { 00:13:49.382 "read": true, 00:13:49.382 "write": true, 00:13:49.382 "unmap": true, 00:13:49.382 "flush": true, 00:13:49.382 "reset": true, 00:13:49.382 "nvme_admin": 
false, 00:13:49.382 "nvme_io": false, 00:13:49.382 "nvme_io_md": false, 00:13:49.382 "write_zeroes": true, 00:13:49.382 "zcopy": true, 00:13:49.382 "get_zone_info": false, 00:13:49.382 "zone_management": false, 00:13:49.382 "zone_append": false, 00:13:49.382 "compare": false, 00:13:49.382 "compare_and_write": false, 00:13:49.382 "abort": true, 00:13:49.382 "seek_hole": false, 00:13:49.382 "seek_data": false, 00:13:49.382 "copy": true, 00:13:49.382 "nvme_iov_md": false 00:13:49.382 }, 00:13:49.382 "memory_domains": [ 00:13:49.382 { 00:13:49.382 "dma_device_id": "system", 00:13:49.382 "dma_device_type": 1 00:13:49.382 }, 00:13:49.382 { 00:13:49.382 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:49.382 "dma_device_type": 2 00:13:49.382 } 00:13:49.382 ], 00:13:49.382 "driver_specific": {} 00:13:49.382 }' 00:13:49.382 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:49.641 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:49.641 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:49.641 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:49.641 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:49.641 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:49.641 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:49.641 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:49.641 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:49.641 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:49.900 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:49.900 22:42:34 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:49.900 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:49.900 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:49.900 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:49.900 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:49.900 "name": "BaseBdev3", 00:13:49.900 "aliases": [ 00:13:49.900 "6c5b7d81-228f-4c4e-b179-094ea09d2839" 00:13:49.900 ], 00:13:49.900 "product_name": "Malloc disk", 00:13:49.900 "block_size": 512, 00:13:49.900 "num_blocks": 65536, 00:13:49.900 "uuid": "6c5b7d81-228f-4c4e-b179-094ea09d2839", 00:13:49.900 "assigned_rate_limits": { 00:13:49.900 "rw_ios_per_sec": 0, 00:13:49.900 "rw_mbytes_per_sec": 0, 00:13:49.900 "r_mbytes_per_sec": 0, 00:13:49.900 "w_mbytes_per_sec": 0 00:13:49.900 }, 00:13:49.900 "claimed": true, 00:13:49.900 "claim_type": "exclusive_write", 00:13:49.900 "zoned": false, 00:13:49.900 "supported_io_types": { 00:13:49.900 "read": true, 00:13:49.900 "write": true, 00:13:49.900 "unmap": true, 00:13:49.900 "flush": true, 00:13:49.900 "reset": true, 00:13:49.900 "nvme_admin": false, 00:13:49.900 "nvme_io": false, 00:13:49.900 "nvme_io_md": false, 00:13:49.900 "write_zeroes": true, 00:13:49.900 "zcopy": true, 00:13:49.900 "get_zone_info": false, 00:13:49.900 "zone_management": false, 00:13:49.900 "zone_append": false, 00:13:49.900 "compare": false, 00:13:49.900 "compare_and_write": false, 00:13:49.900 "abort": true, 00:13:49.900 "seek_hole": false, 00:13:49.900 "seek_data": false, 00:13:49.900 "copy": true, 00:13:49.900 "nvme_iov_md": false 00:13:49.900 }, 00:13:49.900 "memory_domains": [ 00:13:49.900 { 00:13:49.900 "dma_device_id": "system", 00:13:49.900 "dma_device_type": 1 00:13:49.900 
}, 00:13:49.900 { 00:13:49.900 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:49.900 "dma_device_type": 2 00:13:49.900 } 00:13:49.900 ], 00:13:49.900 "driver_specific": {} 00:13:49.900 }' 00:13:49.900 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:50.157 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:50.157 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:50.157 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:50.157 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:50.157 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:50.157 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:50.157 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:50.157 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:50.157 22:42:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:50.157 22:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:50.415 22:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:50.415 22:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:50.415 [2024-07-15 22:42:35.299194] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:50.415 [2024-07-15 22:42:35.299221] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:50.415 [2024-07-15 22:42:35.299263] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:50.415 
22:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:50.415 22:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:13:50.673 22:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:50.673 22:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:50.673 22:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:50.673 22:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:13:50.673 22:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:50.673 22:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:50.673 22:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:50.673 22:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:50.673 22:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:50.673 22:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:50.673 22:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:50.673 22:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:50.673 22:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:50.673 22:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:50.673 22:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:13:50.673 22:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:50.673 "name": "Existed_Raid", 00:13:50.673 "uuid": "f8d34f00-3c4f-4f9b-8580-c4cbd6284267", 00:13:50.673 "strip_size_kb": 64, 00:13:50.673 "state": "offline", 00:13:50.673 "raid_level": "raid0", 00:13:50.673 "superblock": false, 00:13:50.673 "num_base_bdevs": 3, 00:13:50.673 "num_base_bdevs_discovered": 2, 00:13:50.673 "num_base_bdevs_operational": 2, 00:13:50.673 "base_bdevs_list": [ 00:13:50.673 { 00:13:50.673 "name": null, 00:13:50.673 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:50.673 "is_configured": false, 00:13:50.673 "data_offset": 0, 00:13:50.673 "data_size": 65536 00:13:50.673 }, 00:13:50.673 { 00:13:50.673 "name": "BaseBdev2", 00:13:50.673 "uuid": "dd020d2c-d59c-4f9f-84cf-33ab02302b35", 00:13:50.673 "is_configured": true, 00:13:50.673 "data_offset": 0, 00:13:50.673 "data_size": 65536 00:13:50.673 }, 00:13:50.673 { 00:13:50.673 "name": "BaseBdev3", 00:13:50.673 "uuid": "6c5b7d81-228f-4c4e-b179-094ea09d2839", 00:13:50.673 "is_configured": true, 00:13:50.673 "data_offset": 0, 00:13:50.673 "data_size": 65536 00:13:50.673 } 00:13:50.673 ] 00:13:50.673 }' 00:13:50.673 22:42:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:50.673 22:42:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:51.608 22:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:51.608 22:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:51.608 22:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:51.608 22:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:51.608 22:42:36 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:51.608 22:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:51.608 22:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:51.867 [2024-07-15 22:42:36.647789] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:51.867 22:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:51.867 22:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:51.867 22:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:51.867 22:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:52.126 22:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:52.126 22:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:52.126 22:42:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:52.385 [2024-07-15 22:42:37.159763] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:52.385 [2024-07-15 22:42:37.159814] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f0e400 name Existed_Raid, state offline 00:13:52.385 22:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:52.385 22:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:52.385 22:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:52.385 22:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:52.643 22:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:52.643 22:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:52.643 22:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:52.643 22:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:52.643 22:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:52.643 22:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:52.902 BaseBdev2 00:13:52.902 22:42:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:52.902 22:42:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:52.902 22:42:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:52.902 22:42:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:52.902 22:42:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:52.902 22:42:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:52.902 22:42:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:53.161 22:42:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:53.420 [ 00:13:53.420 { 00:13:53.420 "name": "BaseBdev2", 00:13:53.420 "aliases": [ 00:13:53.420 "bad9c462-8885-492d-bafb-1db5ddeb6b73" 00:13:53.420 ], 00:13:53.420 "product_name": "Malloc disk", 00:13:53.420 "block_size": 512, 00:13:53.420 "num_blocks": 65536, 00:13:53.420 "uuid": "bad9c462-8885-492d-bafb-1db5ddeb6b73", 00:13:53.420 "assigned_rate_limits": { 00:13:53.420 "rw_ios_per_sec": 0, 00:13:53.420 "rw_mbytes_per_sec": 0, 00:13:53.420 "r_mbytes_per_sec": 0, 00:13:53.420 "w_mbytes_per_sec": 0 00:13:53.420 }, 00:13:53.420 "claimed": false, 00:13:53.420 "zoned": false, 00:13:53.420 "supported_io_types": { 00:13:53.420 "read": true, 00:13:53.420 "write": true, 00:13:53.420 "unmap": true, 00:13:53.420 "flush": true, 00:13:53.420 "reset": true, 00:13:53.420 "nvme_admin": false, 00:13:53.420 "nvme_io": false, 00:13:53.420 "nvme_io_md": false, 00:13:53.420 "write_zeroes": true, 00:13:53.420 "zcopy": true, 00:13:53.420 "get_zone_info": false, 00:13:53.420 "zone_management": false, 00:13:53.420 "zone_append": false, 00:13:53.420 "compare": false, 00:13:53.420 "compare_and_write": false, 00:13:53.420 "abort": true, 00:13:53.420 "seek_hole": false, 00:13:53.420 "seek_data": false, 00:13:53.420 "copy": true, 00:13:53.420 "nvme_iov_md": false 00:13:53.420 }, 00:13:53.420 "memory_domains": [ 00:13:53.420 { 00:13:53.420 "dma_device_id": "system", 00:13:53.420 "dma_device_type": 1 00:13:53.420 }, 00:13:53.420 { 00:13:53.420 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:53.420 "dma_device_type": 2 00:13:53.420 } 00:13:53.420 ], 00:13:53.420 "driver_specific": {} 00:13:53.420 } 00:13:53.420 ] 00:13:53.420 22:42:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:53.420 22:42:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:53.420 22:42:38 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:53.420 22:42:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:53.679 BaseBdev3 00:13:53.679 22:42:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:53.679 22:42:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:13:53.679 22:42:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:53.679 22:42:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:53.679 22:42:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:53.679 22:42:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:53.679 22:42:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:53.945 22:42:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:54.204 [ 00:13:54.204 { 00:13:54.204 "name": "BaseBdev3", 00:13:54.204 "aliases": [ 00:13:54.204 "f1ed75e4-0719-4112-a5f8-f8f640a73afb" 00:13:54.204 ], 00:13:54.204 "product_name": "Malloc disk", 00:13:54.204 "block_size": 512, 00:13:54.204 "num_blocks": 65536, 00:13:54.204 "uuid": "f1ed75e4-0719-4112-a5f8-f8f640a73afb", 00:13:54.204 "assigned_rate_limits": { 00:13:54.204 "rw_ios_per_sec": 0, 00:13:54.204 "rw_mbytes_per_sec": 0, 00:13:54.204 "r_mbytes_per_sec": 0, 00:13:54.204 "w_mbytes_per_sec": 0 00:13:54.204 }, 00:13:54.204 "claimed": false, 00:13:54.204 "zoned": false, 00:13:54.204 
"supported_io_types": { 00:13:54.204 "read": true, 00:13:54.204 "write": true, 00:13:54.204 "unmap": true, 00:13:54.204 "flush": true, 00:13:54.204 "reset": true, 00:13:54.204 "nvme_admin": false, 00:13:54.204 "nvme_io": false, 00:13:54.204 "nvme_io_md": false, 00:13:54.204 "write_zeroes": true, 00:13:54.204 "zcopy": true, 00:13:54.204 "get_zone_info": false, 00:13:54.204 "zone_management": false, 00:13:54.204 "zone_append": false, 00:13:54.204 "compare": false, 00:13:54.204 "compare_and_write": false, 00:13:54.204 "abort": true, 00:13:54.204 "seek_hole": false, 00:13:54.204 "seek_data": false, 00:13:54.204 "copy": true, 00:13:54.204 "nvme_iov_md": false 00:13:54.204 }, 00:13:54.204 "memory_domains": [ 00:13:54.204 { 00:13:54.204 "dma_device_id": "system", 00:13:54.204 "dma_device_type": 1 00:13:54.204 }, 00:13:54.204 { 00:13:54.204 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:54.204 "dma_device_type": 2 00:13:54.204 } 00:13:54.204 ], 00:13:54.204 "driver_specific": {} 00:13:54.204 } 00:13:54.204 ] 00:13:54.204 22:42:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:54.204 22:42:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:54.204 22:42:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:54.204 22:42:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:54.462 [2024-07-15 22:42:39.134168] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:54.462 [2024-07-15 22:42:39.134213] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:54.462 [2024-07-15 22:42:39.134233] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:54.462 
[2024-07-15 22:42:39.135602] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:54.462 22:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:54.462 22:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:54.462 22:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:54.462 22:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:54.462 22:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:54.462 22:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:54.462 22:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:54.462 22:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:54.462 22:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:54.462 22:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:54.462 22:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:54.462 22:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:54.721 22:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:54.721 "name": "Existed_Raid", 00:13:54.721 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:54.721 "strip_size_kb": 64, 00:13:54.721 "state": "configuring", 00:13:54.721 "raid_level": "raid0", 00:13:54.721 "superblock": false, 00:13:54.721 "num_base_bdevs": 3, 00:13:54.721 
"num_base_bdevs_discovered": 2, 00:13:54.721 "num_base_bdevs_operational": 3, 00:13:54.721 "base_bdevs_list": [ 00:13:54.721 { 00:13:54.721 "name": "BaseBdev1", 00:13:54.721 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:54.721 "is_configured": false, 00:13:54.721 "data_offset": 0, 00:13:54.721 "data_size": 0 00:13:54.721 }, 00:13:54.721 { 00:13:54.721 "name": "BaseBdev2", 00:13:54.721 "uuid": "bad9c462-8885-492d-bafb-1db5ddeb6b73", 00:13:54.721 "is_configured": true, 00:13:54.721 "data_offset": 0, 00:13:54.721 "data_size": 65536 00:13:54.721 }, 00:13:54.721 { 00:13:54.721 "name": "BaseBdev3", 00:13:54.721 "uuid": "f1ed75e4-0719-4112-a5f8-f8f640a73afb", 00:13:54.721 "is_configured": true, 00:13:54.721 "data_offset": 0, 00:13:54.721 "data_size": 65536 00:13:54.721 } 00:13:54.721 ] 00:13:54.721 }' 00:13:54.721 22:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:54.721 22:42:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:55.286 22:42:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:55.544 [2024-07-15 22:42:40.209014] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:55.544 22:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:55.544 22:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:55.544 22:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:55.544 22:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:55.544 22:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:55.544 22:42:40 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:55.544 22:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:55.544 22:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:55.544 22:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:55.544 22:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:55.545 22:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:55.545 22:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:55.803 22:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:55.803 "name": "Existed_Raid", 00:13:55.803 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.803 "strip_size_kb": 64, 00:13:55.803 "state": "configuring", 00:13:55.803 "raid_level": "raid0", 00:13:55.803 "superblock": false, 00:13:55.803 "num_base_bdevs": 3, 00:13:55.803 "num_base_bdevs_discovered": 1, 00:13:55.803 "num_base_bdevs_operational": 3, 00:13:55.803 "base_bdevs_list": [ 00:13:55.803 { 00:13:55.803 "name": "BaseBdev1", 00:13:55.803 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.803 "is_configured": false, 00:13:55.803 "data_offset": 0, 00:13:55.803 "data_size": 0 00:13:55.803 }, 00:13:55.803 { 00:13:55.803 "name": null, 00:13:55.803 "uuid": "bad9c462-8885-492d-bafb-1db5ddeb6b73", 00:13:55.803 "is_configured": false, 00:13:55.803 "data_offset": 0, 00:13:55.803 "data_size": 65536 00:13:55.803 }, 00:13:55.803 { 00:13:55.803 "name": "BaseBdev3", 00:13:55.803 "uuid": "f1ed75e4-0719-4112-a5f8-f8f640a73afb", 00:13:55.803 "is_configured": true, 00:13:55.803 "data_offset": 0, 00:13:55.803 "data_size": 65536 00:13:55.803 } 
00:13:55.803 ] 00:13:55.803 }' 00:13:55.803 22:42:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:55.803 22:42:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:56.371 22:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:56.371 22:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:56.629 22:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:56.629 22:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:57.195 [2024-07-15 22:42:41.804600] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:57.195 BaseBdev1 00:13:57.195 22:42:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:57.195 22:42:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:57.195 22:42:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:57.195 22:42:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:57.195 22:42:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:57.195 22:42:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:57.195 22:42:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:57.195 22:42:42 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:57.453 [ 00:13:57.453 { 00:13:57.453 "name": "BaseBdev1", 00:13:57.453 "aliases": [ 00:13:57.453 "53fbc8ec-c16d-41c2-b184-34864dd18ca2" 00:13:57.453 ], 00:13:57.453 "product_name": "Malloc disk", 00:13:57.453 "block_size": 512, 00:13:57.453 "num_blocks": 65536, 00:13:57.453 "uuid": "53fbc8ec-c16d-41c2-b184-34864dd18ca2", 00:13:57.453 "assigned_rate_limits": { 00:13:57.453 "rw_ios_per_sec": 0, 00:13:57.453 "rw_mbytes_per_sec": 0, 00:13:57.453 "r_mbytes_per_sec": 0, 00:13:57.453 "w_mbytes_per_sec": 0 00:13:57.453 }, 00:13:57.453 "claimed": true, 00:13:57.453 "claim_type": "exclusive_write", 00:13:57.453 "zoned": false, 00:13:57.453 "supported_io_types": { 00:13:57.453 "read": true, 00:13:57.453 "write": true, 00:13:57.453 "unmap": true, 00:13:57.453 "flush": true, 00:13:57.453 "reset": true, 00:13:57.453 "nvme_admin": false, 00:13:57.453 "nvme_io": false, 00:13:57.453 "nvme_io_md": false, 00:13:57.453 "write_zeroes": true, 00:13:57.453 "zcopy": true, 00:13:57.453 "get_zone_info": false, 00:13:57.453 "zone_management": false, 00:13:57.453 "zone_append": false, 00:13:57.453 "compare": false, 00:13:57.453 "compare_and_write": false, 00:13:57.453 "abort": true, 00:13:57.453 "seek_hole": false, 00:13:57.453 "seek_data": false, 00:13:57.453 "copy": true, 00:13:57.453 "nvme_iov_md": false 00:13:57.453 }, 00:13:57.453 "memory_domains": [ 00:13:57.453 { 00:13:57.453 "dma_device_id": "system", 00:13:57.453 "dma_device_type": 1 00:13:57.453 }, 00:13:57.453 { 00:13:57.453 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:57.453 "dma_device_type": 2 00:13:57.453 } 00:13:57.453 ], 00:13:57.453 "driver_specific": {} 00:13:57.453 } 00:13:57.453 ] 00:13:57.453 22:42:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:57.453 22:42:42 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:57.453 22:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:57.453 22:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:57.453 22:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:57.453 22:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:57.453 22:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:57.453 22:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:57.453 22:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:57.453 22:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:57.453 22:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:57.453 22:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:57.453 22:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:57.712 22:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:57.712 "name": "Existed_Raid", 00:13:57.712 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:57.712 "strip_size_kb": 64, 00:13:57.712 "state": "configuring", 00:13:57.712 "raid_level": "raid0", 00:13:57.712 "superblock": false, 00:13:57.712 "num_base_bdevs": 3, 00:13:57.712 "num_base_bdevs_discovered": 2, 00:13:57.712 "num_base_bdevs_operational": 3, 00:13:57.712 "base_bdevs_list": [ 00:13:57.712 { 00:13:57.712 "name": "BaseBdev1", 00:13:57.712 
"uuid": "53fbc8ec-c16d-41c2-b184-34864dd18ca2", 00:13:57.712 "is_configured": true, 00:13:57.712 "data_offset": 0, 00:13:57.712 "data_size": 65536 00:13:57.712 }, 00:13:57.712 { 00:13:57.712 "name": null, 00:13:57.712 "uuid": "bad9c462-8885-492d-bafb-1db5ddeb6b73", 00:13:57.712 "is_configured": false, 00:13:57.712 "data_offset": 0, 00:13:57.712 "data_size": 65536 00:13:57.712 }, 00:13:57.712 { 00:13:57.712 "name": "BaseBdev3", 00:13:57.712 "uuid": "f1ed75e4-0719-4112-a5f8-f8f640a73afb", 00:13:57.712 "is_configured": true, 00:13:57.712 "data_offset": 0, 00:13:57.712 "data_size": 65536 00:13:57.712 } 00:13:57.712 ] 00:13:57.712 }' 00:13:57.712 22:42:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:57.712 22:42:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:58.279 22:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.279 22:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:58.546 22:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:58.546 22:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:58.809 [2024-07-15 22:42:43.621452] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:58.810 22:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:58.810 22:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:58.810 22:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:13:58.810 22:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:58.810 22:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:58.810 22:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:58.810 22:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:58.810 22:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:58.810 22:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:58.810 22:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:58.810 22:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.810 22:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:59.067 22:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:59.067 "name": "Existed_Raid", 00:13:59.067 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:59.067 "strip_size_kb": 64, 00:13:59.067 "state": "configuring", 00:13:59.067 "raid_level": "raid0", 00:13:59.067 "superblock": false, 00:13:59.067 "num_base_bdevs": 3, 00:13:59.067 "num_base_bdevs_discovered": 1, 00:13:59.067 "num_base_bdevs_operational": 3, 00:13:59.067 "base_bdevs_list": [ 00:13:59.067 { 00:13:59.067 "name": "BaseBdev1", 00:13:59.067 "uuid": "53fbc8ec-c16d-41c2-b184-34864dd18ca2", 00:13:59.067 "is_configured": true, 00:13:59.067 "data_offset": 0, 00:13:59.067 "data_size": 65536 00:13:59.067 }, 00:13:59.067 { 00:13:59.067 "name": null, 00:13:59.067 "uuid": "bad9c462-8885-492d-bafb-1db5ddeb6b73", 00:13:59.067 "is_configured": false, 00:13:59.067 
"data_offset": 0, 00:13:59.067 "data_size": 65536 00:13:59.067 }, 00:13:59.067 { 00:13:59.067 "name": null, 00:13:59.067 "uuid": "f1ed75e4-0719-4112-a5f8-f8f640a73afb", 00:13:59.067 "is_configured": false, 00:13:59.067 "data_offset": 0, 00:13:59.067 "data_size": 65536 00:13:59.067 } 00:13:59.067 ] 00:13:59.067 }' 00:13:59.067 22:42:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:59.067 22:42:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:59.633 22:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:59.633 22:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:59.891 22:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:59.891 22:42:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:00.153 [2024-07-15 22:42:45.001139] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:00.153 22:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:00.153 22:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:00.153 22:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:00.153 22:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:00.153 22:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:00.153 22:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=3 00:14:00.153 22:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:00.153 22:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:00.153 22:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:00.153 22:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:00.153 22:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:00.153 22:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.487 22:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:00.487 "name": "Existed_Raid", 00:14:00.487 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:00.487 "strip_size_kb": 64, 00:14:00.487 "state": "configuring", 00:14:00.487 "raid_level": "raid0", 00:14:00.487 "superblock": false, 00:14:00.487 "num_base_bdevs": 3, 00:14:00.487 "num_base_bdevs_discovered": 2, 00:14:00.487 "num_base_bdevs_operational": 3, 00:14:00.487 "base_bdevs_list": [ 00:14:00.487 { 00:14:00.487 "name": "BaseBdev1", 00:14:00.487 "uuid": "53fbc8ec-c16d-41c2-b184-34864dd18ca2", 00:14:00.487 "is_configured": true, 00:14:00.487 "data_offset": 0, 00:14:00.487 "data_size": 65536 00:14:00.487 }, 00:14:00.487 { 00:14:00.487 "name": null, 00:14:00.487 "uuid": "bad9c462-8885-492d-bafb-1db5ddeb6b73", 00:14:00.487 "is_configured": false, 00:14:00.487 "data_offset": 0, 00:14:00.487 "data_size": 65536 00:14:00.487 }, 00:14:00.487 { 00:14:00.487 "name": "BaseBdev3", 00:14:00.487 "uuid": "f1ed75e4-0719-4112-a5f8-f8f640a73afb", 00:14:00.487 "is_configured": true, 00:14:00.487 "data_offset": 0, 00:14:00.487 "data_size": 65536 00:14:00.487 } 00:14:00.487 ] 
00:14:00.487 }' 00:14:00.487 22:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:00.487 22:42:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:01.054 22:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:01.054 22:42:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:01.313 22:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:01.313 22:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:01.572 [2024-07-15 22:42:46.304789] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:01.572 22:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:01.572 22:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:01.572 22:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:01.572 22:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:01.572 22:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:01.572 22:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:01.572 22:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:01.572 22:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:01.572 22:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- 
# local num_base_bdevs_discovered 00:14:01.572 22:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:01.572 22:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:01.572 22:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:01.831 22:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:01.831 "name": "Existed_Raid", 00:14:01.831 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:01.831 "strip_size_kb": 64, 00:14:01.831 "state": "configuring", 00:14:01.831 "raid_level": "raid0", 00:14:01.831 "superblock": false, 00:14:01.831 "num_base_bdevs": 3, 00:14:01.831 "num_base_bdevs_discovered": 1, 00:14:01.831 "num_base_bdevs_operational": 3, 00:14:01.831 "base_bdevs_list": [ 00:14:01.831 { 00:14:01.831 "name": null, 00:14:01.831 "uuid": "53fbc8ec-c16d-41c2-b184-34864dd18ca2", 00:14:01.831 "is_configured": false, 00:14:01.831 "data_offset": 0, 00:14:01.831 "data_size": 65536 00:14:01.831 }, 00:14:01.831 { 00:14:01.831 "name": null, 00:14:01.831 "uuid": "bad9c462-8885-492d-bafb-1db5ddeb6b73", 00:14:01.831 "is_configured": false, 00:14:01.831 "data_offset": 0, 00:14:01.831 "data_size": 65536 00:14:01.831 }, 00:14:01.831 { 00:14:01.831 "name": "BaseBdev3", 00:14:01.831 "uuid": "f1ed75e4-0719-4112-a5f8-f8f640a73afb", 00:14:01.831 "is_configured": true, 00:14:01.831 "data_offset": 0, 00:14:01.831 "data_size": 65536 00:14:01.831 } 00:14:01.831 ] 00:14:01.831 }' 00:14:01.831 22:42:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:01.831 22:42:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:02.398 22:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 
00:14:02.398 22:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.657 22:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:02.657 22:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:02.916 [2024-07-15 22:42:47.644146] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:02.916 22:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:02.916 22:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:02.916 22:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:02.916 22:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:02.916 22:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:02.916 22:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:02.916 22:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:02.916 22:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:02.916 22:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:02.916 22:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:02.916 22:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:02.916 22:42:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.175 22:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:03.175 "name": "Existed_Raid", 00:14:03.175 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:03.175 "strip_size_kb": 64, 00:14:03.175 "state": "configuring", 00:14:03.175 "raid_level": "raid0", 00:14:03.175 "superblock": false, 00:14:03.175 "num_base_bdevs": 3, 00:14:03.175 "num_base_bdevs_discovered": 2, 00:14:03.175 "num_base_bdevs_operational": 3, 00:14:03.175 "base_bdevs_list": [ 00:14:03.175 { 00:14:03.175 "name": null, 00:14:03.175 "uuid": "53fbc8ec-c16d-41c2-b184-34864dd18ca2", 00:14:03.175 "is_configured": false, 00:14:03.175 "data_offset": 0, 00:14:03.175 "data_size": 65536 00:14:03.175 }, 00:14:03.175 { 00:14:03.175 "name": "BaseBdev2", 00:14:03.175 "uuid": "bad9c462-8885-492d-bafb-1db5ddeb6b73", 00:14:03.175 "is_configured": true, 00:14:03.175 "data_offset": 0, 00:14:03.175 "data_size": 65536 00:14:03.175 }, 00:14:03.175 { 00:14:03.175 "name": "BaseBdev3", 00:14:03.175 "uuid": "f1ed75e4-0719-4112-a5f8-f8f640a73afb", 00:14:03.175 "is_configured": true, 00:14:03.175 "data_offset": 0, 00:14:03.175 "data_size": 65536 00:14:03.175 } 00:14:03.175 ] 00:14:03.175 }' 00:14:03.175 22:42:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:03.175 22:42:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:03.743 22:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.743 22:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:04.002 22:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- 
# [[ true == \t\r\u\e ]] 00:14:04.002 22:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:04.002 22:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.261 22:42:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 53fbc8ec-c16d-41c2-b184-34864dd18ca2 00:14:04.520 [2024-07-15 22:42:49.191688] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:04.520 [2024-07-15 22:42:49.191728] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f0c450 00:14:04.520 [2024-07-15 22:42:49.191736] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:04.520 [2024-07-15 22:42:49.191942] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f0da50 00:14:04.520 [2024-07-15 22:42:49.192058] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f0c450 00:14:04.520 [2024-07-15 22:42:49.192068] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f0c450 00:14:04.520 [2024-07-15 22:42:49.192233] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:04.520 NewBaseBdev 00:14:04.520 22:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:04.520 22:42:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:04.520 22:42:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:04.520 22:42:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:04.520 22:42:49 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:04.520 22:42:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:04.520 22:42:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:04.780 22:42:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:05.039 [ 00:14:05.039 { 00:14:05.039 "name": "NewBaseBdev", 00:14:05.039 "aliases": [ 00:14:05.039 "53fbc8ec-c16d-41c2-b184-34864dd18ca2" 00:14:05.039 ], 00:14:05.039 "product_name": "Malloc disk", 00:14:05.039 "block_size": 512, 00:14:05.039 "num_blocks": 65536, 00:14:05.039 "uuid": "53fbc8ec-c16d-41c2-b184-34864dd18ca2", 00:14:05.039 "assigned_rate_limits": { 00:14:05.039 "rw_ios_per_sec": 0, 00:14:05.039 "rw_mbytes_per_sec": 0, 00:14:05.039 "r_mbytes_per_sec": 0, 00:14:05.039 "w_mbytes_per_sec": 0 00:14:05.039 }, 00:14:05.039 "claimed": true, 00:14:05.039 "claim_type": "exclusive_write", 00:14:05.039 "zoned": false, 00:14:05.039 "supported_io_types": { 00:14:05.039 "read": true, 00:14:05.039 "write": true, 00:14:05.039 "unmap": true, 00:14:05.039 "flush": true, 00:14:05.039 "reset": true, 00:14:05.039 "nvme_admin": false, 00:14:05.039 "nvme_io": false, 00:14:05.040 "nvme_io_md": false, 00:14:05.040 "write_zeroes": true, 00:14:05.040 "zcopy": true, 00:14:05.040 "get_zone_info": false, 00:14:05.040 "zone_management": false, 00:14:05.040 "zone_append": false, 00:14:05.040 "compare": false, 00:14:05.040 "compare_and_write": false, 00:14:05.040 "abort": true, 00:14:05.040 "seek_hole": false, 00:14:05.040 "seek_data": false, 00:14:05.040 "copy": true, 00:14:05.040 "nvme_iov_md": false 00:14:05.040 }, 00:14:05.040 "memory_domains": [ 00:14:05.040 
{ 00:14:05.040 "dma_device_id": "system", 00:14:05.040 "dma_device_type": 1 00:14:05.040 }, 00:14:05.040 { 00:14:05.040 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.040 "dma_device_type": 2 00:14:05.040 } 00:14:05.040 ], 00:14:05.040 "driver_specific": {} 00:14:05.040 } 00:14:05.040 ] 00:14:05.040 22:42:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:05.040 22:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:05.040 22:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:05.040 22:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:05.040 22:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:05.040 22:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:05.040 22:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:05.040 22:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:05.040 22:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:05.040 22:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:05.040 22:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:05.040 22:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:05.040 22:42:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:05.608 22:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:05.608 
"name": "Existed_Raid", 00:14:05.608 "uuid": "55850bbd-98c4-4fbe-ac9a-76ac2b4b26e6", 00:14:05.608 "strip_size_kb": 64, 00:14:05.608 "state": "online", 00:14:05.608 "raid_level": "raid0", 00:14:05.608 "superblock": false, 00:14:05.608 "num_base_bdevs": 3, 00:14:05.608 "num_base_bdevs_discovered": 3, 00:14:05.608 "num_base_bdevs_operational": 3, 00:14:05.608 "base_bdevs_list": [ 00:14:05.608 { 00:14:05.608 "name": "NewBaseBdev", 00:14:05.608 "uuid": "53fbc8ec-c16d-41c2-b184-34864dd18ca2", 00:14:05.608 "is_configured": true, 00:14:05.608 "data_offset": 0, 00:14:05.608 "data_size": 65536 00:14:05.608 }, 00:14:05.608 { 00:14:05.608 "name": "BaseBdev2", 00:14:05.608 "uuid": "bad9c462-8885-492d-bafb-1db5ddeb6b73", 00:14:05.608 "is_configured": true, 00:14:05.608 "data_offset": 0, 00:14:05.608 "data_size": 65536 00:14:05.608 }, 00:14:05.608 { 00:14:05.608 "name": "BaseBdev3", 00:14:05.608 "uuid": "f1ed75e4-0719-4112-a5f8-f8f640a73afb", 00:14:05.608 "is_configured": true, 00:14:05.608 "data_offset": 0, 00:14:05.608 "data_size": 65536 00:14:05.608 } 00:14:05.608 ] 00:14:05.608 }' 00:14:05.608 22:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:05.608 22:42:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:06.176 22:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:06.176 22:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:06.176 22:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:06.176 22:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:06.176 22:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:06.176 22:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:06.176 22:42:50 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:06.176 22:42:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:06.436 [2024-07-15 22:42:51.109099] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:06.436 22:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:06.436 "name": "Existed_Raid", 00:14:06.436 "aliases": [ 00:14:06.436 "55850bbd-98c4-4fbe-ac9a-76ac2b4b26e6" 00:14:06.436 ], 00:14:06.436 "product_name": "Raid Volume", 00:14:06.436 "block_size": 512, 00:14:06.436 "num_blocks": 196608, 00:14:06.436 "uuid": "55850bbd-98c4-4fbe-ac9a-76ac2b4b26e6", 00:14:06.436 "assigned_rate_limits": { 00:14:06.436 "rw_ios_per_sec": 0, 00:14:06.436 "rw_mbytes_per_sec": 0, 00:14:06.436 "r_mbytes_per_sec": 0, 00:14:06.436 "w_mbytes_per_sec": 0 00:14:06.436 }, 00:14:06.436 "claimed": false, 00:14:06.436 "zoned": false, 00:14:06.436 "supported_io_types": { 00:14:06.436 "read": true, 00:14:06.436 "write": true, 00:14:06.436 "unmap": true, 00:14:06.436 "flush": true, 00:14:06.436 "reset": true, 00:14:06.436 "nvme_admin": false, 00:14:06.436 "nvme_io": false, 00:14:06.436 "nvme_io_md": false, 00:14:06.436 "write_zeroes": true, 00:14:06.436 "zcopy": false, 00:14:06.436 "get_zone_info": false, 00:14:06.436 "zone_management": false, 00:14:06.436 "zone_append": false, 00:14:06.436 "compare": false, 00:14:06.436 "compare_and_write": false, 00:14:06.436 "abort": false, 00:14:06.436 "seek_hole": false, 00:14:06.436 "seek_data": false, 00:14:06.436 "copy": false, 00:14:06.436 "nvme_iov_md": false 00:14:06.436 }, 00:14:06.436 "memory_domains": [ 00:14:06.436 { 00:14:06.436 "dma_device_id": "system", 00:14:06.436 "dma_device_type": 1 00:14:06.436 }, 00:14:06.436 { 00:14:06.436 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:06.436 
"dma_device_type": 2 00:14:06.436 }, 00:14:06.436 { 00:14:06.436 "dma_device_id": "system", 00:14:06.436 "dma_device_type": 1 00:14:06.436 }, 00:14:06.436 { 00:14:06.436 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:06.436 "dma_device_type": 2 00:14:06.436 }, 00:14:06.436 { 00:14:06.436 "dma_device_id": "system", 00:14:06.436 "dma_device_type": 1 00:14:06.436 }, 00:14:06.436 { 00:14:06.436 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:06.436 "dma_device_type": 2 00:14:06.436 } 00:14:06.436 ], 00:14:06.436 "driver_specific": { 00:14:06.436 "raid": { 00:14:06.436 "uuid": "55850bbd-98c4-4fbe-ac9a-76ac2b4b26e6", 00:14:06.436 "strip_size_kb": 64, 00:14:06.436 "state": "online", 00:14:06.436 "raid_level": "raid0", 00:14:06.436 "superblock": false, 00:14:06.436 "num_base_bdevs": 3, 00:14:06.436 "num_base_bdevs_discovered": 3, 00:14:06.436 "num_base_bdevs_operational": 3, 00:14:06.436 "base_bdevs_list": [ 00:14:06.436 { 00:14:06.436 "name": "NewBaseBdev", 00:14:06.436 "uuid": "53fbc8ec-c16d-41c2-b184-34864dd18ca2", 00:14:06.436 "is_configured": true, 00:14:06.436 "data_offset": 0, 00:14:06.436 "data_size": 65536 00:14:06.436 }, 00:14:06.436 { 00:14:06.436 "name": "BaseBdev2", 00:14:06.436 "uuid": "bad9c462-8885-492d-bafb-1db5ddeb6b73", 00:14:06.436 "is_configured": true, 00:14:06.436 "data_offset": 0, 00:14:06.436 "data_size": 65536 00:14:06.436 }, 00:14:06.436 { 00:14:06.436 "name": "BaseBdev3", 00:14:06.436 "uuid": "f1ed75e4-0719-4112-a5f8-f8f640a73afb", 00:14:06.436 "is_configured": true, 00:14:06.436 "data_offset": 0, 00:14:06.436 "data_size": 65536 00:14:06.436 } 00:14:06.436 ] 00:14:06.436 } 00:14:06.436 } 00:14:06.436 }' 00:14:06.436 22:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:06.436 22:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:06.436 BaseBdev2 00:14:06.436 BaseBdev3' 
00:14:06.436 22:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:06.436 22:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:06.436 22:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:06.695 22:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:06.695 "name": "NewBaseBdev", 00:14:06.695 "aliases": [ 00:14:06.695 "53fbc8ec-c16d-41c2-b184-34864dd18ca2" 00:14:06.695 ], 00:14:06.695 "product_name": "Malloc disk", 00:14:06.695 "block_size": 512, 00:14:06.695 "num_blocks": 65536, 00:14:06.695 "uuid": "53fbc8ec-c16d-41c2-b184-34864dd18ca2", 00:14:06.695 "assigned_rate_limits": { 00:14:06.695 "rw_ios_per_sec": 0, 00:14:06.695 "rw_mbytes_per_sec": 0, 00:14:06.695 "r_mbytes_per_sec": 0, 00:14:06.695 "w_mbytes_per_sec": 0 00:14:06.695 }, 00:14:06.695 "claimed": true, 00:14:06.695 "claim_type": "exclusive_write", 00:14:06.695 "zoned": false, 00:14:06.695 "supported_io_types": { 00:14:06.695 "read": true, 00:14:06.695 "write": true, 00:14:06.695 "unmap": true, 00:14:06.695 "flush": true, 00:14:06.695 "reset": true, 00:14:06.695 "nvme_admin": false, 00:14:06.695 "nvme_io": false, 00:14:06.695 "nvme_io_md": false, 00:14:06.695 "write_zeroes": true, 00:14:06.695 "zcopy": true, 00:14:06.695 "get_zone_info": false, 00:14:06.695 "zone_management": false, 00:14:06.695 "zone_append": false, 00:14:06.695 "compare": false, 00:14:06.695 "compare_and_write": false, 00:14:06.695 "abort": true, 00:14:06.695 "seek_hole": false, 00:14:06.695 "seek_data": false, 00:14:06.695 "copy": true, 00:14:06.695 "nvme_iov_md": false 00:14:06.695 }, 00:14:06.695 "memory_domains": [ 00:14:06.695 { 00:14:06.695 "dma_device_id": "system", 00:14:06.695 "dma_device_type": 1 00:14:06.695 }, 00:14:06.695 { 00:14:06.695 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:06.695 "dma_device_type": 2 00:14:06.695 } 00:14:06.695 ], 00:14:06.695 "driver_specific": {} 00:14:06.695 }' 00:14:06.695 22:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:06.696 22:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:06.696 22:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:06.696 22:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:06.954 22:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:06.954 22:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:06.954 22:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:06.954 22:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:06.954 22:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:06.954 22:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:06.954 22:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:06.954 22:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:06.954 22:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:06.954 22:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:06.954 22:42:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:07.212 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:07.212 "name": "BaseBdev2", 00:14:07.212 "aliases": [ 00:14:07.212 
"bad9c462-8885-492d-bafb-1db5ddeb6b73" 00:14:07.212 ], 00:14:07.212 "product_name": "Malloc disk", 00:14:07.212 "block_size": 512, 00:14:07.212 "num_blocks": 65536, 00:14:07.212 "uuid": "bad9c462-8885-492d-bafb-1db5ddeb6b73", 00:14:07.212 "assigned_rate_limits": { 00:14:07.212 "rw_ios_per_sec": 0, 00:14:07.212 "rw_mbytes_per_sec": 0, 00:14:07.212 "r_mbytes_per_sec": 0, 00:14:07.212 "w_mbytes_per_sec": 0 00:14:07.212 }, 00:14:07.212 "claimed": true, 00:14:07.212 "claim_type": "exclusive_write", 00:14:07.212 "zoned": false, 00:14:07.212 "supported_io_types": { 00:14:07.212 "read": true, 00:14:07.212 "write": true, 00:14:07.212 "unmap": true, 00:14:07.212 "flush": true, 00:14:07.212 "reset": true, 00:14:07.212 "nvme_admin": false, 00:14:07.212 "nvme_io": false, 00:14:07.212 "nvme_io_md": false, 00:14:07.212 "write_zeroes": true, 00:14:07.212 "zcopy": true, 00:14:07.212 "get_zone_info": false, 00:14:07.212 "zone_management": false, 00:14:07.212 "zone_append": false, 00:14:07.213 "compare": false, 00:14:07.213 "compare_and_write": false, 00:14:07.213 "abort": true, 00:14:07.213 "seek_hole": false, 00:14:07.213 "seek_data": false, 00:14:07.213 "copy": true, 00:14:07.213 "nvme_iov_md": false 00:14:07.213 }, 00:14:07.213 "memory_domains": [ 00:14:07.213 { 00:14:07.213 "dma_device_id": "system", 00:14:07.213 "dma_device_type": 1 00:14:07.213 }, 00:14:07.213 { 00:14:07.213 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:07.213 "dma_device_type": 2 00:14:07.213 } 00:14:07.213 ], 00:14:07.213 "driver_specific": {} 00:14:07.213 }' 00:14:07.213 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:07.471 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:07.471 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:07.471 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:07.471 22:42:52 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:07.471 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:07.471 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:07.471 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:07.471 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:07.471 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:07.730 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:07.731 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:07.731 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:07.731 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:07.731 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:07.989 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:07.989 "name": "BaseBdev3", 00:14:07.989 "aliases": [ 00:14:07.989 "f1ed75e4-0719-4112-a5f8-f8f640a73afb" 00:14:07.989 ], 00:14:07.989 "product_name": "Malloc disk", 00:14:07.989 "block_size": 512, 00:14:07.989 "num_blocks": 65536, 00:14:07.989 "uuid": "f1ed75e4-0719-4112-a5f8-f8f640a73afb", 00:14:07.989 "assigned_rate_limits": { 00:14:07.989 "rw_ios_per_sec": 0, 00:14:07.989 "rw_mbytes_per_sec": 0, 00:14:07.989 "r_mbytes_per_sec": 0, 00:14:07.989 "w_mbytes_per_sec": 0 00:14:07.989 }, 00:14:07.989 "claimed": true, 00:14:07.989 "claim_type": "exclusive_write", 00:14:07.989 "zoned": false, 00:14:07.989 "supported_io_types": { 00:14:07.989 "read": true, 
00:14:07.989 "write": true, 00:14:07.989 "unmap": true, 00:14:07.989 "flush": true, 00:14:07.989 "reset": true, 00:14:07.989 "nvme_admin": false, 00:14:07.989 "nvme_io": false, 00:14:07.989 "nvme_io_md": false, 00:14:07.989 "write_zeroes": true, 00:14:07.989 "zcopy": true, 00:14:07.989 "get_zone_info": false, 00:14:07.989 "zone_management": false, 00:14:07.989 "zone_append": false, 00:14:07.990 "compare": false, 00:14:07.990 "compare_and_write": false, 00:14:07.990 "abort": true, 00:14:07.990 "seek_hole": false, 00:14:07.990 "seek_data": false, 00:14:07.990 "copy": true, 00:14:07.990 "nvme_iov_md": false 00:14:07.990 }, 00:14:07.990 "memory_domains": [ 00:14:07.990 { 00:14:07.990 "dma_device_id": "system", 00:14:07.990 "dma_device_type": 1 00:14:07.990 }, 00:14:07.990 { 00:14:07.990 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:07.990 "dma_device_type": 2 00:14:07.990 } 00:14:07.990 ], 00:14:07.990 "driver_specific": {} 00:14:07.990 }' 00:14:07.990 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:07.990 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:07.990 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:07.990 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:07.990 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:07.990 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:07.990 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:08.249 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:08.249 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:08.249 22:42:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:08.249 
22:42:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:08.249 22:42:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:08.249 22:42:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:08.513 [2024-07-15 22:42:53.258512] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:08.513 [2024-07-15 22:42:53.258541] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:08.513 [2024-07-15 22:42:53.258597] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:08.513 [2024-07-15 22:42:53.258650] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:08.513 [2024-07-15 22:42:53.258663] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f0c450 name Existed_Raid, state offline 00:14:08.513 22:42:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2719261 00:14:08.513 22:42:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2719261 ']' 00:14:08.513 22:42:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2719261 00:14:08.513 22:42:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:14:08.513 22:42:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:08.513 22:42:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2719261 00:14:08.513 22:42:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:08.513 22:42:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 
00:14:08.513 22:42:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2719261' 00:14:08.513 killing process with pid 2719261 00:14:08.513 22:42:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2719261 00:14:08.513 [2024-07-15 22:42:53.324260] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:08.513 22:42:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2719261 00:14:08.513 [2024-07-15 22:42:53.355397] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:14:08.774 00:14:08.774 real 0m29.398s 00:14:08.774 user 0m53.891s 00:14:08.774 sys 0m5.315s 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:08.774 ************************************ 00:14:08.774 END TEST raid_state_function_test 00:14:08.774 ************************************ 00:14:08.774 22:42:53 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:08.774 22:42:53 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:14:08.774 22:42:53 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:08.774 22:42:53 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:08.774 22:42:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:08.774 ************************************ 00:14:08.774 START TEST raid_state_function_test_sb 00:14:08.774 ************************************ 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 true 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- 
# local raid_level=raid0 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # 
local strip_size 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:14:08.774 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2723670 00:14:08.775 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2723670' 00:14:08.775 Process raid pid: 2723670 00:14:08.775 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:08.775 22:42:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2723670 /var/tmp/spdk-raid.sock 00:14:08.775 22:42:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2723670 ']' 00:14:08.775 22:42:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:08.775 22:42:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:08.775 22:42:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket 
/var/tmp/spdk-raid.sock...' 00:14:08.775 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:08.775 22:42:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:08.775 22:42:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:09.034 [2024-07-15 22:42:53.740025] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:14:09.034 [2024-07-15 22:42:53.740093] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:09.034 [2024-07-15 22:42:53.870336] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:09.292 [2024-07-15 22:42:53.968788] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:09.292 [2024-07-15 22:42:54.027727] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:09.292 [2024-07-15 22:42:54.027760] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:10.229 22:42:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:10.229 22:42:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:14:10.229 22:42:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:10.489 [2024-07-15 22:42:55.165064] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:10.489 [2024-07-15 22:42:55.165107] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:10.489 [2024-07-15 22:42:55.165117] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:10.489 [2024-07-15 22:42:55.165129] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:10.489 [2024-07-15 22:42:55.165138] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:10.489 [2024-07-15 22:42:55.165149] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:10.489 22:42:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:10.489 22:42:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:10.489 22:42:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:10.489 22:42:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:10.489 22:42:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:10.489 22:42:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:10.489 22:42:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:10.489 22:42:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:10.489 22:42:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:10.489 22:42:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:10.489 22:42:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:10.489 22:42:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:14:10.748 22:42:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:10.748 "name": "Existed_Raid", 00:14:10.748 "uuid": "3c5c481a-39ef-4c3c-a664-71d597e356f3", 00:14:10.748 "strip_size_kb": 64, 00:14:10.748 "state": "configuring", 00:14:10.748 "raid_level": "raid0", 00:14:10.748 "superblock": true, 00:14:10.748 "num_base_bdevs": 3, 00:14:10.748 "num_base_bdevs_discovered": 0, 00:14:10.748 "num_base_bdevs_operational": 3, 00:14:10.748 "base_bdevs_list": [ 00:14:10.748 { 00:14:10.748 "name": "BaseBdev1", 00:14:10.748 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:10.748 "is_configured": false, 00:14:10.748 "data_offset": 0, 00:14:10.748 "data_size": 0 00:14:10.748 }, 00:14:10.748 { 00:14:10.748 "name": "BaseBdev2", 00:14:10.748 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:10.748 "is_configured": false, 00:14:10.748 "data_offset": 0, 00:14:10.748 "data_size": 0 00:14:10.748 }, 00:14:10.748 { 00:14:10.748 "name": "BaseBdev3", 00:14:10.748 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:10.748 "is_configured": false, 00:14:10.748 "data_offset": 0, 00:14:10.748 "data_size": 0 00:14:10.748 } 00:14:10.748 ] 00:14:10.748 }' 00:14:10.748 22:42:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:10.748 22:42:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:11.315 22:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:11.315 [2024-07-15 22:42:56.191621] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:11.316 [2024-07-15 22:42:56.191652] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23f2a80 name Existed_Raid, state configuring 00:14:11.316 22:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:11.574 [2024-07-15 22:42:56.440302] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:11.574 [2024-07-15 22:42:56.440329] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:11.574 [2024-07-15 22:42:56.440338] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:11.574 [2024-07-15 22:42:56.440350] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:11.574 [2024-07-15 22:42:56.440358] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:11.574 [2024-07-15 22:42:56.440369] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:11.574 22:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:11.833 [2024-07-15 22:42:56.698830] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:11.833 BaseBdev1 00:14:11.833 22:42:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:11.833 22:42:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:11.833 22:42:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:11.833 22:42:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:11.833 22:42:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:11.833 22:42:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- 
# bdev_timeout=2000 00:14:11.833 22:42:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:12.092 22:42:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:12.350 [ 00:14:12.350 { 00:14:12.350 "name": "BaseBdev1", 00:14:12.350 "aliases": [ 00:14:12.350 "ea6077a7-98af-429c-88bd-e2b7d92515bb" 00:14:12.350 ], 00:14:12.350 "product_name": "Malloc disk", 00:14:12.350 "block_size": 512, 00:14:12.350 "num_blocks": 65536, 00:14:12.350 "uuid": "ea6077a7-98af-429c-88bd-e2b7d92515bb", 00:14:12.350 "assigned_rate_limits": { 00:14:12.350 "rw_ios_per_sec": 0, 00:14:12.350 "rw_mbytes_per_sec": 0, 00:14:12.350 "r_mbytes_per_sec": 0, 00:14:12.350 "w_mbytes_per_sec": 0 00:14:12.350 }, 00:14:12.351 "claimed": true, 00:14:12.351 "claim_type": "exclusive_write", 00:14:12.351 "zoned": false, 00:14:12.351 "supported_io_types": { 00:14:12.351 "read": true, 00:14:12.351 "write": true, 00:14:12.351 "unmap": true, 00:14:12.351 "flush": true, 00:14:12.351 "reset": true, 00:14:12.351 "nvme_admin": false, 00:14:12.351 "nvme_io": false, 00:14:12.351 "nvme_io_md": false, 00:14:12.351 "write_zeroes": true, 00:14:12.351 "zcopy": true, 00:14:12.351 "get_zone_info": false, 00:14:12.351 "zone_management": false, 00:14:12.351 "zone_append": false, 00:14:12.351 "compare": false, 00:14:12.351 "compare_and_write": false, 00:14:12.351 "abort": true, 00:14:12.351 "seek_hole": false, 00:14:12.351 "seek_data": false, 00:14:12.351 "copy": true, 00:14:12.351 "nvme_iov_md": false 00:14:12.351 }, 00:14:12.351 "memory_domains": [ 00:14:12.351 { 00:14:12.351 "dma_device_id": "system", 00:14:12.351 "dma_device_type": 1 00:14:12.351 }, 00:14:12.351 { 00:14:12.351 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.351 
"dma_device_type": 2 00:14:12.351 } 00:14:12.351 ], 00:14:12.351 "driver_specific": {} 00:14:12.351 } 00:14:12.351 ] 00:14:12.351 22:42:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:12.351 22:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:12.351 22:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:12.351 22:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:12.351 22:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:12.351 22:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:12.351 22:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:12.351 22:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:12.351 22:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:12.351 22:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:12.351 22:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:12.351 22:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:12.351 22:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:12.610 22:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:12.610 "name": "Existed_Raid", 00:14:12.610 "uuid": "3682ec82-2625-4943-b560-7881f585aa7e", 00:14:12.610 "strip_size_kb": 64, 
00:14:12.610 "state": "configuring", 00:14:12.610 "raid_level": "raid0", 00:14:12.610 "superblock": true, 00:14:12.610 "num_base_bdevs": 3, 00:14:12.610 "num_base_bdevs_discovered": 1, 00:14:12.610 "num_base_bdevs_operational": 3, 00:14:12.610 "base_bdevs_list": [ 00:14:12.610 { 00:14:12.610 "name": "BaseBdev1", 00:14:12.610 "uuid": "ea6077a7-98af-429c-88bd-e2b7d92515bb", 00:14:12.610 "is_configured": true, 00:14:12.610 "data_offset": 2048, 00:14:12.610 "data_size": 63488 00:14:12.610 }, 00:14:12.610 { 00:14:12.610 "name": "BaseBdev2", 00:14:12.610 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:12.610 "is_configured": false, 00:14:12.610 "data_offset": 0, 00:14:12.610 "data_size": 0 00:14:12.610 }, 00:14:12.610 { 00:14:12.610 "name": "BaseBdev3", 00:14:12.610 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:12.610 "is_configured": false, 00:14:12.610 "data_offset": 0, 00:14:12.610 "data_size": 0 00:14:12.610 } 00:14:12.610 ] 00:14:12.610 }' 00:14:12.610 22:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:12.610 22:42:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:13.178 22:42:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:13.178 [2024-07-15 22:42:58.082499] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:13.178 [2024-07-15 22:42:58.082536] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23f2310 name Existed_Raid, state configuring 00:14:13.437 22:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:13.437 [2024-07-15 22:42:58.263028] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:13.437 [2024-07-15 22:42:58.264451] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:13.437 [2024-07-15 22:42:58.264480] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:13.437 [2024-07-15 22:42:58.264491] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:13.437 [2024-07-15 22:42:58.264502] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:13.437 22:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:13.437 22:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:13.437 22:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:13.437 22:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:13.437 22:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:13.437 22:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:13.437 22:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:13.437 22:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:13.437 22:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:13.437 22:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:13.437 22:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:13.437 22:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- 
# local tmp 00:14:13.437 22:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:13.437 22:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:13.696 22:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:13.696 "name": "Existed_Raid", 00:14:13.696 "uuid": "618e58a6-99b5-4639-a350-e323775bf0c2", 00:14:13.696 "strip_size_kb": 64, 00:14:13.696 "state": "configuring", 00:14:13.696 "raid_level": "raid0", 00:14:13.696 "superblock": true, 00:14:13.696 "num_base_bdevs": 3, 00:14:13.696 "num_base_bdevs_discovered": 1, 00:14:13.696 "num_base_bdevs_operational": 3, 00:14:13.696 "base_bdevs_list": [ 00:14:13.696 { 00:14:13.696 "name": "BaseBdev1", 00:14:13.696 "uuid": "ea6077a7-98af-429c-88bd-e2b7d92515bb", 00:14:13.696 "is_configured": true, 00:14:13.696 "data_offset": 2048, 00:14:13.696 "data_size": 63488 00:14:13.696 }, 00:14:13.696 { 00:14:13.696 "name": "BaseBdev2", 00:14:13.696 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:13.696 "is_configured": false, 00:14:13.696 "data_offset": 0, 00:14:13.696 "data_size": 0 00:14:13.696 }, 00:14:13.696 { 00:14:13.696 "name": "BaseBdev3", 00:14:13.696 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:13.696 "is_configured": false, 00:14:13.696 "data_offset": 0, 00:14:13.696 "data_size": 0 00:14:13.696 } 00:14:13.696 ] 00:14:13.696 }' 00:14:13.696 22:42:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:13.696 22:42:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:14.634 22:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:14.634 
[2024-07-15 22:42:59.405485] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:14.634 BaseBdev2 00:14:14.634 22:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:14.634 22:42:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:14.634 22:42:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:14.634 22:42:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:14.634 22:42:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:14.634 22:42:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:14.634 22:42:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:14.920 22:42:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:15.198 [ 00:14:15.198 { 00:14:15.198 "name": "BaseBdev2", 00:14:15.198 "aliases": [ 00:14:15.198 "2beb6066-7de2-4b16-b027-1cf9b706668f" 00:14:15.198 ], 00:14:15.198 "product_name": "Malloc disk", 00:14:15.198 "block_size": 512, 00:14:15.198 "num_blocks": 65536, 00:14:15.198 "uuid": "2beb6066-7de2-4b16-b027-1cf9b706668f", 00:14:15.198 "assigned_rate_limits": { 00:14:15.198 "rw_ios_per_sec": 0, 00:14:15.198 "rw_mbytes_per_sec": 0, 00:14:15.198 "r_mbytes_per_sec": 0, 00:14:15.198 "w_mbytes_per_sec": 0 00:14:15.198 }, 00:14:15.198 "claimed": true, 00:14:15.198 "claim_type": "exclusive_write", 00:14:15.198 "zoned": false, 00:14:15.198 "supported_io_types": { 00:14:15.198 "read": true, 00:14:15.198 "write": true, 00:14:15.198 "unmap": 
true, 00:14:15.198 "flush": true, 00:14:15.198 "reset": true, 00:14:15.198 "nvme_admin": false, 00:14:15.198 "nvme_io": false, 00:14:15.198 "nvme_io_md": false, 00:14:15.198 "write_zeroes": true, 00:14:15.198 "zcopy": true, 00:14:15.198 "get_zone_info": false, 00:14:15.198 "zone_management": false, 00:14:15.198 "zone_append": false, 00:14:15.198 "compare": false, 00:14:15.198 "compare_and_write": false, 00:14:15.198 "abort": true, 00:14:15.198 "seek_hole": false, 00:14:15.198 "seek_data": false, 00:14:15.198 "copy": true, 00:14:15.198 "nvme_iov_md": false 00:14:15.198 }, 00:14:15.198 "memory_domains": [ 00:14:15.198 { 00:14:15.198 "dma_device_id": "system", 00:14:15.198 "dma_device_type": 1 00:14:15.198 }, 00:14:15.198 { 00:14:15.198 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.198 "dma_device_type": 2 00:14:15.198 } 00:14:15.198 ], 00:14:15.198 "driver_specific": {} 00:14:15.198 } 00:14:15.198 ] 00:14:15.198 22:42:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:15.198 22:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:15.198 22:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:15.198 22:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:15.198 22:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:15.198 22:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:15.198 22:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:15.198 22:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:15.198 22:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:15.198 
22:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:15.198 22:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:15.198 22:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:15.198 22:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:15.198 22:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.198 22:42:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:15.458 22:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:15.458 "name": "Existed_Raid", 00:14:15.458 "uuid": "618e58a6-99b5-4639-a350-e323775bf0c2", 00:14:15.458 "strip_size_kb": 64, 00:14:15.458 "state": "configuring", 00:14:15.458 "raid_level": "raid0", 00:14:15.458 "superblock": true, 00:14:15.458 "num_base_bdevs": 3, 00:14:15.458 "num_base_bdevs_discovered": 2, 00:14:15.458 "num_base_bdevs_operational": 3, 00:14:15.458 "base_bdevs_list": [ 00:14:15.458 { 00:14:15.458 "name": "BaseBdev1", 00:14:15.458 "uuid": "ea6077a7-98af-429c-88bd-e2b7d92515bb", 00:14:15.458 "is_configured": true, 00:14:15.458 "data_offset": 2048, 00:14:15.458 "data_size": 63488 00:14:15.458 }, 00:14:15.458 { 00:14:15.458 "name": "BaseBdev2", 00:14:15.458 "uuid": "2beb6066-7de2-4b16-b027-1cf9b706668f", 00:14:15.458 "is_configured": true, 00:14:15.458 "data_offset": 2048, 00:14:15.458 "data_size": 63488 00:14:15.458 }, 00:14:15.458 { 00:14:15.458 "name": "BaseBdev3", 00:14:15.458 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:15.458 "is_configured": false, 00:14:15.458 "data_offset": 0, 00:14:15.458 "data_size": 0 00:14:15.458 } 00:14:15.458 ] 00:14:15.458 }' 00:14:15.458 
22:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:15.458 22:43:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:16.027 22:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:16.286 [2024-07-15 22:43:00.977042] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:16.286 [2024-07-15 22:43:00.977210] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x23f3400 00:14:16.286 [2024-07-15 22:43:00.977224] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:16.286 [2024-07-15 22:43:00.977397] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23f2ef0 00:14:16.286 [2024-07-15 22:43:00.977512] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23f3400 00:14:16.286 [2024-07-15 22:43:00.977522] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x23f3400 00:14:16.286 [2024-07-15 22:43:00.977611] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:16.286 BaseBdev3 00:14:16.286 22:43:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:16.286 22:43:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:16.286 22:43:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:16.286 22:43:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:16.286 22:43:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:16.286 22:43:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:14:16.286 22:43:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:16.544 22:43:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:16.804 [ 00:14:16.804 { 00:14:16.804 "name": "BaseBdev3", 00:14:16.804 "aliases": [ 00:14:16.804 "4947be0c-249b-4fb1-b8b7-70240185f80d" 00:14:16.804 ], 00:14:16.804 "product_name": "Malloc disk", 00:14:16.804 "block_size": 512, 00:14:16.804 "num_blocks": 65536, 00:14:16.804 "uuid": "4947be0c-249b-4fb1-b8b7-70240185f80d", 00:14:16.804 "assigned_rate_limits": { 00:14:16.804 "rw_ios_per_sec": 0, 00:14:16.804 "rw_mbytes_per_sec": 0, 00:14:16.804 "r_mbytes_per_sec": 0, 00:14:16.804 "w_mbytes_per_sec": 0 00:14:16.804 }, 00:14:16.804 "claimed": true, 00:14:16.804 "claim_type": "exclusive_write", 00:14:16.804 "zoned": false, 00:14:16.804 "supported_io_types": { 00:14:16.804 "read": true, 00:14:16.804 "write": true, 00:14:16.804 "unmap": true, 00:14:16.804 "flush": true, 00:14:16.804 "reset": true, 00:14:16.804 "nvme_admin": false, 00:14:16.804 "nvme_io": false, 00:14:16.804 "nvme_io_md": false, 00:14:16.804 "write_zeroes": true, 00:14:16.804 "zcopy": true, 00:14:16.804 "get_zone_info": false, 00:14:16.804 "zone_management": false, 00:14:16.804 "zone_append": false, 00:14:16.804 "compare": false, 00:14:16.804 "compare_and_write": false, 00:14:16.804 "abort": true, 00:14:16.804 "seek_hole": false, 00:14:16.804 "seek_data": false, 00:14:16.804 "copy": true, 00:14:16.804 "nvme_iov_md": false 00:14:16.804 }, 00:14:16.804 "memory_domains": [ 00:14:16.804 { 00:14:16.804 "dma_device_id": "system", 00:14:16.804 "dma_device_type": 1 00:14:16.804 }, 00:14:16.804 { 00:14:16.804 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:16.804 
"dma_device_type": 2 00:14:16.804 } 00:14:16.804 ], 00:14:16.804 "driver_specific": {} 00:14:16.804 } 00:14:16.804 ] 00:14:16.804 22:43:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:16.804 22:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:16.804 22:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:16.804 22:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:16.804 22:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:16.804 22:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:16.804 22:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:16.804 22:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:16.804 22:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:16.804 22:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:16.804 22:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:16.804 22:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:16.804 22:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:16.804 22:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:16.804 22:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:17.063 22:43:01 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:17.063 "name": "Existed_Raid", 00:14:17.063 "uuid": "618e58a6-99b5-4639-a350-e323775bf0c2", 00:14:17.063 "strip_size_kb": 64, 00:14:17.063 "state": "online", 00:14:17.063 "raid_level": "raid0", 00:14:17.063 "superblock": true, 00:14:17.063 "num_base_bdevs": 3, 00:14:17.063 "num_base_bdevs_discovered": 3, 00:14:17.063 "num_base_bdevs_operational": 3, 00:14:17.063 "base_bdevs_list": [ 00:14:17.063 { 00:14:17.063 "name": "BaseBdev1", 00:14:17.063 "uuid": "ea6077a7-98af-429c-88bd-e2b7d92515bb", 00:14:17.063 "is_configured": true, 00:14:17.063 "data_offset": 2048, 00:14:17.063 "data_size": 63488 00:14:17.063 }, 00:14:17.063 { 00:14:17.063 "name": "BaseBdev2", 00:14:17.063 "uuid": "2beb6066-7de2-4b16-b027-1cf9b706668f", 00:14:17.063 "is_configured": true, 00:14:17.063 "data_offset": 2048, 00:14:17.063 "data_size": 63488 00:14:17.063 }, 00:14:17.063 { 00:14:17.063 "name": "BaseBdev3", 00:14:17.063 "uuid": "4947be0c-249b-4fb1-b8b7-70240185f80d", 00:14:17.063 "is_configured": true, 00:14:17.063 "data_offset": 2048, 00:14:17.063 "data_size": 63488 00:14:17.063 } 00:14:17.063 ] 00:14:17.063 }' 00:14:17.063 22:43:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:17.063 22:43:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:17.632 22:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:17.632 22:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:17.632 22:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:17.632 22:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:17.632 22:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:14:17.632 22:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:17.632 22:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:17.632 22:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:17.632 [2024-07-15 22:43:02.461274] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:17.632 22:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:17.632 "name": "Existed_Raid", 00:14:17.632 "aliases": [ 00:14:17.632 "618e58a6-99b5-4639-a350-e323775bf0c2" 00:14:17.632 ], 00:14:17.632 "product_name": "Raid Volume", 00:14:17.632 "block_size": 512, 00:14:17.632 "num_blocks": 190464, 00:14:17.632 "uuid": "618e58a6-99b5-4639-a350-e323775bf0c2", 00:14:17.632 "assigned_rate_limits": { 00:14:17.632 "rw_ios_per_sec": 0, 00:14:17.632 "rw_mbytes_per_sec": 0, 00:14:17.632 "r_mbytes_per_sec": 0, 00:14:17.632 "w_mbytes_per_sec": 0 00:14:17.632 }, 00:14:17.632 "claimed": false, 00:14:17.632 "zoned": false, 00:14:17.632 "supported_io_types": { 00:14:17.632 "read": true, 00:14:17.632 "write": true, 00:14:17.632 "unmap": true, 00:14:17.632 "flush": true, 00:14:17.632 "reset": true, 00:14:17.632 "nvme_admin": false, 00:14:17.632 "nvme_io": false, 00:14:17.632 "nvme_io_md": false, 00:14:17.632 "write_zeroes": true, 00:14:17.632 "zcopy": false, 00:14:17.632 "get_zone_info": false, 00:14:17.632 "zone_management": false, 00:14:17.632 "zone_append": false, 00:14:17.632 "compare": false, 00:14:17.632 "compare_and_write": false, 00:14:17.632 "abort": false, 00:14:17.632 "seek_hole": false, 00:14:17.632 "seek_data": false, 00:14:17.632 "copy": false, 00:14:17.632 "nvme_iov_md": false 00:14:17.632 }, 00:14:17.632 "memory_domains": [ 00:14:17.632 { 00:14:17.632 "dma_device_id": "system", 00:14:17.632 
"dma_device_type": 1 00:14:17.632 }, 00:14:17.632 { 00:14:17.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.632 "dma_device_type": 2 00:14:17.632 }, 00:14:17.632 { 00:14:17.632 "dma_device_id": "system", 00:14:17.632 "dma_device_type": 1 00:14:17.632 }, 00:14:17.632 { 00:14:17.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.632 "dma_device_type": 2 00:14:17.632 }, 00:14:17.632 { 00:14:17.632 "dma_device_id": "system", 00:14:17.632 "dma_device_type": 1 00:14:17.632 }, 00:14:17.632 { 00:14:17.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.632 "dma_device_type": 2 00:14:17.632 } 00:14:17.632 ], 00:14:17.632 "driver_specific": { 00:14:17.632 "raid": { 00:14:17.632 "uuid": "618e58a6-99b5-4639-a350-e323775bf0c2", 00:14:17.632 "strip_size_kb": 64, 00:14:17.632 "state": "online", 00:14:17.632 "raid_level": "raid0", 00:14:17.632 "superblock": true, 00:14:17.632 "num_base_bdevs": 3, 00:14:17.632 "num_base_bdevs_discovered": 3, 00:14:17.632 "num_base_bdevs_operational": 3, 00:14:17.632 "base_bdevs_list": [ 00:14:17.632 { 00:14:17.632 "name": "BaseBdev1", 00:14:17.632 "uuid": "ea6077a7-98af-429c-88bd-e2b7d92515bb", 00:14:17.632 "is_configured": true, 00:14:17.632 "data_offset": 2048, 00:14:17.632 "data_size": 63488 00:14:17.632 }, 00:14:17.632 { 00:14:17.632 "name": "BaseBdev2", 00:14:17.632 "uuid": "2beb6066-7de2-4b16-b027-1cf9b706668f", 00:14:17.632 "is_configured": true, 00:14:17.632 "data_offset": 2048, 00:14:17.632 "data_size": 63488 00:14:17.632 }, 00:14:17.632 { 00:14:17.632 "name": "BaseBdev3", 00:14:17.632 "uuid": "4947be0c-249b-4fb1-b8b7-70240185f80d", 00:14:17.632 "is_configured": true, 00:14:17.632 "data_offset": 2048, 00:14:17.632 "data_size": 63488 00:14:17.632 } 00:14:17.632 ] 00:14:17.632 } 00:14:17.632 } 00:14:17.632 }' 00:14:17.632 22:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:17.632 22:43:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:17.632 BaseBdev2 00:14:17.632 BaseBdev3' 00:14:17.632 22:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:17.632 22:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:17.632 22:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:17.891 22:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:17.891 "name": "BaseBdev1", 00:14:17.891 "aliases": [ 00:14:17.891 "ea6077a7-98af-429c-88bd-e2b7d92515bb" 00:14:17.891 ], 00:14:17.891 "product_name": "Malloc disk", 00:14:17.891 "block_size": 512, 00:14:17.891 "num_blocks": 65536, 00:14:17.891 "uuid": "ea6077a7-98af-429c-88bd-e2b7d92515bb", 00:14:17.891 "assigned_rate_limits": { 00:14:17.891 "rw_ios_per_sec": 0, 00:14:17.891 "rw_mbytes_per_sec": 0, 00:14:17.891 "r_mbytes_per_sec": 0, 00:14:17.891 "w_mbytes_per_sec": 0 00:14:17.891 }, 00:14:17.891 "claimed": true, 00:14:17.891 "claim_type": "exclusive_write", 00:14:17.891 "zoned": false, 00:14:17.891 "supported_io_types": { 00:14:17.891 "read": true, 00:14:17.891 "write": true, 00:14:17.891 "unmap": true, 00:14:17.891 "flush": true, 00:14:17.891 "reset": true, 00:14:17.891 "nvme_admin": false, 00:14:17.891 "nvme_io": false, 00:14:17.891 "nvme_io_md": false, 00:14:17.891 "write_zeroes": true, 00:14:17.891 "zcopy": true, 00:14:17.891 "get_zone_info": false, 00:14:17.891 "zone_management": false, 00:14:17.891 "zone_append": false, 00:14:17.891 "compare": false, 00:14:17.891 "compare_and_write": false, 00:14:17.891 "abort": true, 00:14:17.891 "seek_hole": false, 00:14:17.891 "seek_data": false, 00:14:17.891 "copy": true, 00:14:17.891 "nvme_iov_md": false 00:14:17.891 }, 00:14:17.891 "memory_domains": 
[ 00:14:17.891 { 00:14:17.891 "dma_device_id": "system", 00:14:17.891 "dma_device_type": 1 00:14:17.891 }, 00:14:17.891 { 00:14:17.891 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.891 "dma_device_type": 2 00:14:17.891 } 00:14:17.891 ], 00:14:17.891 "driver_specific": {} 00:14:17.891 }' 00:14:17.891 22:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:18.150 22:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:18.150 22:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:18.150 22:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:18.150 22:43:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:18.150 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:18.150 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:18.150 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:18.409 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:18.409 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.409 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.409 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:18.409 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:18.409 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:18.409 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:14:18.667 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:18.667 "name": "BaseBdev2", 00:14:18.667 "aliases": [ 00:14:18.667 "2beb6066-7de2-4b16-b027-1cf9b706668f" 00:14:18.667 ], 00:14:18.667 "product_name": "Malloc disk", 00:14:18.667 "block_size": 512, 00:14:18.667 "num_blocks": 65536, 00:14:18.667 "uuid": "2beb6066-7de2-4b16-b027-1cf9b706668f", 00:14:18.667 "assigned_rate_limits": { 00:14:18.667 "rw_ios_per_sec": 0, 00:14:18.667 "rw_mbytes_per_sec": 0, 00:14:18.667 "r_mbytes_per_sec": 0, 00:14:18.667 "w_mbytes_per_sec": 0 00:14:18.667 }, 00:14:18.667 "claimed": true, 00:14:18.667 "claim_type": "exclusive_write", 00:14:18.667 "zoned": false, 00:14:18.667 "supported_io_types": { 00:14:18.667 "read": true, 00:14:18.667 "write": true, 00:14:18.667 "unmap": true, 00:14:18.667 "flush": true, 00:14:18.667 "reset": true, 00:14:18.667 "nvme_admin": false, 00:14:18.667 "nvme_io": false, 00:14:18.667 "nvme_io_md": false, 00:14:18.667 "write_zeroes": true, 00:14:18.667 "zcopy": true, 00:14:18.667 "get_zone_info": false, 00:14:18.667 "zone_management": false, 00:14:18.667 "zone_append": false, 00:14:18.667 "compare": false, 00:14:18.667 "compare_and_write": false, 00:14:18.667 "abort": true, 00:14:18.667 "seek_hole": false, 00:14:18.667 "seek_data": false, 00:14:18.667 "copy": true, 00:14:18.667 "nvme_iov_md": false 00:14:18.667 }, 00:14:18.667 "memory_domains": [ 00:14:18.667 { 00:14:18.667 "dma_device_id": "system", 00:14:18.667 "dma_device_type": 1 00:14:18.667 }, 00:14:18.667 { 00:14:18.667 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:18.667 "dma_device_type": 2 00:14:18.667 } 00:14:18.667 ], 00:14:18.667 "driver_specific": {} 00:14:18.667 }' 00:14:18.667 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:18.667 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:18.667 22:43:03 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:18.667 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:18.925 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:18.925 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:18.925 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:18.925 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:18.925 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:18.925 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.925 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:19.184 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:19.184 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:19.184 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:19.184 22:43:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:19.442 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:19.442 "name": "BaseBdev3", 00:14:19.442 "aliases": [ 00:14:19.442 "4947be0c-249b-4fb1-b8b7-70240185f80d" 00:14:19.442 ], 00:14:19.442 "product_name": "Malloc disk", 00:14:19.442 "block_size": 512, 00:14:19.442 "num_blocks": 65536, 00:14:19.442 "uuid": "4947be0c-249b-4fb1-b8b7-70240185f80d", 00:14:19.442 "assigned_rate_limits": { 00:14:19.442 "rw_ios_per_sec": 0, 00:14:19.442 "rw_mbytes_per_sec": 0, 00:14:19.442 "r_mbytes_per_sec": 0, 00:14:19.442 
"w_mbytes_per_sec": 0 00:14:19.442 }, 00:14:19.442 "claimed": true, 00:14:19.442 "claim_type": "exclusive_write", 00:14:19.442 "zoned": false, 00:14:19.442 "supported_io_types": { 00:14:19.442 "read": true, 00:14:19.442 "write": true, 00:14:19.442 "unmap": true, 00:14:19.442 "flush": true, 00:14:19.442 "reset": true, 00:14:19.442 "nvme_admin": false, 00:14:19.442 "nvme_io": false, 00:14:19.442 "nvme_io_md": false, 00:14:19.442 "write_zeroes": true, 00:14:19.442 "zcopy": true, 00:14:19.442 "get_zone_info": false, 00:14:19.442 "zone_management": false, 00:14:19.442 "zone_append": false, 00:14:19.442 "compare": false, 00:14:19.442 "compare_and_write": false, 00:14:19.442 "abort": true, 00:14:19.442 "seek_hole": false, 00:14:19.442 "seek_data": false, 00:14:19.442 "copy": true, 00:14:19.442 "nvme_iov_md": false 00:14:19.442 }, 00:14:19.442 "memory_domains": [ 00:14:19.442 { 00:14:19.442 "dma_device_id": "system", 00:14:19.442 "dma_device_type": 1 00:14:19.442 }, 00:14:19.442 { 00:14:19.442 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:19.442 "dma_device_type": 2 00:14:19.442 } 00:14:19.442 ], 00:14:19.442 "driver_specific": {} 00:14:19.442 }' 00:14:19.442 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:19.442 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:19.442 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:19.442 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:19.442 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:19.700 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:19.700 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:19.700 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:14:19.700 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:19.700 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:19.700 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:19.700 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:19.700 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:19.960 [2024-07-15 22:43:04.711010] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:19.960 [2024-07-15 22:43:04.711036] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:19.960 [2024-07-15 22:43:04.711075] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:19.960 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:19.960 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:14:19.960 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:19.960 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:14:19.960 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:19.960 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:14:19.960 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:19.960 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:19.960 22:43:04 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:19.960 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:19.960 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:19.960 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:19.960 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:19.960 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:19.960 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:19.960 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:19.960 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:20.218 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:20.218 "name": "Existed_Raid", 00:14:20.218 "uuid": "618e58a6-99b5-4639-a350-e323775bf0c2", 00:14:20.218 "strip_size_kb": 64, 00:14:20.218 "state": "offline", 00:14:20.218 "raid_level": "raid0", 00:14:20.218 "superblock": true, 00:14:20.218 "num_base_bdevs": 3, 00:14:20.218 "num_base_bdevs_discovered": 2, 00:14:20.218 "num_base_bdevs_operational": 2, 00:14:20.218 "base_bdevs_list": [ 00:14:20.218 { 00:14:20.218 "name": null, 00:14:20.218 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:20.218 "is_configured": false, 00:14:20.218 "data_offset": 2048, 00:14:20.218 "data_size": 63488 00:14:20.218 }, 00:14:20.218 { 00:14:20.218 "name": "BaseBdev2", 00:14:20.218 "uuid": "2beb6066-7de2-4b16-b027-1cf9b706668f", 00:14:20.218 "is_configured": true, 00:14:20.218 "data_offset": 2048, 00:14:20.218 "data_size": 
63488 00:14:20.218 }, 00:14:20.218 { 00:14:20.218 "name": "BaseBdev3", 00:14:20.218 "uuid": "4947be0c-249b-4fb1-b8b7-70240185f80d", 00:14:20.218 "is_configured": true, 00:14:20.218 "data_offset": 2048, 00:14:20.218 "data_size": 63488 00:14:20.218 } 00:14:20.218 ] 00:14:20.218 }' 00:14:20.218 22:43:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:20.218 22:43:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:21.152 22:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:21.152 22:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:21.152 22:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:21.152 22:43:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:21.410 22:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:21.410 22:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:21.410 22:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:21.668 [2024-07-15 22:43:06.332319] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:21.668 22:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:21.668 22:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:21.668 22:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:14:21.668 22:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:22.071 22:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:22.071 22:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:22.071 22:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:22.071 [2024-07-15 22:43:06.848394] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:22.071 [2024-07-15 22:43:06.848435] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23f3400 name Existed_Raid, state offline 00:14:22.071 22:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:22.071 22:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:22.071 22:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:22.071 22:43:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:22.329 22:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:22.329 22:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:22.329 22:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:22.329 22:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:22.329 22:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:22.329 22:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:22.587 BaseBdev2 00:14:22.587 22:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:22.587 22:43:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:22.587 22:43:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:22.587 22:43:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:22.587 22:43:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:22.587 22:43:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:22.587 22:43:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:22.845 22:43:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:23.103 [ 00:14:23.103 { 00:14:23.103 "name": "BaseBdev2", 00:14:23.103 "aliases": [ 00:14:23.103 "496c9493-f619-4841-bded-f31a46c0faba" 00:14:23.103 ], 00:14:23.103 "product_name": "Malloc disk", 00:14:23.103 "block_size": 512, 00:14:23.103 "num_blocks": 65536, 00:14:23.103 "uuid": "496c9493-f619-4841-bded-f31a46c0faba", 00:14:23.103 "assigned_rate_limits": { 00:14:23.103 "rw_ios_per_sec": 0, 00:14:23.103 "rw_mbytes_per_sec": 0, 00:14:23.103 "r_mbytes_per_sec": 0, 00:14:23.103 "w_mbytes_per_sec": 0 00:14:23.103 }, 00:14:23.103 "claimed": false, 00:14:23.103 "zoned": false, 00:14:23.103 "supported_io_types": { 00:14:23.103 "read": true, 00:14:23.103 "write": true, 00:14:23.103 "unmap": true, 00:14:23.103 "flush": 
true, 00:14:23.103 "reset": true, 00:14:23.103 "nvme_admin": false, 00:14:23.103 "nvme_io": false, 00:14:23.103 "nvme_io_md": false, 00:14:23.103 "write_zeroes": true, 00:14:23.103 "zcopy": true, 00:14:23.103 "get_zone_info": false, 00:14:23.103 "zone_management": false, 00:14:23.103 "zone_append": false, 00:14:23.103 "compare": false, 00:14:23.103 "compare_and_write": false, 00:14:23.103 "abort": true, 00:14:23.103 "seek_hole": false, 00:14:23.103 "seek_data": false, 00:14:23.103 "copy": true, 00:14:23.103 "nvme_iov_md": false 00:14:23.103 }, 00:14:23.103 "memory_domains": [ 00:14:23.103 { 00:14:23.103 "dma_device_id": "system", 00:14:23.103 "dma_device_type": 1 00:14:23.103 }, 00:14:23.103 { 00:14:23.103 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:23.104 "dma_device_type": 2 00:14:23.104 } 00:14:23.104 ], 00:14:23.104 "driver_specific": {} 00:14:23.104 } 00:14:23.104 ] 00:14:23.104 22:43:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:23.104 22:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:23.104 22:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:23.104 22:43:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:23.362 BaseBdev3 00:14:23.362 22:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:23.362 22:43:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:23.362 22:43:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:23.362 22:43:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:23.362 22:43:08 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:23.362 22:43:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:23.362 22:43:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:23.620 22:43:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:23.620 [ 00:14:23.620 { 00:14:23.620 "name": "BaseBdev3", 00:14:23.620 "aliases": [ 00:14:23.620 "192a003d-7679-4529-9302-f03dffc09730" 00:14:23.620 ], 00:14:23.620 "product_name": "Malloc disk", 00:14:23.620 "block_size": 512, 00:14:23.620 "num_blocks": 65536, 00:14:23.620 "uuid": "192a003d-7679-4529-9302-f03dffc09730", 00:14:23.620 "assigned_rate_limits": { 00:14:23.620 "rw_ios_per_sec": 0, 00:14:23.620 "rw_mbytes_per_sec": 0, 00:14:23.620 "r_mbytes_per_sec": 0, 00:14:23.620 "w_mbytes_per_sec": 0 00:14:23.620 }, 00:14:23.620 "claimed": false, 00:14:23.620 "zoned": false, 00:14:23.620 "supported_io_types": { 00:14:23.620 "read": true, 00:14:23.620 "write": true, 00:14:23.620 "unmap": true, 00:14:23.620 "flush": true, 00:14:23.620 "reset": true, 00:14:23.620 "nvme_admin": false, 00:14:23.620 "nvme_io": false, 00:14:23.620 "nvme_io_md": false, 00:14:23.620 "write_zeroes": true, 00:14:23.620 "zcopy": true, 00:14:23.620 "get_zone_info": false, 00:14:23.620 "zone_management": false, 00:14:23.620 "zone_append": false, 00:14:23.620 "compare": false, 00:14:23.620 "compare_and_write": false, 00:14:23.620 "abort": true, 00:14:23.620 "seek_hole": false, 00:14:23.620 "seek_data": false, 00:14:23.620 "copy": true, 00:14:23.620 "nvme_iov_md": false 00:14:23.620 }, 00:14:23.620 "memory_domains": [ 00:14:23.620 { 00:14:23.620 "dma_device_id": "system", 00:14:23.620 "dma_device_type": 1 
00:14:23.621 }, 00:14:23.621 { 00:14:23.621 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:23.621 "dma_device_type": 2 00:14:23.621 } 00:14:23.621 ], 00:14:23.621 "driver_specific": {} 00:14:23.621 } 00:14:23.621 ] 00:14:23.878 22:43:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:23.878 22:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:23.878 22:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:23.878 22:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:23.878 [2024-07-15 22:43:08.762086] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:23.878 [2024-07-15 22:43:08.762128] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:23.878 [2024-07-15 22:43:08.762147] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:23.878 [2024-07-15 22:43:08.763509] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:23.878 22:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:23.878 22:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:23.879 22:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:23.879 22:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:23.879 22:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:23.879 22:43:08 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:23.879 22:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:23.879 22:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:23.879 22:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:23.879 22:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:23.879 22:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:23.879 22:43:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:24.137 22:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:24.137 "name": "Existed_Raid", 00:14:24.137 "uuid": "7e15fb81-7956-4f93-97df-8726a6cc5683", 00:14:24.137 "strip_size_kb": 64, 00:14:24.137 "state": "configuring", 00:14:24.137 "raid_level": "raid0", 00:14:24.137 "superblock": true, 00:14:24.137 "num_base_bdevs": 3, 00:14:24.137 "num_base_bdevs_discovered": 2, 00:14:24.137 "num_base_bdevs_operational": 3, 00:14:24.137 "base_bdevs_list": [ 00:14:24.137 { 00:14:24.137 "name": "BaseBdev1", 00:14:24.137 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:24.137 "is_configured": false, 00:14:24.137 "data_offset": 0, 00:14:24.137 "data_size": 0 00:14:24.137 }, 00:14:24.137 { 00:14:24.137 "name": "BaseBdev2", 00:14:24.137 "uuid": "496c9493-f619-4841-bded-f31a46c0faba", 00:14:24.137 "is_configured": true, 00:14:24.137 "data_offset": 2048, 00:14:24.137 "data_size": 63488 00:14:24.137 }, 00:14:24.137 { 00:14:24.137 "name": "BaseBdev3", 00:14:24.137 "uuid": "192a003d-7679-4529-9302-f03dffc09730", 00:14:24.137 "is_configured": true, 00:14:24.137 "data_offset": 2048, 00:14:24.137 
"data_size": 63488 00:14:24.137 } 00:14:24.137 ] 00:14:24.137 }' 00:14:24.137 22:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:24.137 22:43:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:25.093 22:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:25.093 [2024-07-15 22:43:09.852948] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:25.093 22:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:25.093 22:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:25.093 22:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:25.093 22:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:25.093 22:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:25.093 22:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:25.094 22:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:25.094 22:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:25.094 22:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:25.094 22:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:25.094 22:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:14:25.094 22:43:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:25.352 22:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:25.352 "name": "Existed_Raid", 00:14:25.352 "uuid": "7e15fb81-7956-4f93-97df-8726a6cc5683", 00:14:25.352 "strip_size_kb": 64, 00:14:25.352 "state": "configuring", 00:14:25.352 "raid_level": "raid0", 00:14:25.352 "superblock": true, 00:14:25.352 "num_base_bdevs": 3, 00:14:25.352 "num_base_bdevs_discovered": 1, 00:14:25.352 "num_base_bdevs_operational": 3, 00:14:25.352 "base_bdevs_list": [ 00:14:25.352 { 00:14:25.352 "name": "BaseBdev1", 00:14:25.352 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:25.352 "is_configured": false, 00:14:25.352 "data_offset": 0, 00:14:25.352 "data_size": 0 00:14:25.352 }, 00:14:25.352 { 00:14:25.352 "name": null, 00:14:25.352 "uuid": "496c9493-f619-4841-bded-f31a46c0faba", 00:14:25.352 "is_configured": false, 00:14:25.352 "data_offset": 2048, 00:14:25.352 "data_size": 63488 00:14:25.352 }, 00:14:25.352 { 00:14:25.352 "name": "BaseBdev3", 00:14:25.352 "uuid": "192a003d-7679-4529-9302-f03dffc09730", 00:14:25.352 "is_configured": true, 00:14:25.352 "data_offset": 2048, 00:14:25.352 "data_size": 63488 00:14:25.352 } 00:14:25.352 ] 00:14:25.352 }' 00:14:25.352 22:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:25.352 22:43:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:25.920 22:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:25.920 22:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:26.179 22:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == 
\f\a\l\s\e ]] 00:14:26.179 22:43:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:26.438 [2024-07-15 22:43:11.205124] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:26.438 BaseBdev1 00:14:26.438 22:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:26.438 22:43:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:26.438 22:43:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:26.438 22:43:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:26.438 22:43:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:26.438 22:43:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:26.438 22:43:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:26.697 22:43:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:26.956 [ 00:14:26.956 { 00:14:26.956 "name": "BaseBdev1", 00:14:26.956 "aliases": [ 00:14:26.956 "f5dcdcb1-1540-49e5-88f4-8b2a360b5066" 00:14:26.956 ], 00:14:26.956 "product_name": "Malloc disk", 00:14:26.956 "block_size": 512, 00:14:26.956 "num_blocks": 65536, 00:14:26.956 "uuid": "f5dcdcb1-1540-49e5-88f4-8b2a360b5066", 00:14:26.956 "assigned_rate_limits": { 00:14:26.956 "rw_ios_per_sec": 0, 00:14:26.956 "rw_mbytes_per_sec": 0, 00:14:26.956 "r_mbytes_per_sec": 0, 00:14:26.956 
"w_mbytes_per_sec": 0 00:14:26.956 }, 00:14:26.956 "claimed": true, 00:14:26.956 "claim_type": "exclusive_write", 00:14:26.956 "zoned": false, 00:14:26.956 "supported_io_types": { 00:14:26.956 "read": true, 00:14:26.956 "write": true, 00:14:26.956 "unmap": true, 00:14:26.956 "flush": true, 00:14:26.956 "reset": true, 00:14:26.956 "nvme_admin": false, 00:14:26.956 "nvme_io": false, 00:14:26.956 "nvme_io_md": false, 00:14:26.956 "write_zeroes": true, 00:14:26.956 "zcopy": true, 00:14:26.956 "get_zone_info": false, 00:14:26.956 "zone_management": false, 00:14:26.956 "zone_append": false, 00:14:26.956 "compare": false, 00:14:26.956 "compare_and_write": false, 00:14:26.956 "abort": true, 00:14:26.956 "seek_hole": false, 00:14:26.956 "seek_data": false, 00:14:26.956 "copy": true, 00:14:26.956 "nvme_iov_md": false 00:14:26.956 }, 00:14:26.956 "memory_domains": [ 00:14:26.956 { 00:14:26.956 "dma_device_id": "system", 00:14:26.956 "dma_device_type": 1 00:14:26.956 }, 00:14:26.956 { 00:14:26.956 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:26.956 "dma_device_type": 2 00:14:26.956 } 00:14:26.956 ], 00:14:26.956 "driver_specific": {} 00:14:26.956 } 00:14:26.956 ] 00:14:26.956 22:43:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:26.956 22:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:26.956 22:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:26.956 22:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:26.956 22:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:26.956 22:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:26.956 22:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:14:26.956 22:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:26.956 22:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:26.956 22:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:26.956 22:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:26.956 22:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:26.956 22:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:27.215 22:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:27.215 "name": "Existed_Raid", 00:14:27.215 "uuid": "7e15fb81-7956-4f93-97df-8726a6cc5683", 00:14:27.215 "strip_size_kb": 64, 00:14:27.215 "state": "configuring", 00:14:27.215 "raid_level": "raid0", 00:14:27.215 "superblock": true, 00:14:27.215 "num_base_bdevs": 3, 00:14:27.215 "num_base_bdevs_discovered": 2, 00:14:27.215 "num_base_bdevs_operational": 3, 00:14:27.215 "base_bdevs_list": [ 00:14:27.215 { 00:14:27.215 "name": "BaseBdev1", 00:14:27.215 "uuid": "f5dcdcb1-1540-49e5-88f4-8b2a360b5066", 00:14:27.215 "is_configured": true, 00:14:27.215 "data_offset": 2048, 00:14:27.215 "data_size": 63488 00:14:27.215 }, 00:14:27.215 { 00:14:27.215 "name": null, 00:14:27.215 "uuid": "496c9493-f619-4841-bded-f31a46c0faba", 00:14:27.215 "is_configured": false, 00:14:27.215 "data_offset": 2048, 00:14:27.215 "data_size": 63488 00:14:27.215 }, 00:14:27.215 { 00:14:27.215 "name": "BaseBdev3", 00:14:27.215 "uuid": "192a003d-7679-4529-9302-f03dffc09730", 00:14:27.215 "is_configured": true, 00:14:27.215 "data_offset": 2048, 00:14:27.215 "data_size": 63488 00:14:27.215 } 
00:14:27.215 ] 00:14:27.215 }' 00:14:27.215 22:43:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:27.215 22:43:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:27.784 22:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:27.785 22:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:28.044 22:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:28.044 22:43:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:28.304 [2024-07-15 22:43:13.054071] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:28.304 22:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:28.304 22:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:28.304 22:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:28.304 22:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:28.304 22:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:28.304 22:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:28.304 22:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:28.304 22:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:28.304 
22:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:28.304 22:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:28.304 22:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:28.304 22:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:28.562 22:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:28.562 "name": "Existed_Raid", 00:14:28.562 "uuid": "7e15fb81-7956-4f93-97df-8726a6cc5683", 00:14:28.562 "strip_size_kb": 64, 00:14:28.562 "state": "configuring", 00:14:28.562 "raid_level": "raid0", 00:14:28.562 "superblock": true, 00:14:28.562 "num_base_bdevs": 3, 00:14:28.562 "num_base_bdevs_discovered": 1, 00:14:28.562 "num_base_bdevs_operational": 3, 00:14:28.562 "base_bdevs_list": [ 00:14:28.562 { 00:14:28.562 "name": "BaseBdev1", 00:14:28.562 "uuid": "f5dcdcb1-1540-49e5-88f4-8b2a360b5066", 00:14:28.562 "is_configured": true, 00:14:28.562 "data_offset": 2048, 00:14:28.562 "data_size": 63488 00:14:28.562 }, 00:14:28.562 { 00:14:28.562 "name": null, 00:14:28.562 "uuid": "496c9493-f619-4841-bded-f31a46c0faba", 00:14:28.562 "is_configured": false, 00:14:28.562 "data_offset": 2048, 00:14:28.562 "data_size": 63488 00:14:28.562 }, 00:14:28.562 { 00:14:28.562 "name": null, 00:14:28.562 "uuid": "192a003d-7679-4529-9302-f03dffc09730", 00:14:28.562 "is_configured": false, 00:14:28.562 "data_offset": 2048, 00:14:28.562 "data_size": 63488 00:14:28.562 } 00:14:28.562 ] 00:14:28.562 }' 00:14:28.562 22:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:28.562 22:43:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:29.132 22:43:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:29.132 22:43:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:29.401 22:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:29.401 22:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:29.660 [2024-07-15 22:43:14.417693] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:29.660 22:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:29.660 22:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:29.660 22:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:29.660 22:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:29.660 22:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:29.660 22:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:29.660 22:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:29.660 22:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:29.660 22:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:29.660 22:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:29.660 22:43:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:29.660 22:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:29.919 22:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:29.919 "name": "Existed_Raid", 00:14:29.919 "uuid": "7e15fb81-7956-4f93-97df-8726a6cc5683", 00:14:29.919 "strip_size_kb": 64, 00:14:29.919 "state": "configuring", 00:14:29.919 "raid_level": "raid0", 00:14:29.919 "superblock": true, 00:14:29.919 "num_base_bdevs": 3, 00:14:29.919 "num_base_bdevs_discovered": 2, 00:14:29.919 "num_base_bdevs_operational": 3, 00:14:29.919 "base_bdevs_list": [ 00:14:29.919 { 00:14:29.919 "name": "BaseBdev1", 00:14:29.919 "uuid": "f5dcdcb1-1540-49e5-88f4-8b2a360b5066", 00:14:29.919 "is_configured": true, 00:14:29.919 "data_offset": 2048, 00:14:29.919 "data_size": 63488 00:14:29.919 }, 00:14:29.919 { 00:14:29.919 "name": null, 00:14:29.919 "uuid": "496c9493-f619-4841-bded-f31a46c0faba", 00:14:29.919 "is_configured": false, 00:14:29.919 "data_offset": 2048, 00:14:29.919 "data_size": 63488 00:14:29.919 }, 00:14:29.919 { 00:14:29.919 "name": "BaseBdev3", 00:14:29.919 "uuid": "192a003d-7679-4529-9302-f03dffc09730", 00:14:29.919 "is_configured": true, 00:14:29.919 "data_offset": 2048, 00:14:29.919 "data_size": 63488 00:14:29.919 } 00:14:29.919 ] 00:14:29.919 }' 00:14:29.919 22:43:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:29.919 22:43:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:30.487 22:43:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:30.487 22:43:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:30.746 22:43:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:30.746 22:43:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:31.005 [2024-07-15 22:43:15.777343] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:31.005 22:43:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:31.005 22:43:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:31.005 22:43:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:31.005 22:43:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:31.005 22:43:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:31.005 22:43:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:31.005 22:43:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:31.005 22:43:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:31.005 22:43:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:31.005 22:43:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:31.005 22:43:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:31.005 22:43:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:31.264 22:43:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:31.264 "name": "Existed_Raid", 00:14:31.264 "uuid": "7e15fb81-7956-4f93-97df-8726a6cc5683", 00:14:31.264 "strip_size_kb": 64, 00:14:31.264 "state": "configuring", 00:14:31.264 "raid_level": "raid0", 00:14:31.264 "superblock": true, 00:14:31.264 "num_base_bdevs": 3, 00:14:31.264 "num_base_bdevs_discovered": 1, 00:14:31.264 "num_base_bdevs_operational": 3, 00:14:31.264 "base_bdevs_list": [ 00:14:31.264 { 00:14:31.264 "name": null, 00:14:31.264 "uuid": "f5dcdcb1-1540-49e5-88f4-8b2a360b5066", 00:14:31.264 "is_configured": false, 00:14:31.265 "data_offset": 2048, 00:14:31.265 "data_size": 63488 00:14:31.265 }, 00:14:31.265 { 00:14:31.265 "name": null, 00:14:31.265 "uuid": "496c9493-f619-4841-bded-f31a46c0faba", 00:14:31.265 "is_configured": false, 00:14:31.265 "data_offset": 2048, 00:14:31.265 "data_size": 63488 00:14:31.265 }, 00:14:31.265 { 00:14:31.265 "name": "BaseBdev3", 00:14:31.265 "uuid": "192a003d-7679-4529-9302-f03dffc09730", 00:14:31.265 "is_configured": true, 00:14:31.265 "data_offset": 2048, 00:14:31.265 "data_size": 63488 00:14:31.265 } 00:14:31.265 ] 00:14:31.265 }' 00:14:31.265 22:43:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:31.265 22:43:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:31.834 22:43:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:31.834 22:43:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:32.093 22:43:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:32.093 22:43:16 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:32.353 [2024-07-15 22:43:17.151629] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:32.353 22:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:32.353 22:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:32.353 22:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:32.353 22:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:32.353 22:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:32.353 22:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:32.353 22:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:32.353 22:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:32.353 22:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:32.353 22:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:32.353 22:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:32.353 22:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:32.612 22:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:32.612 "name": 
"Existed_Raid", 00:14:32.612 "uuid": "7e15fb81-7956-4f93-97df-8726a6cc5683", 00:14:32.612 "strip_size_kb": 64, 00:14:32.612 "state": "configuring", 00:14:32.612 "raid_level": "raid0", 00:14:32.612 "superblock": true, 00:14:32.612 "num_base_bdevs": 3, 00:14:32.612 "num_base_bdevs_discovered": 2, 00:14:32.612 "num_base_bdevs_operational": 3, 00:14:32.612 "base_bdevs_list": [ 00:14:32.612 { 00:14:32.612 "name": null, 00:14:32.612 "uuid": "f5dcdcb1-1540-49e5-88f4-8b2a360b5066", 00:14:32.612 "is_configured": false, 00:14:32.612 "data_offset": 2048, 00:14:32.612 "data_size": 63488 00:14:32.612 }, 00:14:32.612 { 00:14:32.612 "name": "BaseBdev2", 00:14:32.612 "uuid": "496c9493-f619-4841-bded-f31a46c0faba", 00:14:32.612 "is_configured": true, 00:14:32.612 "data_offset": 2048, 00:14:32.612 "data_size": 63488 00:14:32.612 }, 00:14:32.612 { 00:14:32.612 "name": "BaseBdev3", 00:14:32.612 "uuid": "192a003d-7679-4529-9302-f03dffc09730", 00:14:32.612 "is_configured": true, 00:14:32.612 "data_offset": 2048, 00:14:32.612 "data_size": 63488 00:14:32.612 } 00:14:32.612 ] 00:14:32.612 }' 00:14:32.612 22:43:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:32.612 22:43:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:33.180 22:43:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:33.180 22:43:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:33.439 22:43:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:33.439 22:43:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:33.439 22:43:18 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:33.698 22:43:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u f5dcdcb1-1540-49e5-88f4-8b2a360b5066 00:14:33.957 [2024-07-15 22:43:18.664186] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:33.957 [2024-07-15 22:43:18.664343] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x23f1e90 00:14:33.957 [2024-07-15 22:43:18.664356] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:33.957 [2024-07-15 22:43:18.664529] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20f8940 00:14:33.957 [2024-07-15 22:43:18.664641] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23f1e90 00:14:33.957 [2024-07-15 22:43:18.664651] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x23f1e90 00:14:33.957 [2024-07-15 22:43:18.664739] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:33.957 NewBaseBdev 00:14:33.957 22:43:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:33.957 22:43:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:33.957 22:43:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:33.957 22:43:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:33.957 22:43:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:33.957 22:43:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:33.957 22:43:18 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:34.216 22:43:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:34.476 [ 00:14:34.476 { 00:14:34.476 "name": "NewBaseBdev", 00:14:34.476 "aliases": [ 00:14:34.476 "f5dcdcb1-1540-49e5-88f4-8b2a360b5066" 00:14:34.476 ], 00:14:34.476 "product_name": "Malloc disk", 00:14:34.476 "block_size": 512, 00:14:34.476 "num_blocks": 65536, 00:14:34.476 "uuid": "f5dcdcb1-1540-49e5-88f4-8b2a360b5066", 00:14:34.476 "assigned_rate_limits": { 00:14:34.476 "rw_ios_per_sec": 0, 00:14:34.476 "rw_mbytes_per_sec": 0, 00:14:34.476 "r_mbytes_per_sec": 0, 00:14:34.476 "w_mbytes_per_sec": 0 00:14:34.476 }, 00:14:34.476 "claimed": true, 00:14:34.476 "claim_type": "exclusive_write", 00:14:34.476 "zoned": false, 00:14:34.476 "supported_io_types": { 00:14:34.476 "read": true, 00:14:34.476 "write": true, 00:14:34.476 "unmap": true, 00:14:34.476 "flush": true, 00:14:34.476 "reset": true, 00:14:34.476 "nvme_admin": false, 00:14:34.476 "nvme_io": false, 00:14:34.476 "nvme_io_md": false, 00:14:34.476 "write_zeroes": true, 00:14:34.476 "zcopy": true, 00:14:34.476 "get_zone_info": false, 00:14:34.476 "zone_management": false, 00:14:34.476 "zone_append": false, 00:14:34.476 "compare": false, 00:14:34.476 "compare_and_write": false, 00:14:34.476 "abort": true, 00:14:34.476 "seek_hole": false, 00:14:34.476 "seek_data": false, 00:14:34.476 "copy": true, 00:14:34.476 "nvme_iov_md": false 00:14:34.476 }, 00:14:34.476 "memory_domains": [ 00:14:34.476 { 00:14:34.476 "dma_device_id": "system", 00:14:34.476 "dma_device_type": 1 00:14:34.476 }, 00:14:34.476 { 00:14:34.476 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:34.476 "dma_device_type": 2 00:14:34.476 } 
00:14:34.476 ], 00:14:34.476 "driver_specific": {} 00:14:34.476 } 00:14:34.476 ] 00:14:34.476 22:43:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:34.476 22:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:34.476 22:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:34.476 22:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:34.476 22:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:34.476 22:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:34.476 22:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:34.476 22:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:34.476 22:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:34.476 22:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:34.476 22:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:34.476 22:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:34.476 22:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:34.736 22:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:34.736 "name": "Existed_Raid", 00:14:34.736 "uuid": "7e15fb81-7956-4f93-97df-8726a6cc5683", 00:14:34.736 "strip_size_kb": 64, 00:14:34.736 "state": "online", 00:14:34.736 
"raid_level": "raid0", 00:14:34.736 "superblock": true, 00:14:34.736 "num_base_bdevs": 3, 00:14:34.736 "num_base_bdevs_discovered": 3, 00:14:34.736 "num_base_bdevs_operational": 3, 00:14:34.736 "base_bdevs_list": [ 00:14:34.736 { 00:14:34.736 "name": "NewBaseBdev", 00:14:34.736 "uuid": "f5dcdcb1-1540-49e5-88f4-8b2a360b5066", 00:14:34.736 "is_configured": true, 00:14:34.736 "data_offset": 2048, 00:14:34.736 "data_size": 63488 00:14:34.736 }, 00:14:34.736 { 00:14:34.736 "name": "BaseBdev2", 00:14:34.736 "uuid": "496c9493-f619-4841-bded-f31a46c0faba", 00:14:34.736 "is_configured": true, 00:14:34.736 "data_offset": 2048, 00:14:34.736 "data_size": 63488 00:14:34.736 }, 00:14:34.736 { 00:14:34.736 "name": "BaseBdev3", 00:14:34.736 "uuid": "192a003d-7679-4529-9302-f03dffc09730", 00:14:34.736 "is_configured": true, 00:14:34.736 "data_offset": 2048, 00:14:34.736 "data_size": 63488 00:14:34.736 } 00:14:34.736 ] 00:14:34.736 }' 00:14:34.736 22:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:34.736 22:43:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:35.305 22:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:35.305 22:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:35.305 22:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:35.305 22:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:35.305 22:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:35.305 22:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:35.305 22:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:35.305 22:43:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:35.565 [2024-07-15 22:43:20.216632] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:35.565 22:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:35.565 "name": "Existed_Raid", 00:14:35.565 "aliases": [ 00:14:35.565 "7e15fb81-7956-4f93-97df-8726a6cc5683" 00:14:35.565 ], 00:14:35.565 "product_name": "Raid Volume", 00:14:35.565 "block_size": 512, 00:14:35.565 "num_blocks": 190464, 00:14:35.565 "uuid": "7e15fb81-7956-4f93-97df-8726a6cc5683", 00:14:35.565 "assigned_rate_limits": { 00:14:35.565 "rw_ios_per_sec": 0, 00:14:35.565 "rw_mbytes_per_sec": 0, 00:14:35.565 "r_mbytes_per_sec": 0, 00:14:35.565 "w_mbytes_per_sec": 0 00:14:35.565 }, 00:14:35.565 "claimed": false, 00:14:35.565 "zoned": false, 00:14:35.565 "supported_io_types": { 00:14:35.565 "read": true, 00:14:35.565 "write": true, 00:14:35.565 "unmap": true, 00:14:35.565 "flush": true, 00:14:35.565 "reset": true, 00:14:35.565 "nvme_admin": false, 00:14:35.565 "nvme_io": false, 00:14:35.565 "nvme_io_md": false, 00:14:35.565 "write_zeroes": true, 00:14:35.565 "zcopy": false, 00:14:35.565 "get_zone_info": false, 00:14:35.565 "zone_management": false, 00:14:35.565 "zone_append": false, 00:14:35.565 "compare": false, 00:14:35.565 "compare_and_write": false, 00:14:35.565 "abort": false, 00:14:35.565 "seek_hole": false, 00:14:35.565 "seek_data": false, 00:14:35.565 "copy": false, 00:14:35.565 "nvme_iov_md": false 00:14:35.565 }, 00:14:35.565 "memory_domains": [ 00:14:35.565 { 00:14:35.565 "dma_device_id": "system", 00:14:35.565 "dma_device_type": 1 00:14:35.565 }, 00:14:35.565 { 00:14:35.565 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:35.565 "dma_device_type": 2 00:14:35.565 }, 00:14:35.565 { 00:14:35.565 "dma_device_id": "system", 00:14:35.565 "dma_device_type": 1 00:14:35.565 
}, 00:14:35.565 { 00:14:35.565 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:35.565 "dma_device_type": 2 00:14:35.565 }, 00:14:35.565 { 00:14:35.565 "dma_device_id": "system", 00:14:35.565 "dma_device_type": 1 00:14:35.565 }, 00:14:35.565 { 00:14:35.565 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:35.565 "dma_device_type": 2 00:14:35.565 } 00:14:35.565 ], 00:14:35.565 "driver_specific": { 00:14:35.565 "raid": { 00:14:35.565 "uuid": "7e15fb81-7956-4f93-97df-8726a6cc5683", 00:14:35.565 "strip_size_kb": 64, 00:14:35.565 "state": "online", 00:14:35.565 "raid_level": "raid0", 00:14:35.565 "superblock": true, 00:14:35.565 "num_base_bdevs": 3, 00:14:35.565 "num_base_bdevs_discovered": 3, 00:14:35.565 "num_base_bdevs_operational": 3, 00:14:35.565 "base_bdevs_list": [ 00:14:35.565 { 00:14:35.565 "name": "NewBaseBdev", 00:14:35.565 "uuid": "f5dcdcb1-1540-49e5-88f4-8b2a360b5066", 00:14:35.565 "is_configured": true, 00:14:35.565 "data_offset": 2048, 00:14:35.565 "data_size": 63488 00:14:35.565 }, 00:14:35.565 { 00:14:35.565 "name": "BaseBdev2", 00:14:35.565 "uuid": "496c9493-f619-4841-bded-f31a46c0faba", 00:14:35.565 "is_configured": true, 00:14:35.565 "data_offset": 2048, 00:14:35.565 "data_size": 63488 00:14:35.565 }, 00:14:35.565 { 00:14:35.565 "name": "BaseBdev3", 00:14:35.565 "uuid": "192a003d-7679-4529-9302-f03dffc09730", 00:14:35.565 "is_configured": true, 00:14:35.565 "data_offset": 2048, 00:14:35.565 "data_size": 63488 00:14:35.565 } 00:14:35.565 ] 00:14:35.565 } 00:14:35.565 } 00:14:35.565 }' 00:14:35.565 22:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:35.565 22:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:35.565 BaseBdev2 00:14:35.565 BaseBdev3' 00:14:35.565 22:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:35.565 
22:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:35.565 22:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:35.825 22:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:35.825 "name": "NewBaseBdev", 00:14:35.825 "aliases": [ 00:14:35.825 "f5dcdcb1-1540-49e5-88f4-8b2a360b5066" 00:14:35.825 ], 00:14:35.825 "product_name": "Malloc disk", 00:14:35.825 "block_size": 512, 00:14:35.825 "num_blocks": 65536, 00:14:35.825 "uuid": "f5dcdcb1-1540-49e5-88f4-8b2a360b5066", 00:14:35.825 "assigned_rate_limits": { 00:14:35.825 "rw_ios_per_sec": 0, 00:14:35.825 "rw_mbytes_per_sec": 0, 00:14:35.825 "r_mbytes_per_sec": 0, 00:14:35.825 "w_mbytes_per_sec": 0 00:14:35.825 }, 00:14:35.825 "claimed": true, 00:14:35.825 "claim_type": "exclusive_write", 00:14:35.825 "zoned": false, 00:14:35.825 "supported_io_types": { 00:14:35.825 "read": true, 00:14:35.825 "write": true, 00:14:35.825 "unmap": true, 00:14:35.825 "flush": true, 00:14:35.825 "reset": true, 00:14:35.825 "nvme_admin": false, 00:14:35.825 "nvme_io": false, 00:14:35.825 "nvme_io_md": false, 00:14:35.825 "write_zeroes": true, 00:14:35.825 "zcopy": true, 00:14:35.825 "get_zone_info": false, 00:14:35.825 "zone_management": false, 00:14:35.825 "zone_append": false, 00:14:35.825 "compare": false, 00:14:35.825 "compare_and_write": false, 00:14:35.825 "abort": true, 00:14:35.825 "seek_hole": false, 00:14:35.825 "seek_data": false, 00:14:35.825 "copy": true, 00:14:35.825 "nvme_iov_md": false 00:14:35.825 }, 00:14:35.825 "memory_domains": [ 00:14:35.825 { 00:14:35.825 "dma_device_id": "system", 00:14:35.825 "dma_device_type": 1 00:14:35.825 }, 00:14:35.825 { 00:14:35.825 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:35.825 "dma_device_type": 2 00:14:35.825 } 00:14:35.825 ], 00:14:35.825 
"driver_specific": {} 00:14:35.825 }' 00:14:35.825 22:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:35.825 22:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:35.825 22:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:35.825 22:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:35.825 22:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:36.085 22:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:36.085 22:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:36.085 22:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:36.085 22:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:36.085 22:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:36.085 22:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:36.085 22:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:36.085 22:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:36.085 22:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:36.085 22:43:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:36.344 22:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:36.344 "name": "BaseBdev2", 00:14:36.344 "aliases": [ 00:14:36.344 "496c9493-f619-4841-bded-f31a46c0faba" 00:14:36.344 ], 00:14:36.344 "product_name": 
"Malloc disk", 00:14:36.344 "block_size": 512, 00:14:36.344 "num_blocks": 65536, 00:14:36.344 "uuid": "496c9493-f619-4841-bded-f31a46c0faba", 00:14:36.344 "assigned_rate_limits": { 00:14:36.344 "rw_ios_per_sec": 0, 00:14:36.344 "rw_mbytes_per_sec": 0, 00:14:36.344 "r_mbytes_per_sec": 0, 00:14:36.344 "w_mbytes_per_sec": 0 00:14:36.344 }, 00:14:36.344 "claimed": true, 00:14:36.344 "claim_type": "exclusive_write", 00:14:36.344 "zoned": false, 00:14:36.344 "supported_io_types": { 00:14:36.344 "read": true, 00:14:36.344 "write": true, 00:14:36.344 "unmap": true, 00:14:36.344 "flush": true, 00:14:36.344 "reset": true, 00:14:36.344 "nvme_admin": false, 00:14:36.344 "nvme_io": false, 00:14:36.344 "nvme_io_md": false, 00:14:36.344 "write_zeroes": true, 00:14:36.344 "zcopy": true, 00:14:36.344 "get_zone_info": false, 00:14:36.344 "zone_management": false, 00:14:36.344 "zone_append": false, 00:14:36.344 "compare": false, 00:14:36.344 "compare_and_write": false, 00:14:36.344 "abort": true, 00:14:36.344 "seek_hole": false, 00:14:36.344 "seek_data": false, 00:14:36.344 "copy": true, 00:14:36.344 "nvme_iov_md": false 00:14:36.344 }, 00:14:36.344 "memory_domains": [ 00:14:36.344 { 00:14:36.344 "dma_device_id": "system", 00:14:36.344 "dma_device_type": 1 00:14:36.344 }, 00:14:36.344 { 00:14:36.344 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:36.344 "dma_device_type": 2 00:14:36.344 } 00:14:36.344 ], 00:14:36.344 "driver_specific": {} 00:14:36.344 }' 00:14:36.344 22:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:36.344 22:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:36.603 22:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:36.603 22:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:36.603 22:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:36.603 
22:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:36.603 22:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:36.603 22:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:36.861 22:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:36.861 22:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:36.861 22:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:36.861 22:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:36.861 22:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:36.861 22:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:36.861 22:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:37.121 22:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:37.121 "name": "BaseBdev3", 00:14:37.121 "aliases": [ 00:14:37.121 "192a003d-7679-4529-9302-f03dffc09730" 00:14:37.121 ], 00:14:37.121 "product_name": "Malloc disk", 00:14:37.121 "block_size": 512, 00:14:37.121 "num_blocks": 65536, 00:14:37.121 "uuid": "192a003d-7679-4529-9302-f03dffc09730", 00:14:37.121 "assigned_rate_limits": { 00:14:37.121 "rw_ios_per_sec": 0, 00:14:37.121 "rw_mbytes_per_sec": 0, 00:14:37.121 "r_mbytes_per_sec": 0, 00:14:37.121 "w_mbytes_per_sec": 0 00:14:37.121 }, 00:14:37.121 "claimed": true, 00:14:37.121 "claim_type": "exclusive_write", 00:14:37.121 "zoned": false, 00:14:37.121 "supported_io_types": { 00:14:37.121 "read": true, 00:14:37.121 "write": true, 00:14:37.121 "unmap": true, 
00:14:37.121 "flush": true, 00:14:37.121 "reset": true, 00:14:37.121 "nvme_admin": false, 00:14:37.121 "nvme_io": false, 00:14:37.121 "nvme_io_md": false, 00:14:37.121 "write_zeroes": true, 00:14:37.121 "zcopy": true, 00:14:37.121 "get_zone_info": false, 00:14:37.121 "zone_management": false, 00:14:37.121 "zone_append": false, 00:14:37.121 "compare": false, 00:14:37.121 "compare_and_write": false, 00:14:37.121 "abort": true, 00:14:37.121 "seek_hole": false, 00:14:37.121 "seek_data": false, 00:14:37.121 "copy": true, 00:14:37.121 "nvme_iov_md": false 00:14:37.121 }, 00:14:37.121 "memory_domains": [ 00:14:37.121 { 00:14:37.122 "dma_device_id": "system", 00:14:37.122 "dma_device_type": 1 00:14:37.122 }, 00:14:37.122 { 00:14:37.122 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:37.122 "dma_device_type": 2 00:14:37.122 } 00:14:37.122 ], 00:14:37.122 "driver_specific": {} 00:14:37.122 }' 00:14:37.122 22:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:37.122 22:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:37.122 22:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:37.122 22:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:37.122 22:43:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:37.379 22:43:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:37.379 22:43:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:37.379 22:43:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:37.379 22:43:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:37.379 22:43:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:37.379 22:43:22 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:37.638 22:43:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:37.638 22:43:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:37.638 [2024-07-15 22:43:22.526489] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:37.638 [2024-07-15 22:43:22.526515] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:37.638 [2024-07-15 22:43:22.526571] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:37.638 [2024-07-15 22:43:22.526623] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:37.638 [2024-07-15 22:43:22.526635] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23f1e90 name Existed_Raid, state offline 00:14:37.638 22:43:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2723670 00:14:37.638 22:43:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2723670 ']' 00:14:37.638 22:43:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2723670 00:14:37.896 22:43:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:14:37.897 22:43:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:37.897 22:43:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2723670 00:14:37.897 22:43:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:37.897 22:43:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:14:37.897 22:43:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2723670' 00:14:37.897 killing process with pid 2723670 00:14:37.897 22:43:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2723670 00:14:37.897 [2024-07-15 22:43:22.610963] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:37.897 22:43:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2723670 00:14:37.897 [2024-07-15 22:43:22.637447] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:38.155 22:43:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:38.155 00:14:38.155 real 0m29.173s 00:14:38.155 user 0m53.504s 00:14:38.155 sys 0m5.226s 00:14:38.155 22:43:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:38.155 22:43:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:38.155 ************************************ 00:14:38.155 END TEST raid_state_function_test_sb 00:14:38.155 ************************************ 00:14:38.155 22:43:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:38.155 22:43:22 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:14:38.155 22:43:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:14:38.155 22:43:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:38.155 22:43:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:38.155 ************************************ 00:14:38.155 START TEST raid_superblock_test 00:14:38.155 ************************************ 00:14:38.155 22:43:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 3 00:14:38.155 22:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # 
local raid_level=raid0 00:14:38.155 22:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:14:38.155 22:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:14:38.155 22:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:14:38.155 22:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:14:38.155 22:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:14:38.155 22:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:14:38.155 22:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:14:38.155 22:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:14:38.155 22:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:14:38.155 22:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:14:38.155 22:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:14:38.155 22:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:14:38.155 22:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:14:38.155 22:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:14:38.155 22:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:14:38.155 22:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2728013 00:14:38.155 22:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:14:38.156 22:43:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 
2728013 /var/tmp/spdk-raid.sock 00:14:38.156 22:43:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2728013 ']' 00:14:38.156 22:43:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:38.156 22:43:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:38.156 22:43:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:38.156 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:38.156 22:43:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:38.156 22:43:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:38.156 [2024-07-15 22:43:22.980714] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:14:38.156 [2024-07-15 22:43:22.980779] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2728013 ] 00:14:38.414 [2024-07-15 22:43:23.111497] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:38.414 [2024-07-15 22:43:23.217231] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:38.414 [2024-07-15 22:43:23.278313] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:38.414 [2024-07-15 22:43:23.278345] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:39.348 22:43:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:39.348 22:43:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:14:39.348 22:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:14:39.348 22:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:39.348 22:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:14:39.348 22:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:14:39.348 22:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:39.348 22:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:39.348 22:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:39.348 22:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:39.348 22:43:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:39.348 malloc1 00:14:39.348 22:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:39.606 [2024-07-15 22:43:24.379440] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:39.606 [2024-07-15 22:43:24.379485] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:39.606 [2024-07-15 22:43:24.379507] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x859570 00:14:39.606 [2024-07-15 22:43:24.379519] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:39.606 [2024-07-15 22:43:24.381186] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:39.606 [2024-07-15 22:43:24.381215] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:39.606 pt1 00:14:39.606 22:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:39.606 22:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:39.606 22:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:14:39.606 22:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:14:39.606 22:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:39.606 22:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:39.606 22:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:39.607 22:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:39.607 22:43:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:39.864 malloc2 00:14:39.864 22:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:40.122 [2024-07-15 22:43:24.926949] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:40.122 [2024-07-15 22:43:24.926995] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:40.122 [2024-07-15 22:43:24.927014] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x85a970 00:14:40.122 [2024-07-15 22:43:24.927026] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:40.122 [2024-07-15 22:43:24.928668] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:40.122 [2024-07-15 22:43:24.928696] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:40.122 pt2 00:14:40.122 22:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:40.122 22:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:40.122 22:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:14:40.122 22:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:14:40.122 22:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:14:40.122 22:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:40.122 22:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:40.122 22:43:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:40.122 22:43:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:14:40.380 malloc3 00:14:40.380 22:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:40.638 [2024-07-15 22:43:25.422099] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:40.638 [2024-07-15 22:43:25.422142] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:40.638 [2024-07-15 22:43:25.422160] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9f1340 00:14:40.638 [2024-07-15 22:43:25.422173] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:40.638 [2024-07-15 22:43:25.423751] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:40.638 [2024-07-15 22:43:25.423780] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:40.638 pt3 00:14:40.638 22:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:40.638 22:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:40.638 22:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:14:40.896 [2024-07-15 22:43:25.654735] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:40.896 [2024-07-15 22:43:25.656054] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:14:40.896 [2024-07-15 22:43:25.656110] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:40.896 [2024-07-15 22:43:25.656261] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x851ea0 00:14:40.896 [2024-07-15 22:43:25.656272] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:40.896 [2024-07-15 22:43:25.656474] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x859240 00:14:40.896 [2024-07-15 22:43:25.656615] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x851ea0 00:14:40.896 [2024-07-15 22:43:25.656626] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x851ea0 00:14:40.896 [2024-07-15 22:43:25.656725] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:40.896 22:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:40.896 22:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:40.896 22:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:40.896 22:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:40.896 22:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:40.896 22:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:40.896 22:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:40.897 22:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:40.897 22:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:40.897 22:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:40.897 22:43:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:40.897 22:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:41.155 22:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:41.155 "name": "raid_bdev1", 00:14:41.155 "uuid": "d5a7afe3-5dea-481d-bc06-d890d1330ba7", 00:14:41.155 "strip_size_kb": 64, 00:14:41.155 "state": "online", 00:14:41.155 "raid_level": "raid0", 00:14:41.155 "superblock": true, 00:14:41.155 "num_base_bdevs": 3, 00:14:41.155 "num_base_bdevs_discovered": 3, 00:14:41.155 "num_base_bdevs_operational": 3, 00:14:41.155 "base_bdevs_list": [ 00:14:41.155 { 00:14:41.155 "name": "pt1", 00:14:41.155 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:41.155 "is_configured": true, 00:14:41.155 "data_offset": 2048, 00:14:41.155 "data_size": 63488 00:14:41.155 }, 00:14:41.155 { 00:14:41.155 "name": "pt2", 00:14:41.155 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:41.155 "is_configured": true, 00:14:41.155 "data_offset": 2048, 00:14:41.155 "data_size": 63488 00:14:41.155 }, 00:14:41.155 { 00:14:41.155 "name": "pt3", 00:14:41.155 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:41.155 "is_configured": true, 00:14:41.155 "data_offset": 2048, 00:14:41.155 "data_size": 63488 00:14:41.155 } 00:14:41.155 ] 00:14:41.155 }' 00:14:41.155 22:43:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:41.155 22:43:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:41.721 22:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:14:41.721 22:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:41.721 22:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 
-- # local raid_bdev_info 00:14:41.721 22:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:41.721 22:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:41.721 22:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:41.721 22:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:41.721 22:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:41.979 [2024-07-15 22:43:26.773933] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:41.979 22:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:41.979 "name": "raid_bdev1", 00:14:41.979 "aliases": [ 00:14:41.979 "d5a7afe3-5dea-481d-bc06-d890d1330ba7" 00:14:41.979 ], 00:14:41.979 "product_name": "Raid Volume", 00:14:41.979 "block_size": 512, 00:14:41.979 "num_blocks": 190464, 00:14:41.979 "uuid": "d5a7afe3-5dea-481d-bc06-d890d1330ba7", 00:14:41.979 "assigned_rate_limits": { 00:14:41.979 "rw_ios_per_sec": 0, 00:14:41.979 "rw_mbytes_per_sec": 0, 00:14:41.979 "r_mbytes_per_sec": 0, 00:14:41.979 "w_mbytes_per_sec": 0 00:14:41.979 }, 00:14:41.979 "claimed": false, 00:14:41.979 "zoned": false, 00:14:41.979 "supported_io_types": { 00:14:41.979 "read": true, 00:14:41.979 "write": true, 00:14:41.979 "unmap": true, 00:14:41.979 "flush": true, 00:14:41.979 "reset": true, 00:14:41.979 "nvme_admin": false, 00:14:41.979 "nvme_io": false, 00:14:41.979 "nvme_io_md": false, 00:14:41.979 "write_zeroes": true, 00:14:41.979 "zcopy": false, 00:14:41.979 "get_zone_info": false, 00:14:41.979 "zone_management": false, 00:14:41.979 "zone_append": false, 00:14:41.979 "compare": false, 00:14:41.979 "compare_and_write": false, 00:14:41.979 "abort": false, 00:14:41.979 "seek_hole": false, 00:14:41.979 
"seek_data": false, 00:14:41.979 "copy": false, 00:14:41.979 "nvme_iov_md": false 00:14:41.979 }, 00:14:41.979 "memory_domains": [ 00:14:41.979 { 00:14:41.979 "dma_device_id": "system", 00:14:41.979 "dma_device_type": 1 00:14:41.979 }, 00:14:41.979 { 00:14:41.979 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.979 "dma_device_type": 2 00:14:41.979 }, 00:14:41.979 { 00:14:41.979 "dma_device_id": "system", 00:14:41.979 "dma_device_type": 1 00:14:41.979 }, 00:14:41.979 { 00:14:41.979 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.979 "dma_device_type": 2 00:14:41.979 }, 00:14:41.979 { 00:14:41.979 "dma_device_id": "system", 00:14:41.979 "dma_device_type": 1 00:14:41.979 }, 00:14:41.979 { 00:14:41.979 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.979 "dma_device_type": 2 00:14:41.979 } 00:14:41.979 ], 00:14:41.979 "driver_specific": { 00:14:41.979 "raid": { 00:14:41.979 "uuid": "d5a7afe3-5dea-481d-bc06-d890d1330ba7", 00:14:41.979 "strip_size_kb": 64, 00:14:41.979 "state": "online", 00:14:41.979 "raid_level": "raid0", 00:14:41.979 "superblock": true, 00:14:41.979 "num_base_bdevs": 3, 00:14:41.979 "num_base_bdevs_discovered": 3, 00:14:41.979 "num_base_bdevs_operational": 3, 00:14:41.979 "base_bdevs_list": [ 00:14:41.979 { 00:14:41.979 "name": "pt1", 00:14:41.979 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:41.979 "is_configured": true, 00:14:41.979 "data_offset": 2048, 00:14:41.979 "data_size": 63488 00:14:41.979 }, 00:14:41.979 { 00:14:41.979 "name": "pt2", 00:14:41.979 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:41.979 "is_configured": true, 00:14:41.979 "data_offset": 2048, 00:14:41.979 "data_size": 63488 00:14:41.979 }, 00:14:41.979 { 00:14:41.979 "name": "pt3", 00:14:41.979 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:41.979 "is_configured": true, 00:14:41.979 "data_offset": 2048, 00:14:41.979 "data_size": 63488 00:14:41.979 } 00:14:41.979 ] 00:14:41.979 } 00:14:41.979 } 00:14:41.979 }' 00:14:41.979 22:43:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:41.979 22:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:41.979 pt2 00:14:41.979 pt3' 00:14:41.979 22:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:41.979 22:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:41.979 22:43:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:42.545 22:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:42.545 "name": "pt1", 00:14:42.545 "aliases": [ 00:14:42.545 "00000000-0000-0000-0000-000000000001" 00:14:42.545 ], 00:14:42.545 "product_name": "passthru", 00:14:42.545 "block_size": 512, 00:14:42.545 "num_blocks": 65536, 00:14:42.545 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:42.545 "assigned_rate_limits": { 00:14:42.545 "rw_ios_per_sec": 0, 00:14:42.545 "rw_mbytes_per_sec": 0, 00:14:42.545 "r_mbytes_per_sec": 0, 00:14:42.545 "w_mbytes_per_sec": 0 00:14:42.545 }, 00:14:42.545 "claimed": true, 00:14:42.545 "claim_type": "exclusive_write", 00:14:42.545 "zoned": false, 00:14:42.545 "supported_io_types": { 00:14:42.545 "read": true, 00:14:42.545 "write": true, 00:14:42.545 "unmap": true, 00:14:42.545 "flush": true, 00:14:42.545 "reset": true, 00:14:42.545 "nvme_admin": false, 00:14:42.545 "nvme_io": false, 00:14:42.545 "nvme_io_md": false, 00:14:42.545 "write_zeroes": true, 00:14:42.545 "zcopy": true, 00:14:42.545 "get_zone_info": false, 00:14:42.545 "zone_management": false, 00:14:42.545 "zone_append": false, 00:14:42.545 "compare": false, 00:14:42.545 "compare_and_write": false, 00:14:42.545 "abort": true, 00:14:42.545 "seek_hole": false, 00:14:42.545 "seek_data": false, 
00:14:42.545 "copy": true, 00:14:42.545 "nvme_iov_md": false 00:14:42.545 }, 00:14:42.545 "memory_domains": [ 00:14:42.545 { 00:14:42.545 "dma_device_id": "system", 00:14:42.545 "dma_device_type": 1 00:14:42.545 }, 00:14:42.545 { 00:14:42.545 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:42.545 "dma_device_type": 2 00:14:42.545 } 00:14:42.545 ], 00:14:42.545 "driver_specific": { 00:14:42.545 "passthru": { 00:14:42.545 "name": "pt1", 00:14:42.545 "base_bdev_name": "malloc1" 00:14:42.545 } 00:14:42.545 } 00:14:42.545 }' 00:14:42.545 22:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:42.803 22:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:42.803 22:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:42.803 22:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:42.803 22:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:42.803 22:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:42.803 22:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:42.803 22:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:43.062 22:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:43.062 22:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:43.062 22:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:43.062 22:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:43.062 22:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:43.062 22:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b pt2 00:14:43.062 22:43:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:43.366 22:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:43.366 "name": "pt2", 00:14:43.366 "aliases": [ 00:14:43.366 "00000000-0000-0000-0000-000000000002" 00:14:43.366 ], 00:14:43.366 "product_name": "passthru", 00:14:43.366 "block_size": 512, 00:14:43.366 "num_blocks": 65536, 00:14:43.366 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:43.366 "assigned_rate_limits": { 00:14:43.366 "rw_ios_per_sec": 0, 00:14:43.366 "rw_mbytes_per_sec": 0, 00:14:43.366 "r_mbytes_per_sec": 0, 00:14:43.366 "w_mbytes_per_sec": 0 00:14:43.366 }, 00:14:43.366 "claimed": true, 00:14:43.366 "claim_type": "exclusive_write", 00:14:43.366 "zoned": false, 00:14:43.366 "supported_io_types": { 00:14:43.366 "read": true, 00:14:43.366 "write": true, 00:14:43.366 "unmap": true, 00:14:43.366 "flush": true, 00:14:43.366 "reset": true, 00:14:43.366 "nvme_admin": false, 00:14:43.366 "nvme_io": false, 00:14:43.366 "nvme_io_md": false, 00:14:43.366 "write_zeroes": true, 00:14:43.366 "zcopy": true, 00:14:43.366 "get_zone_info": false, 00:14:43.366 "zone_management": false, 00:14:43.366 "zone_append": false, 00:14:43.366 "compare": false, 00:14:43.366 "compare_and_write": false, 00:14:43.366 "abort": true, 00:14:43.366 "seek_hole": false, 00:14:43.366 "seek_data": false, 00:14:43.366 "copy": true, 00:14:43.366 "nvme_iov_md": false 00:14:43.366 }, 00:14:43.366 "memory_domains": [ 00:14:43.366 { 00:14:43.366 "dma_device_id": "system", 00:14:43.366 "dma_device_type": 1 00:14:43.366 }, 00:14:43.366 { 00:14:43.366 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:43.366 "dma_device_type": 2 00:14:43.366 } 00:14:43.366 ], 00:14:43.366 "driver_specific": { 00:14:43.366 "passthru": { 00:14:43.366 "name": "pt2", 00:14:43.366 "base_bdev_name": "malloc2" 00:14:43.366 } 00:14:43.366 } 00:14:43.366 }' 00:14:43.366 22:43:28 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:43.366 22:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:43.366 22:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:43.366 22:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:43.634 22:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:43.634 22:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:43.634 22:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:43.634 22:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:43.634 22:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:43.634 22:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:43.634 22:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:43.634 22:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:43.634 22:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:43.634 22:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:43.634 22:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:43.893 22:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:43.893 "name": "pt3", 00:14:43.893 "aliases": [ 00:14:43.893 "00000000-0000-0000-0000-000000000003" 00:14:43.893 ], 00:14:43.893 "product_name": "passthru", 00:14:43.893 "block_size": 512, 00:14:43.893 "num_blocks": 65536, 00:14:43.893 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:43.893 "assigned_rate_limits": { 
00:14:43.893 "rw_ios_per_sec": 0, 00:14:43.893 "rw_mbytes_per_sec": 0, 00:14:43.893 "r_mbytes_per_sec": 0, 00:14:43.893 "w_mbytes_per_sec": 0 00:14:43.893 }, 00:14:43.893 "claimed": true, 00:14:43.893 "claim_type": "exclusive_write", 00:14:43.893 "zoned": false, 00:14:43.893 "supported_io_types": { 00:14:43.893 "read": true, 00:14:43.893 "write": true, 00:14:43.893 "unmap": true, 00:14:43.893 "flush": true, 00:14:43.893 "reset": true, 00:14:43.893 "nvme_admin": false, 00:14:43.893 "nvme_io": false, 00:14:43.893 "nvme_io_md": false, 00:14:43.893 "write_zeroes": true, 00:14:43.893 "zcopy": true, 00:14:43.893 "get_zone_info": false, 00:14:43.893 "zone_management": false, 00:14:43.893 "zone_append": false, 00:14:43.893 "compare": false, 00:14:43.893 "compare_and_write": false, 00:14:43.893 "abort": true, 00:14:43.893 "seek_hole": false, 00:14:43.893 "seek_data": false, 00:14:43.893 "copy": true, 00:14:43.893 "nvme_iov_md": false 00:14:43.893 }, 00:14:43.893 "memory_domains": [ 00:14:43.893 { 00:14:43.893 "dma_device_id": "system", 00:14:43.893 "dma_device_type": 1 00:14:43.893 }, 00:14:43.893 { 00:14:43.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:43.893 "dma_device_type": 2 00:14:43.893 } 00:14:43.893 ], 00:14:43.893 "driver_specific": { 00:14:43.893 "passthru": { 00:14:43.893 "name": "pt3", 00:14:43.893 "base_bdev_name": "malloc3" 00:14:43.893 } 00:14:43.893 } 00:14:43.893 }' 00:14:43.893 22:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:44.152 22:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:44.152 22:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:44.152 22:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:44.152 22:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:44.152 22:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:14:44.152 22:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:44.152 22:43:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:44.152 22:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:44.152 22:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:44.412 22:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:44.412 22:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:44.412 22:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:44.412 22:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:14:44.671 [2024-07-15 22:43:29.344758] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:44.671 22:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=d5a7afe3-5dea-481d-bc06-d890d1330ba7 00:14:44.671 22:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z d5a7afe3-5dea-481d-bc06-d890d1330ba7 ']' 00:14:44.671 22:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:44.931 [2024-07-15 22:43:29.589123] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:44.931 [2024-07-15 22:43:29.589142] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:44.931 [2024-07-15 22:43:29.589191] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:44.931 [2024-07-15 22:43:29.589245] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:14:44.931 [2024-07-15 22:43:29.589257] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x851ea0 name raid_bdev1, state offline 00:14:44.931 22:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:44.931 22:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:14:45.190 22:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:14:45.190 22:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:14:45.190 22:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:45.190 22:43:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:45.190 22:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:45.190 22:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:45.449 22:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:45.449 22:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:45.708 22:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:45.708 22:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:45.967 22:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:14:45.967 22:43:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:45.967 22:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:14:45.967 22:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:45.967 22:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:45.967 22:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:45.967 22:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:45.967 22:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:45.967 22:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:45.967 22:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:45.967 22:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:45.967 22:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:45.967 22:43:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:46.226 [2024-07-15 22:43:31.048924] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:46.226 [2024-07-15 22:43:31.050266] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:46.226 [2024-07-15 22:43:31.050309] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:14:46.226 [2024-07-15 22:43:31.050355] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:46.226 [2024-07-15 22:43:31.050392] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:14:46.226 [2024-07-15 22:43:31.050414] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:14:46.226 [2024-07-15 22:43:31.050432] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:46.226 [2024-07-15 22:43:31.050442] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9fcff0 name raid_bdev1, state configuring 00:14:46.226 request: 00:14:46.226 { 00:14:46.226 "name": "raid_bdev1", 00:14:46.226 "raid_level": "raid0", 00:14:46.226 "base_bdevs": [ 00:14:46.226 "malloc1", 00:14:46.226 "malloc2", 00:14:46.226 "malloc3" 00:14:46.226 ], 00:14:46.226 "strip_size_kb": 64, 00:14:46.226 "superblock": false, 00:14:46.226 "method": "bdev_raid_create", 00:14:46.226 "req_id": 1 00:14:46.226 } 00:14:46.226 Got JSON-RPC error response 00:14:46.226 response: 00:14:46.226 { 00:14:46.226 "code": -17, 00:14:46.226 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:46.226 } 00:14:46.226 22:43:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:14:46.226 22:43:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:14:46.226 22:43:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:14:46.226 22:43:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:14:46.226 22:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.226 22:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:14:46.485 22:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:14:46.485 22:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:14:46.485 22:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:46.745 [2024-07-15 22:43:31.538151] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:46.745 [2024-07-15 22:43:31.538191] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:46.745 [2024-07-15 22:43:31.538217] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8597a0 00:14:46.745 [2024-07-15 22:43:31.538229] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:46.745 [2024-07-15 22:43:31.539864] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:46.745 [2024-07-15 22:43:31.539892] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:46.745 [2024-07-15 22:43:31.539968] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:46.745 [2024-07-15 22:43:31.539995] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:46.745 pt1 00:14:46.745 22:43:31 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:14:46.745 22:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:46.745 22:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:46.745 22:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:46.745 22:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:46.745 22:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:46.745 22:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:46.745 22:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:46.745 22:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:46.745 22:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:46.745 22:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.745 22:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:47.039 22:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:47.039 "name": "raid_bdev1", 00:14:47.039 "uuid": "d5a7afe3-5dea-481d-bc06-d890d1330ba7", 00:14:47.039 "strip_size_kb": 64, 00:14:47.039 "state": "configuring", 00:14:47.039 "raid_level": "raid0", 00:14:47.039 "superblock": true, 00:14:47.039 "num_base_bdevs": 3, 00:14:47.039 "num_base_bdevs_discovered": 1, 00:14:47.039 "num_base_bdevs_operational": 3, 00:14:47.039 "base_bdevs_list": [ 00:14:47.039 { 00:14:47.039 "name": "pt1", 00:14:47.039 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:47.039 
"is_configured": true, 00:14:47.039 "data_offset": 2048, 00:14:47.039 "data_size": 63488 00:14:47.039 }, 00:14:47.039 { 00:14:47.039 "name": null, 00:14:47.039 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:47.039 "is_configured": false, 00:14:47.039 "data_offset": 2048, 00:14:47.039 "data_size": 63488 00:14:47.039 }, 00:14:47.039 { 00:14:47.039 "name": null, 00:14:47.039 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:47.039 "is_configured": false, 00:14:47.039 "data_offset": 2048, 00:14:47.039 "data_size": 63488 00:14:47.039 } 00:14:47.039 ] 00:14:47.039 }' 00:14:47.039 22:43:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:47.039 22:43:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:47.607 22:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:14:47.607 22:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:47.866 [2024-07-15 22:43:32.617020] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:47.866 [2024-07-15 22:43:32.617067] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:47.866 [2024-07-15 22:43:32.617086] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x850c70 00:14:47.866 [2024-07-15 22:43:32.617098] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:47.866 [2024-07-15 22:43:32.617439] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:47.866 [2024-07-15 22:43:32.617457] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:47.866 [2024-07-15 22:43:32.617516] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:47.866 [2024-07-15 
22:43:32.617536] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:47.866 pt2 00:14:47.866 22:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:48.125 [2024-07-15 22:43:32.865678] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:14:48.125 22:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:14:48.125 22:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:48.125 22:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:48.125 22:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:48.125 22:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:48.125 22:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:48.125 22:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:48.125 22:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:48.125 22:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:48.125 22:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:48.125 22:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.125 22:43:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:48.384 22:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:48.384 "name": "raid_bdev1", 00:14:48.384 
"uuid": "d5a7afe3-5dea-481d-bc06-d890d1330ba7", 00:14:48.384 "strip_size_kb": 64, 00:14:48.384 "state": "configuring", 00:14:48.384 "raid_level": "raid0", 00:14:48.384 "superblock": true, 00:14:48.384 "num_base_bdevs": 3, 00:14:48.384 "num_base_bdevs_discovered": 1, 00:14:48.384 "num_base_bdevs_operational": 3, 00:14:48.384 "base_bdevs_list": [ 00:14:48.384 { 00:14:48.384 "name": "pt1", 00:14:48.384 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:48.384 "is_configured": true, 00:14:48.384 "data_offset": 2048, 00:14:48.384 "data_size": 63488 00:14:48.384 }, 00:14:48.384 { 00:14:48.384 "name": null, 00:14:48.384 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:48.384 "is_configured": false, 00:14:48.384 "data_offset": 2048, 00:14:48.384 "data_size": 63488 00:14:48.384 }, 00:14:48.384 { 00:14:48.384 "name": null, 00:14:48.384 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:48.384 "is_configured": false, 00:14:48.384 "data_offset": 2048, 00:14:48.384 "data_size": 63488 00:14:48.384 } 00:14:48.384 ] 00:14:48.384 }' 00:14:48.384 22:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:48.384 22:43:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:48.951 22:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:14:48.951 22:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:48.951 22:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:48.951 [2024-07-15 22:43:33.816204] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:48.951 [2024-07-15 22:43:33.816250] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:48.951 [2024-07-15 22:43:33.816269] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9f1fa0 00:14:48.951 [2024-07-15 22:43:33.816282] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:48.951 [2024-07-15 22:43:33.816611] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:48.951 [2024-07-15 22:43:33.816628] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:48.951 [2024-07-15 22:43:33.816687] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:48.951 [2024-07-15 22:43:33.816707] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:48.951 pt2 00:14:48.951 22:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:48.951 22:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:48.951 22:43:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:49.209 [2024-07-15 22:43:34.064859] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:49.209 [2024-07-15 22:43:34.064891] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:49.209 [2024-07-15 22:43:34.064906] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9f2b30 00:14:49.209 [2024-07-15 22:43:34.064919] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:49.209 [2024-07-15 22:43:34.065194] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:49.209 [2024-07-15 22:43:34.065211] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:49.209 [2024-07-15 22:43:34.065258] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:49.209 
[2024-07-15 22:43:34.065275] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:49.209 [2024-07-15 22:43:34.065375] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9f3c00 00:14:49.209 [2024-07-15 22:43:34.065385] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:49.209 [2024-07-15 22:43:34.065547] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9fc9b0 00:14:49.209 [2024-07-15 22:43:34.065667] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9f3c00 00:14:49.210 [2024-07-15 22:43:34.065677] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x9f3c00 00:14:49.210 [2024-07-15 22:43:34.065769] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:49.210 pt3 00:14:49.210 22:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:49.210 22:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:49.210 22:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:49.210 22:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:49.210 22:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:49.210 22:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:49.210 22:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:49.210 22:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:49.210 22:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:49.210 22:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:49.210 22:43:34 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:49.210 22:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:49.210 22:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.210 22:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:49.468 22:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:49.468 "name": "raid_bdev1", 00:14:49.468 "uuid": "d5a7afe3-5dea-481d-bc06-d890d1330ba7", 00:14:49.468 "strip_size_kb": 64, 00:14:49.468 "state": "online", 00:14:49.468 "raid_level": "raid0", 00:14:49.468 "superblock": true, 00:14:49.468 "num_base_bdevs": 3, 00:14:49.468 "num_base_bdevs_discovered": 3, 00:14:49.468 "num_base_bdevs_operational": 3, 00:14:49.468 "base_bdevs_list": [ 00:14:49.468 { 00:14:49.468 "name": "pt1", 00:14:49.468 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:49.468 "is_configured": true, 00:14:49.468 "data_offset": 2048, 00:14:49.468 "data_size": 63488 00:14:49.468 }, 00:14:49.468 { 00:14:49.468 "name": "pt2", 00:14:49.468 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:49.468 "is_configured": true, 00:14:49.468 "data_offset": 2048, 00:14:49.468 "data_size": 63488 00:14:49.468 }, 00:14:49.468 { 00:14:49.468 "name": "pt3", 00:14:49.468 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:49.468 "is_configured": true, 00:14:49.468 "data_offset": 2048, 00:14:49.468 "data_size": 63488 00:14:49.468 } 00:14:49.468 ] 00:14:49.468 }' 00:14:49.468 22:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:49.468 22:43:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:50.037 22:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # 
verify_raid_bdev_properties raid_bdev1 00:14:50.037 22:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:50.037 22:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:50.037 22:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:50.037 22:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:50.037 22:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:50.037 22:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:50.037 22:43:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:50.296 [2024-07-15 22:43:35.164058] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:50.296 22:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:50.296 "name": "raid_bdev1", 00:14:50.296 "aliases": [ 00:14:50.296 "d5a7afe3-5dea-481d-bc06-d890d1330ba7" 00:14:50.296 ], 00:14:50.296 "product_name": "Raid Volume", 00:14:50.296 "block_size": 512, 00:14:50.296 "num_blocks": 190464, 00:14:50.296 "uuid": "d5a7afe3-5dea-481d-bc06-d890d1330ba7", 00:14:50.296 "assigned_rate_limits": { 00:14:50.296 "rw_ios_per_sec": 0, 00:14:50.296 "rw_mbytes_per_sec": 0, 00:14:50.296 "r_mbytes_per_sec": 0, 00:14:50.296 "w_mbytes_per_sec": 0 00:14:50.296 }, 00:14:50.296 "claimed": false, 00:14:50.296 "zoned": false, 00:14:50.296 "supported_io_types": { 00:14:50.296 "read": true, 00:14:50.296 "write": true, 00:14:50.296 "unmap": true, 00:14:50.296 "flush": true, 00:14:50.296 "reset": true, 00:14:50.296 "nvme_admin": false, 00:14:50.296 "nvme_io": false, 00:14:50.296 "nvme_io_md": false, 00:14:50.296 "write_zeroes": true, 00:14:50.296 "zcopy": false, 00:14:50.296 
"get_zone_info": false, 00:14:50.296 "zone_management": false, 00:14:50.296 "zone_append": false, 00:14:50.296 "compare": false, 00:14:50.296 "compare_and_write": false, 00:14:50.296 "abort": false, 00:14:50.296 "seek_hole": false, 00:14:50.296 "seek_data": false, 00:14:50.296 "copy": false, 00:14:50.296 "nvme_iov_md": false 00:14:50.296 }, 00:14:50.296 "memory_domains": [ 00:14:50.296 { 00:14:50.296 "dma_device_id": "system", 00:14:50.296 "dma_device_type": 1 00:14:50.296 }, 00:14:50.296 { 00:14:50.296 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.296 "dma_device_type": 2 00:14:50.296 }, 00:14:50.296 { 00:14:50.296 "dma_device_id": "system", 00:14:50.296 "dma_device_type": 1 00:14:50.296 }, 00:14:50.296 { 00:14:50.296 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.296 "dma_device_type": 2 00:14:50.296 }, 00:14:50.296 { 00:14:50.296 "dma_device_id": "system", 00:14:50.296 "dma_device_type": 1 00:14:50.296 }, 00:14:50.296 { 00:14:50.296 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.296 "dma_device_type": 2 00:14:50.296 } 00:14:50.296 ], 00:14:50.296 "driver_specific": { 00:14:50.296 "raid": { 00:14:50.296 "uuid": "d5a7afe3-5dea-481d-bc06-d890d1330ba7", 00:14:50.296 "strip_size_kb": 64, 00:14:50.296 "state": "online", 00:14:50.296 "raid_level": "raid0", 00:14:50.296 "superblock": true, 00:14:50.296 "num_base_bdevs": 3, 00:14:50.296 "num_base_bdevs_discovered": 3, 00:14:50.296 "num_base_bdevs_operational": 3, 00:14:50.296 "base_bdevs_list": [ 00:14:50.296 { 00:14:50.296 "name": "pt1", 00:14:50.296 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:50.296 "is_configured": true, 00:14:50.296 "data_offset": 2048, 00:14:50.296 "data_size": 63488 00:14:50.296 }, 00:14:50.296 { 00:14:50.296 "name": "pt2", 00:14:50.296 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:50.296 "is_configured": true, 00:14:50.296 "data_offset": 2048, 00:14:50.296 "data_size": 63488 00:14:50.296 }, 00:14:50.296 { 00:14:50.296 "name": "pt3", 00:14:50.296 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:14:50.296 "is_configured": true, 00:14:50.296 "data_offset": 2048, 00:14:50.296 "data_size": 63488 00:14:50.296 } 00:14:50.296 ] 00:14:50.296 } 00:14:50.296 } 00:14:50.296 }' 00:14:50.296 22:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:50.555 22:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:50.555 pt2 00:14:50.555 pt3' 00:14:50.555 22:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:50.555 22:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:50.555 22:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:50.814 22:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:50.814 "name": "pt1", 00:14:50.814 "aliases": [ 00:14:50.814 "00000000-0000-0000-0000-000000000001" 00:14:50.814 ], 00:14:50.814 "product_name": "passthru", 00:14:50.814 "block_size": 512, 00:14:50.814 "num_blocks": 65536, 00:14:50.814 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:50.814 "assigned_rate_limits": { 00:14:50.814 "rw_ios_per_sec": 0, 00:14:50.814 "rw_mbytes_per_sec": 0, 00:14:50.814 "r_mbytes_per_sec": 0, 00:14:50.814 "w_mbytes_per_sec": 0 00:14:50.814 }, 00:14:50.814 "claimed": true, 00:14:50.814 "claim_type": "exclusive_write", 00:14:50.814 "zoned": false, 00:14:50.814 "supported_io_types": { 00:14:50.814 "read": true, 00:14:50.814 "write": true, 00:14:50.814 "unmap": true, 00:14:50.814 "flush": true, 00:14:50.814 "reset": true, 00:14:50.814 "nvme_admin": false, 00:14:50.814 "nvme_io": false, 00:14:50.814 "nvme_io_md": false, 00:14:50.814 "write_zeroes": true, 00:14:50.814 "zcopy": true, 00:14:50.814 "get_zone_info": false, 
00:14:50.814 "zone_management": false, 00:14:50.814 "zone_append": false, 00:14:50.814 "compare": false, 00:14:50.814 "compare_and_write": false, 00:14:50.814 "abort": true, 00:14:50.814 "seek_hole": false, 00:14:50.814 "seek_data": false, 00:14:50.814 "copy": true, 00:14:50.814 "nvme_iov_md": false 00:14:50.814 }, 00:14:50.814 "memory_domains": [ 00:14:50.814 { 00:14:50.814 "dma_device_id": "system", 00:14:50.814 "dma_device_type": 1 00:14:50.814 }, 00:14:50.814 { 00:14:50.814 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:50.814 "dma_device_type": 2 00:14:50.814 } 00:14:50.814 ], 00:14:50.814 "driver_specific": { 00:14:50.814 "passthru": { 00:14:50.814 "name": "pt1", 00:14:50.814 "base_bdev_name": "malloc1" 00:14:50.814 } 00:14:50.814 } 00:14:50.814 }' 00:14:50.814 22:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:50.814 22:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:50.814 22:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:50.814 22:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:50.814 22:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:50.814 22:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:50.814 22:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:51.073 22:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:51.073 22:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:51.073 22:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:51.073 22:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:51.073 22:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:51.073 22:43:35 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:51.073 22:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:51.073 22:43:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:51.332 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:51.332 "name": "pt2", 00:14:51.332 "aliases": [ 00:14:51.332 "00000000-0000-0000-0000-000000000002" 00:14:51.332 ], 00:14:51.332 "product_name": "passthru", 00:14:51.332 "block_size": 512, 00:14:51.332 "num_blocks": 65536, 00:14:51.332 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:51.332 "assigned_rate_limits": { 00:14:51.332 "rw_ios_per_sec": 0, 00:14:51.332 "rw_mbytes_per_sec": 0, 00:14:51.332 "r_mbytes_per_sec": 0, 00:14:51.332 "w_mbytes_per_sec": 0 00:14:51.332 }, 00:14:51.332 "claimed": true, 00:14:51.332 "claim_type": "exclusive_write", 00:14:51.332 "zoned": false, 00:14:51.332 "supported_io_types": { 00:14:51.332 "read": true, 00:14:51.332 "write": true, 00:14:51.332 "unmap": true, 00:14:51.332 "flush": true, 00:14:51.332 "reset": true, 00:14:51.332 "nvme_admin": false, 00:14:51.332 "nvme_io": false, 00:14:51.332 "nvme_io_md": false, 00:14:51.332 "write_zeroes": true, 00:14:51.332 "zcopy": true, 00:14:51.332 "get_zone_info": false, 00:14:51.332 "zone_management": false, 00:14:51.332 "zone_append": false, 00:14:51.332 "compare": false, 00:14:51.332 "compare_and_write": false, 00:14:51.332 "abort": true, 00:14:51.332 "seek_hole": false, 00:14:51.332 "seek_data": false, 00:14:51.332 "copy": true, 00:14:51.332 "nvme_iov_md": false 00:14:51.332 }, 00:14:51.332 "memory_domains": [ 00:14:51.332 { 00:14:51.332 "dma_device_id": "system", 00:14:51.332 "dma_device_type": 1 00:14:51.332 }, 00:14:51.332 { 00:14:51.332 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.332 
"dma_device_type": 2 00:14:51.332 } 00:14:51.332 ], 00:14:51.332 "driver_specific": { 00:14:51.332 "passthru": { 00:14:51.332 "name": "pt2", 00:14:51.332 "base_bdev_name": "malloc2" 00:14:51.332 } 00:14:51.332 } 00:14:51.332 }' 00:14:51.332 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:51.332 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:51.332 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:51.332 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:51.332 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:51.591 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:51.591 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:51.591 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:51.591 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:51.591 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:51.591 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:51.591 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:51.591 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:51.591 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:51.591 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:51.851 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:51.851 "name": "pt3", 00:14:51.851 "aliases": [ 00:14:51.851 
"00000000-0000-0000-0000-000000000003" 00:14:51.851 ], 00:14:51.851 "product_name": "passthru", 00:14:51.851 "block_size": 512, 00:14:51.851 "num_blocks": 65536, 00:14:51.851 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:51.851 "assigned_rate_limits": { 00:14:51.851 "rw_ios_per_sec": 0, 00:14:51.851 "rw_mbytes_per_sec": 0, 00:14:51.851 "r_mbytes_per_sec": 0, 00:14:51.851 "w_mbytes_per_sec": 0 00:14:51.851 }, 00:14:51.851 "claimed": true, 00:14:51.851 "claim_type": "exclusive_write", 00:14:51.851 "zoned": false, 00:14:51.851 "supported_io_types": { 00:14:51.851 "read": true, 00:14:51.851 "write": true, 00:14:51.851 "unmap": true, 00:14:51.851 "flush": true, 00:14:51.851 "reset": true, 00:14:51.851 "nvme_admin": false, 00:14:51.851 "nvme_io": false, 00:14:51.851 "nvme_io_md": false, 00:14:51.851 "write_zeroes": true, 00:14:51.851 "zcopy": true, 00:14:51.851 "get_zone_info": false, 00:14:51.851 "zone_management": false, 00:14:51.851 "zone_append": false, 00:14:51.851 "compare": false, 00:14:51.851 "compare_and_write": false, 00:14:51.851 "abort": true, 00:14:51.851 "seek_hole": false, 00:14:51.851 "seek_data": false, 00:14:51.851 "copy": true, 00:14:51.851 "nvme_iov_md": false 00:14:51.851 }, 00:14:51.851 "memory_domains": [ 00:14:51.851 { 00:14:51.851 "dma_device_id": "system", 00:14:51.851 "dma_device_type": 1 00:14:51.851 }, 00:14:51.851 { 00:14:51.851 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.851 "dma_device_type": 2 00:14:51.851 } 00:14:51.851 ], 00:14:51.851 "driver_specific": { 00:14:51.851 "passthru": { 00:14:51.851 "name": "pt3", 00:14:51.851 "base_bdev_name": "malloc3" 00:14:51.851 } 00:14:51.851 } 00:14:51.851 }' 00:14:51.851 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:51.851 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:52.110 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:52.110 22:43:36 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:52.110 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:52.110 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:52.110 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:52.110 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:52.110 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:52.110 22:43:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:52.369 22:43:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:52.369 22:43:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:52.369 22:43:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:52.369 22:43:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:14:52.629 [2024-07-15 22:43:37.297850] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:52.629 22:43:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' d5a7afe3-5dea-481d-bc06-d890d1330ba7 '!=' d5a7afe3-5dea-481d-bc06-d890d1330ba7 ']' 00:14:52.629 22:43:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:14:52.629 22:43:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:52.629 22:43:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:52.629 22:43:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2728013 00:14:52.629 22:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2728013 ']' 00:14:52.629 22:43:37 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2728013 00:14:52.629 22:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:14:52.629 22:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:52.629 22:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2728013 00:14:52.629 22:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:52.629 22:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:52.629 22:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2728013' 00:14:52.629 killing process with pid 2728013 00:14:52.629 22:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2728013 00:14:52.629 [2024-07-15 22:43:37.368275] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:52.629 [2024-07-15 22:43:37.368330] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:52.629 [2024-07-15 22:43:37.368384] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:52.629 [2024-07-15 22:43:37.368395] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9f3c00 name raid_bdev1, state offline 00:14:52.629 22:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2728013 00:14:52.629 [2024-07-15 22:43:37.399741] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:52.889 22:43:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:14:52.889 00:14:52.889 real 0m14.703s 00:14:52.889 user 0m26.650s 00:14:52.889 sys 0m2.579s 00:14:52.889 22:43:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:52.889 22:43:37 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:52.889 ************************************ 00:14:52.889 END TEST raid_superblock_test 00:14:52.889 ************************************ 00:14:52.889 22:43:37 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:52.889 22:43:37 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:14:52.889 22:43:37 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:52.889 22:43:37 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:52.889 22:43:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:52.889 ************************************ 00:14:52.889 START TEST raid_read_error_test 00:14:52.889 ************************************ 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 read 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:52.889 22:43:37 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.cftxA48Mps 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2730228 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2730228 /var/tmp/spdk-raid.sock 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2730228 ']' 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:52.889 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:52.889 22:43:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:52.889 [2024-07-15 22:43:37.793579] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:14:52.889 [2024-07-15 22:43:37.793650] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2730228 ] 00:14:53.149 [2024-07-15 22:43:37.916338] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:53.149 [2024-07-15 22:43:38.022430] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:53.408 [2024-07-15 22:43:38.092844] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:53.408 [2024-07-15 22:43:38.092890] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:53.976 22:43:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:53.976 22:43:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:53.976 22:43:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:53.976 22:43:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:54.235 BaseBdev1_malloc 00:14:54.235 22:43:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:54.494 true 00:14:54.494 22:43:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:54.753 [2024-07-15 22:43:39.476354] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:54.753 [2024-07-15 22:43:39.476397] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:14:54.753 [2024-07-15 22:43:39.476416] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d200d0 00:14:54.753 [2024-07-15 22:43:39.476429] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:54.753 [2024-07-15 22:43:39.478163] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:54.753 [2024-07-15 22:43:39.478193] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:54.753 BaseBdev1 00:14:54.753 22:43:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:54.753 22:43:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:55.012 BaseBdev2_malloc 00:14:55.012 22:43:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:55.270 true 00:14:55.270 22:43:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:55.836 [2024-07-15 22:43:40.499651] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:55.836 [2024-07-15 22:43:40.499702] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:55.836 [2024-07-15 22:43:40.499727] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d24910 00:14:55.836 [2024-07-15 22:43:40.499739] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:55.836 [2024-07-15 22:43:40.501435] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:55.836 [2024-07-15 22:43:40.501465] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:55.836 BaseBdev2 00:14:55.836 22:43:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:55.836 22:43:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:56.093 BaseBdev3_malloc 00:14:56.093 22:43:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:56.351 true 00:14:56.351 22:43:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:56.609 [2024-07-15 22:43:41.262522] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:56.609 [2024-07-15 22:43:41.262564] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:56.609 [2024-07-15 22:43:41.262583] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d26bd0 00:14:56.609 [2024-07-15 22:43:41.262596] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:56.609 [2024-07-15 22:43:41.263989] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:56.609 [2024-07-15 22:43:41.264014] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:56.609 BaseBdev3 00:14:56.609 22:43:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:56.609 [2024-07-15 22:43:41.511211] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:56.609 [2024-07-15 22:43:41.512407] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:56.609 [2024-07-15 22:43:41.512472] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:56.609 [2024-07-15 22:43:41.512682] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d28280 00:14:56.609 [2024-07-15 22:43:41.512694] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:56.609 [2024-07-15 22:43:41.512868] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d27e20 00:14:56.609 [2024-07-15 22:43:41.513015] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d28280 00:14:56.609 [2024-07-15 22:43:41.513026] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d28280 00:14:56.609 [2024-07-15 22:43:41.513120] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:56.867 22:43:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:56.867 22:43:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:56.867 22:43:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:56.867 22:43:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:56.867 22:43:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:56.867 22:43:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:56.867 22:43:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:56.867 22:43:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:56.867 
22:43:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:56.867 22:43:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:56.867 22:43:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:56.867 22:43:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.125 22:43:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:57.125 "name": "raid_bdev1", 00:14:57.125 "uuid": "5ed122e0-c136-4530-9287-8057e6a73e60", 00:14:57.125 "strip_size_kb": 64, 00:14:57.125 "state": "online", 00:14:57.125 "raid_level": "raid0", 00:14:57.125 "superblock": true, 00:14:57.125 "num_base_bdevs": 3, 00:14:57.125 "num_base_bdevs_discovered": 3, 00:14:57.125 "num_base_bdevs_operational": 3, 00:14:57.125 "base_bdevs_list": [ 00:14:57.125 { 00:14:57.125 "name": "BaseBdev1", 00:14:57.125 "uuid": "5af852e3-d0af-5cd0-8973-789d13efe262", 00:14:57.125 "is_configured": true, 00:14:57.125 "data_offset": 2048, 00:14:57.125 "data_size": 63488 00:14:57.125 }, 00:14:57.125 { 00:14:57.125 "name": "BaseBdev2", 00:14:57.125 "uuid": "bd3cc885-45af-581d-a353-f762324020ef", 00:14:57.125 "is_configured": true, 00:14:57.125 "data_offset": 2048, 00:14:57.125 "data_size": 63488 00:14:57.125 }, 00:14:57.125 { 00:14:57.125 "name": "BaseBdev3", 00:14:57.125 "uuid": "0410c10d-28e7-5a8b-a8ca-6796ad2a78e9", 00:14:57.125 "is_configured": true, 00:14:57.125 "data_offset": 2048, 00:14:57.125 "data_size": 63488 00:14:57.125 } 00:14:57.125 ] 00:14:57.125 }' 00:14:57.125 22:43:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:57.125 22:43:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:57.757 22:43:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- 
# sleep 1 00:14:57.757 22:43:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:57.757 [2024-07-15 22:43:42.494133] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b765b0 00:14:58.694 22:43:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:14:58.953 22:43:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:58.953 22:43:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:14:58.953 22:43:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:58.953 22:43:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:58.953 22:43:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:58.953 22:43:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:58.953 22:43:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:58.953 22:43:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:58.953 22:43:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:58.953 22:43:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:58.953 22:43:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:58.953 22:43:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:58.953 22:43:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local 
tmp 00:14:58.953 22:43:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:58.953 22:43:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:59.212 22:43:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:59.212 "name": "raid_bdev1", 00:14:59.212 "uuid": "5ed122e0-c136-4530-9287-8057e6a73e60", 00:14:59.212 "strip_size_kb": 64, 00:14:59.212 "state": "online", 00:14:59.212 "raid_level": "raid0", 00:14:59.212 "superblock": true, 00:14:59.212 "num_base_bdevs": 3, 00:14:59.212 "num_base_bdevs_discovered": 3, 00:14:59.212 "num_base_bdevs_operational": 3, 00:14:59.212 "base_bdevs_list": [ 00:14:59.212 { 00:14:59.212 "name": "BaseBdev1", 00:14:59.212 "uuid": "5af852e3-d0af-5cd0-8973-789d13efe262", 00:14:59.212 "is_configured": true, 00:14:59.212 "data_offset": 2048, 00:14:59.212 "data_size": 63488 00:14:59.212 }, 00:14:59.212 { 00:14:59.212 "name": "BaseBdev2", 00:14:59.212 "uuid": "bd3cc885-45af-581d-a353-f762324020ef", 00:14:59.212 "is_configured": true, 00:14:59.212 "data_offset": 2048, 00:14:59.212 "data_size": 63488 00:14:59.212 }, 00:14:59.212 { 00:14:59.212 "name": "BaseBdev3", 00:14:59.212 "uuid": "0410c10d-28e7-5a8b-a8ca-6796ad2a78e9", 00:14:59.212 "is_configured": true, 00:14:59.212 "data_offset": 2048, 00:14:59.212 "data_size": 63488 00:14:59.212 } 00:14:59.212 ] 00:14:59.212 }' 00:14:59.212 22:43:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:59.212 22:43:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:59.802 22:43:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:00.061 [2024-07-15 22:43:44.776261] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:00.061 [2024-07-15 22:43:44.776303] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:00.061 [2024-07-15 22:43:44.779491] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:00.061 [2024-07-15 22:43:44.779531] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:00.061 [2024-07-15 22:43:44.779566] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:00.061 [2024-07-15 22:43:44.779577] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d28280 name raid_bdev1, state offline 00:15:00.061 0 00:15:00.061 22:43:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2730228 00:15:00.061 22:43:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2730228 ']' 00:15:00.061 22:43:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2730228 00:15:00.061 22:43:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:15:00.061 22:43:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:00.061 22:43:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2730228 00:15:00.061 22:43:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:00.061 22:43:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:00.061 22:43:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2730228' 00:15:00.061 killing process with pid 2730228 00:15:00.061 22:43:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2730228 00:15:00.061 [2024-07-15 22:43:44.849563] bdev_raid.c:1358:raid_bdev_fini_start: 
*DEBUG*: raid_bdev_fini_start 00:15:00.061 22:43:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2730228 00:15:00.061 [2024-07-15 22:43:44.870948] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:00.320 22:43:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.cftxA48Mps 00:15:00.320 22:43:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:00.320 22:43:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:00.320 22:43:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.44 00:15:00.320 22:43:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:15:00.320 22:43:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:00.320 22:43:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:00.320 22:43:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.44 != \0\.\0\0 ]] 00:15:00.320 00:15:00.320 real 0m7.397s 00:15:00.320 user 0m11.838s 00:15:00.320 sys 0m1.307s 00:15:00.320 22:43:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:00.320 22:43:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:00.320 ************************************ 00:15:00.320 END TEST raid_read_error_test 00:15:00.320 ************************************ 00:15:00.320 22:43:45 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:00.320 22:43:45 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:15:00.320 22:43:45 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:00.320 22:43:45 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:00.320 22:43:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:00.320 ************************************ 
00:15:00.320 START TEST raid_write_error_test 00:15:00.320 ************************************ 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 write 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local 
raid_bdev_name=raid_bdev1 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.PARHoiqtmk 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2731378 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2731378 /var/tmp/spdk-raid.sock 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2731378 ']' 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:15:00.320 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:00.320 22:43:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:00.579 [2024-07-15 22:43:45.316418] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:15:00.579 [2024-07-15 22:43:45.316553] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2731378 ] 00:15:00.838 [2024-07-15 22:43:45.511830] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:00.838 [2024-07-15 22:43:45.608727] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:00.838 [2024-07-15 22:43:45.669447] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:00.838 [2024-07-15 22:43:45.669483] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:01.406 22:43:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:01.406 22:43:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:01.406 22:43:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:01.406 22:43:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:01.664 BaseBdev1_malloc 00:15:01.664 22:43:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:02.232 true 00:15:02.232 22:43:47 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:02.491 [2024-07-15 22:43:47.295439] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:02.491 [2024-07-15 22:43:47.295484] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:02.491 [2024-07-15 22:43:47.295504] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24540d0 00:15:02.491 [2024-07-15 22:43:47.295517] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:02.491 [2024-07-15 22:43:47.297366] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:02.491 [2024-07-15 22:43:47.297395] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:02.491 BaseBdev1 00:15:02.491 22:43:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:02.491 22:43:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:03.058 BaseBdev2_malloc 00:15:03.058 22:43:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:03.317 true 00:15:03.317 22:43:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:03.577 [2024-07-15 22:43:48.396183] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:03.577 [2024-07-15 22:43:48.396228] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:03.577 [2024-07-15 22:43:48.396249] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2458910 00:15:03.577 [2024-07-15 22:43:48.396262] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:03.577 [2024-07-15 22:43:48.397833] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:03.577 [2024-07-15 22:43:48.397861] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:03.577 BaseBdev2 00:15:03.577 22:43:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:03.577 22:43:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:03.836 BaseBdev3_malloc 00:15:03.836 22:43:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:04.404 true 00:15:04.404 22:43:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:04.663 [2024-07-15 22:43:49.447755] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:04.663 [2024-07-15 22:43:49.447802] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:04.663 [2024-07-15 22:43:49.447822] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x245abd0 00:15:04.663 [2024-07-15 22:43:49.447835] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:04.663 [2024-07-15 22:43:49.449457] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:15:04.663 [2024-07-15 22:43:49.449485] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:04.663 BaseBdev3 00:15:04.663 22:43:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:04.922 [2024-07-15 22:43:49.740561] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:04.922 [2024-07-15 22:43:49.741958] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:04.922 [2024-07-15 22:43:49.742029] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:04.922 [2024-07-15 22:43:49.742243] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x245c280 00:15:04.923 [2024-07-15 22:43:49.742255] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:04.923 [2024-07-15 22:43:49.742463] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x245be20 00:15:04.923 [2024-07-15 22:43:49.742613] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x245c280 00:15:04.923 [2024-07-15 22:43:49.742623] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x245c280 00:15:04.923 [2024-07-15 22:43:49.742730] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:04.923 22:43:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:04.923 22:43:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:04.923 22:43:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:04.923 22:43:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- 
# local raid_level=raid0 00:15:04.923 22:43:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:04.923 22:43:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:04.923 22:43:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:04.923 22:43:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:04.923 22:43:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:04.923 22:43:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:04.923 22:43:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:04.923 22:43:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:05.491 22:43:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:05.491 "name": "raid_bdev1", 00:15:05.491 "uuid": "5905bba6-4bda-4c9d-8f21-c01e13ab9fac", 00:15:05.491 "strip_size_kb": 64, 00:15:05.491 "state": "online", 00:15:05.491 "raid_level": "raid0", 00:15:05.491 "superblock": true, 00:15:05.491 "num_base_bdevs": 3, 00:15:05.491 "num_base_bdevs_discovered": 3, 00:15:05.491 "num_base_bdevs_operational": 3, 00:15:05.491 "base_bdevs_list": [ 00:15:05.491 { 00:15:05.491 "name": "BaseBdev1", 00:15:05.491 "uuid": "97b33643-1309-5a77-862d-6a1ade662da5", 00:15:05.491 "is_configured": true, 00:15:05.491 "data_offset": 2048, 00:15:05.491 "data_size": 63488 00:15:05.491 }, 00:15:05.491 { 00:15:05.491 "name": "BaseBdev2", 00:15:05.491 "uuid": "198f4289-d79d-5972-86e3-9eff1a9a8773", 00:15:05.491 "is_configured": true, 00:15:05.491 "data_offset": 2048, 00:15:05.491 "data_size": 63488 00:15:05.491 }, 00:15:05.491 { 00:15:05.491 "name": "BaseBdev3", 
00:15:05.491 "uuid": "03c0f16a-2389-5d22-a835-21e154bbc3bb", 00:15:05.491 "is_configured": true, 00:15:05.491 "data_offset": 2048, 00:15:05.491 "data_size": 63488 00:15:05.491 } 00:15:05.491 ] 00:15:05.491 }' 00:15:05.491 22:43:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:05.491 22:43:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:06.462 22:43:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:06.462 22:43:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:06.462 [2024-07-15 22:43:51.116480] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22aa5b0 00:15:07.400 22:43:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:15:07.660 22:43:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:07.660 22:43:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:15:07.660 22:43:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:15:07.660 22:43:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:07.660 22:43:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:07.660 22:43:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:07.660 22:43:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:07.660 22:43:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:07.660 22:43:52 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:07.660 22:43:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:07.660 22:43:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:07.660 22:43:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:07.660 22:43:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:07.660 22:43:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.660 22:43:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:07.920 22:43:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:07.920 "name": "raid_bdev1", 00:15:07.920 "uuid": "5905bba6-4bda-4c9d-8f21-c01e13ab9fac", 00:15:07.920 "strip_size_kb": 64, 00:15:07.920 "state": "online", 00:15:07.920 "raid_level": "raid0", 00:15:07.920 "superblock": true, 00:15:07.920 "num_base_bdevs": 3, 00:15:07.920 "num_base_bdevs_discovered": 3, 00:15:07.920 "num_base_bdevs_operational": 3, 00:15:07.920 "base_bdevs_list": [ 00:15:07.920 { 00:15:07.920 "name": "BaseBdev1", 00:15:07.920 "uuid": "97b33643-1309-5a77-862d-6a1ade662da5", 00:15:07.920 "is_configured": true, 00:15:07.920 "data_offset": 2048, 00:15:07.920 "data_size": 63488 00:15:07.920 }, 00:15:07.920 { 00:15:07.920 "name": "BaseBdev2", 00:15:07.920 "uuid": "198f4289-d79d-5972-86e3-9eff1a9a8773", 00:15:07.920 "is_configured": true, 00:15:07.920 "data_offset": 2048, 00:15:07.920 "data_size": 63488 00:15:07.920 }, 00:15:07.920 { 00:15:07.920 "name": "BaseBdev3", 00:15:07.920 "uuid": "03c0f16a-2389-5d22-a835-21e154bbc3bb", 00:15:07.920 "is_configured": true, 00:15:07.920 "data_offset": 2048, 00:15:07.920 "data_size": 
63488 00:15:07.920 } 00:15:07.920 ] 00:15:07.920 }' 00:15:07.920 22:43:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:07.920 22:43:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:08.858 22:43:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:09.116 [2024-07-15 22:43:53.872849] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:09.116 [2024-07-15 22:43:53.872888] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:09.116 [2024-07-15 22:43:53.876057] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:09.116 [2024-07-15 22:43:53.876094] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:09.116 [2024-07-15 22:43:53.876129] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:09.116 [2024-07-15 22:43:53.876140] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x245c280 name raid_bdev1, state offline 00:15:09.116 0 00:15:09.116 22:43:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2731378 00:15:09.116 22:43:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2731378 ']' 00:15:09.116 22:43:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2731378 00:15:09.116 22:43:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:15:09.116 22:43:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:09.116 22:43:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2731378 00:15:09.116 22:43:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:15:09.116 22:43:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:09.116 22:43:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2731378' 00:15:09.116 killing process with pid 2731378 00:15:09.116 22:43:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2731378 00:15:09.116 [2024-07-15 22:43:53.947624] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:09.117 22:43:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2731378 00:15:09.117 [2024-07-15 22:43:53.968913] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:09.376 22:43:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.PARHoiqtmk 00:15:09.376 22:43:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:09.376 22:43:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:09.376 22:43:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.36 00:15:09.376 22:43:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:15:09.376 22:43:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:09.376 22:43:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:09.376 22:43:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.36 != \0\.\0\0 ]] 00:15:09.376 00:15:09.376 real 0m9.008s 00:15:09.376 user 0m14.917s 00:15:09.376 sys 0m1.452s 00:15:09.376 22:43:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:09.376 22:43:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:09.376 ************************************ 00:15:09.376 END TEST raid_write_error_test 00:15:09.376 
************************************ 00:15:09.376 22:43:54 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:09.376 22:43:54 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:15:09.376 22:43:54 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:15:09.376 22:43:54 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:09.376 22:43:54 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:09.376 22:43:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:09.376 ************************************ 00:15:09.376 START TEST raid_state_function_test 00:15:09.376 ************************************ 00:15:09.376 22:43:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 false 00:15:09.376 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:15:09.376 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:09.376 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2732540 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2732540' 00:15:09.636 Process raid pid: 2732540 00:15:09.636 22:43:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2732540 /var/tmp/spdk-raid.sock 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2732540 ']' 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:09.636 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:09.636 22:43:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:09.636 [2024-07-15 22:43:54.355946] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:15:09.636 [2024-07-15 22:43:54.356024] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:09.636 [2024-07-15 22:43:54.489268] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:09.896 [2024-07-15 22:43:54.595815] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:09.896 [2024-07-15 22:43:54.659514] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:09.896 [2024-07-15 22:43:54.659543] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:10.463 22:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:10.463 22:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:15:10.463 22:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:10.723 [2024-07-15 22:43:55.450105] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:10.723 [2024-07-15 22:43:55.450151] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:10.723 [2024-07-15 22:43:55.450161] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:10.723 [2024-07-15 22:43:55.450173] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:10.723 [2024-07-15 22:43:55.450186] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:10.723 [2024-07-15 22:43:55.450197] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:10.723 
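Annotation: the `verify_raid_bdev_state` helper invoked next selects the named raid bdev out of the `bdev_raid_get_bdevs all` output with `jq -r '.[] | select(.name == "Existed_Raid")'` and compares its fields against the expected state. A minimal Python sketch of that same selection and comparison, using the (abridged) JSON shape that appears verbatim in this trace; the field names and values are taken from the log, the helper function name is illustrative:

```python
# Sketch of the check verify_raid_bdev_state performs via jq:
# '.[] | select(.name == "Existed_Raid")' followed by field comparisons.
# The bdev list below mirrors the RPC output shown in this log (abridged).
bdevs = [
    {
        "name": "Existed_Raid",
        "strip_size_kb": 64,
        "state": "configuring",
        "raid_level": "concat",
        "num_base_bdevs": 3,
        "num_base_bdevs_discovered": 0,
        "num_base_bdevs_operational": 3,
    }
]

def select_raid(bdev_list, name):
    """Python equivalent of jq '.[] | select(.name == NAME)'."""
    return next(b for b in bdev_list if b["name"] == name)

info = select_raid(bdevs, "Existed_Raid")
assert info["state"] == "configuring"          # no base bdevs exist yet
assert info["raid_level"] == "concat"
assert info["strip_size_kb"] == 64
assert info["num_base_bdevs_operational"] == 3
```

With all three base bdevs still missing, the raid is registered but stays in the "configuring" state, which is exactly what the state dump that follows reports.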
22:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:10.723 22:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:10.723 22:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:10.723 22:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:10.723 22:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:10.723 22:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:10.723 22:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:10.723 22:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:10.723 22:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:10.723 22:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:10.723 22:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:10.723 22:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:10.982 22:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:10.982 "name": "Existed_Raid", 00:15:10.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:10.982 "strip_size_kb": 64, 00:15:10.982 "state": "configuring", 00:15:10.982 "raid_level": "concat", 00:15:10.982 "superblock": false, 00:15:10.982 "num_base_bdevs": 3, 00:15:10.982 "num_base_bdevs_discovered": 0, 00:15:10.982 "num_base_bdevs_operational": 3, 00:15:10.982 "base_bdevs_list": [ 00:15:10.982 { 
00:15:10.982 "name": "BaseBdev1", 00:15:10.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:10.982 "is_configured": false, 00:15:10.982 "data_offset": 0, 00:15:10.982 "data_size": 0 00:15:10.982 }, 00:15:10.982 { 00:15:10.982 "name": "BaseBdev2", 00:15:10.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:10.982 "is_configured": false, 00:15:10.982 "data_offset": 0, 00:15:10.982 "data_size": 0 00:15:10.982 }, 00:15:10.982 { 00:15:10.982 "name": "BaseBdev3", 00:15:10.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:10.982 "is_configured": false, 00:15:10.982 "data_offset": 0, 00:15:10.982 "data_size": 0 00:15:10.982 } 00:15:10.982 ] 00:15:10.982 }' 00:15:10.982 22:43:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:10.983 22:43:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:11.551 22:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:11.810 [2024-07-15 22:43:56.488701] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:11.810 [2024-07-15 22:43:56.488730] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1dcea80 name Existed_Raid, state configuring 00:15:11.810 22:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:12.069 [2024-07-15 22:43:56.745407] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:12.069 [2024-07-15 22:43:56.745438] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:12.069 [2024-07-15 22:43:56.745447] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:15:12.069 [2024-07-15 22:43:56.745458] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:12.069 [2024-07-15 22:43:56.745467] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:12.069 [2024-07-15 22:43:56.745478] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:12.069 22:43:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:12.327 [2024-07-15 22:43:56.995908] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:12.327 BaseBdev1 00:15:12.327 22:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:12.327 22:43:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:12.327 22:43:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:12.327 22:43:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:12.327 22:43:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:12.327 22:43:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:12.327 22:43:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:12.625 22:43:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:12.625 [ 00:15:12.625 { 00:15:12.625 "name": "BaseBdev1", 00:15:12.625 "aliases": [ 00:15:12.625 
"7772f8e2-9aaa-4f31-8020-440983344be7" 00:15:12.625 ], 00:15:12.625 "product_name": "Malloc disk", 00:15:12.625 "block_size": 512, 00:15:12.625 "num_blocks": 65536, 00:15:12.625 "uuid": "7772f8e2-9aaa-4f31-8020-440983344be7", 00:15:12.625 "assigned_rate_limits": { 00:15:12.625 "rw_ios_per_sec": 0, 00:15:12.625 "rw_mbytes_per_sec": 0, 00:15:12.625 "r_mbytes_per_sec": 0, 00:15:12.625 "w_mbytes_per_sec": 0 00:15:12.625 }, 00:15:12.625 "claimed": true, 00:15:12.625 "claim_type": "exclusive_write", 00:15:12.625 "zoned": false, 00:15:12.625 "supported_io_types": { 00:15:12.625 "read": true, 00:15:12.625 "write": true, 00:15:12.625 "unmap": true, 00:15:12.625 "flush": true, 00:15:12.625 "reset": true, 00:15:12.625 "nvme_admin": false, 00:15:12.625 "nvme_io": false, 00:15:12.625 "nvme_io_md": false, 00:15:12.625 "write_zeroes": true, 00:15:12.625 "zcopy": true, 00:15:12.625 "get_zone_info": false, 00:15:12.625 "zone_management": false, 00:15:12.625 "zone_append": false, 00:15:12.625 "compare": false, 00:15:12.625 "compare_and_write": false, 00:15:12.625 "abort": true, 00:15:12.625 "seek_hole": false, 00:15:12.625 "seek_data": false, 00:15:12.625 "copy": true, 00:15:12.625 "nvme_iov_md": false 00:15:12.625 }, 00:15:12.625 "memory_domains": [ 00:15:12.625 { 00:15:12.625 "dma_device_id": "system", 00:15:12.625 "dma_device_type": 1 00:15:12.625 }, 00:15:12.625 { 00:15:12.625 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:12.625 "dma_device_type": 2 00:15:12.625 } 00:15:12.625 ], 00:15:12.625 "driver_specific": {} 00:15:12.625 } 00:15:12.625 ] 00:15:12.625 22:43:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:12.625 22:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:12.625 22:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:12.625 22:43:57 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:12.625 22:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:12.625 22:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:12.625 22:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:12.625 22:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:12.625 22:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:12.625 22:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:12.625 22:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:12.625 22:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:12.625 22:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:12.913 22:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:12.913 "name": "Existed_Raid", 00:15:12.913 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:12.913 "strip_size_kb": 64, 00:15:12.913 "state": "configuring", 00:15:12.913 "raid_level": "concat", 00:15:12.913 "superblock": false, 00:15:12.913 "num_base_bdevs": 3, 00:15:12.913 "num_base_bdevs_discovered": 1, 00:15:12.913 "num_base_bdevs_operational": 3, 00:15:12.913 "base_bdevs_list": [ 00:15:12.913 { 00:15:12.913 "name": "BaseBdev1", 00:15:12.913 "uuid": "7772f8e2-9aaa-4f31-8020-440983344be7", 00:15:12.913 "is_configured": true, 00:15:12.913 "data_offset": 0, 00:15:12.913 "data_size": 65536 00:15:12.913 }, 00:15:12.913 { 00:15:12.913 "name": "BaseBdev2", 00:15:12.913 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:15:12.913 "is_configured": false, 00:15:12.913 "data_offset": 0, 00:15:12.913 "data_size": 0 00:15:12.913 }, 00:15:12.913 { 00:15:12.913 "name": "BaseBdev3", 00:15:12.913 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:12.913 "is_configured": false, 00:15:12.913 "data_offset": 0, 00:15:12.913 "data_size": 0 00:15:12.913 } 00:15:12.913 ] 00:15:12.913 }' 00:15:12.913 22:43:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:12.913 22:43:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:13.480 22:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:13.739 [2024-07-15 22:43:58.576097] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:13.739 [2024-07-15 22:43:58.576136] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1dce310 name Existed_Raid, state configuring 00:15:13.739 22:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:13.997 [2024-07-15 22:43:58.820780] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:13.997 [2024-07-15 22:43:58.822225] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:13.997 [2024-07-15 22:43:58.822256] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:13.997 [2024-07-15 22:43:58.822266] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:13.997 [2024-07-15 22:43:58.822277] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
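Annotation: at this point only BaseBdev1 has been created and claimed, so the state dump reports `"num_base_bdevs_discovered": 1` while the raid remains "configuring". Across every dump in this log, that counter matches the number of `base_bdevs_list` entries with `is_configured` true. A hypothetical recomputation of the counter from the (abridged) list shown above, under that assumption:

```python
# Assumption observed in this log: num_base_bdevs_discovered equals the
# count of base_bdevs_list entries whose is_configured flag is true.
# List abridged from the bdev_raid_get_bdevs output above; only BaseBdev1
# has been created and claimed so far.
base_bdevs_list = [
    {"name": "BaseBdev1", "is_configured": True,  "data_size": 65536},
    {"name": "BaseBdev2", "is_configured": False, "data_size": 0},
    {"name": "BaseBdev3", "is_configured": False, "data_size": 0},
]

def discovered(base_bdevs):
    """Count the base bdevs the raid has found and claimed."""
    return sum(1 for b in base_bdevs if b["is_configured"])

assert discovered(base_bdevs_list) == 1  # matches "num_base_bdevs_discovered": 1
```

The loop that follows (`(( i < num_base_bdevs ))`) creates BaseBdev2 and BaseBdev3 one at a time, so this count climbs to 3 before the raid can leave "configuring".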
00:15:13.997 22:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:13.997 22:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:13.997 22:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:13.997 22:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:13.997 22:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:13.997 22:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:13.997 22:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:13.997 22:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:13.997 22:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:13.997 22:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:13.998 22:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:13.998 22:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:13.998 22:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:13.998 22:43:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:14.257 22:43:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:14.257 "name": "Existed_Raid", 00:15:14.257 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:14.257 "strip_size_kb": 64, 00:15:14.257 "state": "configuring", 00:15:14.257 
"raid_level": "concat", 00:15:14.257 "superblock": false, 00:15:14.257 "num_base_bdevs": 3, 00:15:14.257 "num_base_bdevs_discovered": 1, 00:15:14.257 "num_base_bdevs_operational": 3, 00:15:14.257 "base_bdevs_list": [ 00:15:14.257 { 00:15:14.257 "name": "BaseBdev1", 00:15:14.257 "uuid": "7772f8e2-9aaa-4f31-8020-440983344be7", 00:15:14.257 "is_configured": true, 00:15:14.257 "data_offset": 0, 00:15:14.257 "data_size": 65536 00:15:14.257 }, 00:15:14.257 { 00:15:14.257 "name": "BaseBdev2", 00:15:14.257 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:14.257 "is_configured": false, 00:15:14.257 "data_offset": 0, 00:15:14.257 "data_size": 0 00:15:14.257 }, 00:15:14.257 { 00:15:14.257 "name": "BaseBdev3", 00:15:14.257 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:14.257 "is_configured": false, 00:15:14.257 "data_offset": 0, 00:15:14.257 "data_size": 0 00:15:14.257 } 00:15:14.257 ] 00:15:14.257 }' 00:15:14.257 22:43:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:14.257 22:43:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:14.826 22:43:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:15.085 [2024-07-15 22:43:59.891077] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:15.085 BaseBdev2 00:15:15.085 22:43:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:15.085 22:43:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:15.085 22:43:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:15.085 22:43:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:15.085 22:43:59 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:15.085 22:43:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:15.085 22:43:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:15.343 22:44:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:15.603 [ 00:15:15.603 { 00:15:15.603 "name": "BaseBdev2", 00:15:15.603 "aliases": [ 00:15:15.603 "bdffe747-30d5-4bf9-a3ef-4c33a9c8c0ad" 00:15:15.603 ], 00:15:15.603 "product_name": "Malloc disk", 00:15:15.603 "block_size": 512, 00:15:15.603 "num_blocks": 65536, 00:15:15.603 "uuid": "bdffe747-30d5-4bf9-a3ef-4c33a9c8c0ad", 00:15:15.603 "assigned_rate_limits": { 00:15:15.603 "rw_ios_per_sec": 0, 00:15:15.603 "rw_mbytes_per_sec": 0, 00:15:15.603 "r_mbytes_per_sec": 0, 00:15:15.603 "w_mbytes_per_sec": 0 00:15:15.603 }, 00:15:15.603 "claimed": true, 00:15:15.603 "claim_type": "exclusive_write", 00:15:15.603 "zoned": false, 00:15:15.603 "supported_io_types": { 00:15:15.603 "read": true, 00:15:15.603 "write": true, 00:15:15.603 "unmap": true, 00:15:15.603 "flush": true, 00:15:15.603 "reset": true, 00:15:15.603 "nvme_admin": false, 00:15:15.603 "nvme_io": false, 00:15:15.603 "nvme_io_md": false, 00:15:15.603 "write_zeroes": true, 00:15:15.603 "zcopy": true, 00:15:15.603 "get_zone_info": false, 00:15:15.603 "zone_management": false, 00:15:15.603 "zone_append": false, 00:15:15.603 "compare": false, 00:15:15.603 "compare_and_write": false, 00:15:15.603 "abort": true, 00:15:15.603 "seek_hole": false, 00:15:15.603 "seek_data": false, 00:15:15.603 "copy": true, 00:15:15.603 "nvme_iov_md": false 00:15:15.603 }, 00:15:15.603 "memory_domains": [ 00:15:15.603 { 00:15:15.603 "dma_device_id": "system", 
00:15:15.603 "dma_device_type": 1 00:15:15.603 }, 00:15:15.603 { 00:15:15.603 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:15.603 "dma_device_type": 2 00:15:15.603 } 00:15:15.603 ], 00:15:15.603 "driver_specific": {} 00:15:15.603 } 00:15:15.603 ] 00:15:15.603 22:44:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:15.603 22:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:15.603 22:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:15.603 22:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:15.603 22:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:15.603 22:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:15.603 22:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:15.603 22:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:15.603 22:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:15.603 22:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:15.603 22:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:15.603 22:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:15.603 22:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:15.603 22:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:15.603 22:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:15.862 22:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:15.862 "name": "Existed_Raid", 00:15:15.862 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:15.862 "strip_size_kb": 64, 00:15:15.862 "state": "configuring", 00:15:15.862 "raid_level": "concat", 00:15:15.862 "superblock": false, 00:15:15.862 "num_base_bdevs": 3, 00:15:15.862 "num_base_bdevs_discovered": 2, 00:15:15.862 "num_base_bdevs_operational": 3, 00:15:15.862 "base_bdevs_list": [ 00:15:15.862 { 00:15:15.862 "name": "BaseBdev1", 00:15:15.862 "uuid": "7772f8e2-9aaa-4f31-8020-440983344be7", 00:15:15.862 "is_configured": true, 00:15:15.862 "data_offset": 0, 00:15:15.862 "data_size": 65536 00:15:15.862 }, 00:15:15.862 { 00:15:15.862 "name": "BaseBdev2", 00:15:15.862 "uuid": "bdffe747-30d5-4bf9-a3ef-4c33a9c8c0ad", 00:15:15.862 "is_configured": true, 00:15:15.862 "data_offset": 0, 00:15:15.862 "data_size": 65536 00:15:15.863 }, 00:15:15.863 { 00:15:15.863 "name": "BaseBdev3", 00:15:15.863 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:15.863 "is_configured": false, 00:15:15.863 "data_offset": 0, 00:15:15.863 "data_size": 0 00:15:15.863 } 00:15:15.863 ] 00:15:15.863 }' 00:15:15.863 22:44:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:15.863 22:44:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:16.431 22:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:16.691 [2024-07-15 22:44:01.418606] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:16.691 [2024-07-15 22:44:01.418645] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1dcf400 00:15:16.691 
[2024-07-15 22:44:01.418653] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:15:16.691 [2024-07-15 22:44:01.418903] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1dceef0 00:15:16.691 [2024-07-15 22:44:01.419033] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1dcf400 00:15:16.691 [2024-07-15 22:44:01.419044] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1dcf400 00:15:16.691 [2024-07-15 22:44:01.419210] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:16.691 BaseBdev3 00:15:16.691 22:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:16.691 22:44:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:16.691 22:44:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:16.691 22:44:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:16.691 22:44:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:16.691 22:44:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:16.691 22:44:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:16.950 22:44:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:17.208 [ 00:15:17.209 { 00:15:17.209 "name": "BaseBdev3", 00:15:17.209 "aliases": [ 00:15:17.209 "d7a4fe0e-6042-47be-9393-e46776aa0ae2" 00:15:17.209 ], 00:15:17.209 "product_name": "Malloc disk", 00:15:17.209 "block_size": 512, 00:15:17.209 
"num_blocks": 65536, 00:15:17.209 "uuid": "d7a4fe0e-6042-47be-9393-e46776aa0ae2", 00:15:17.209 "assigned_rate_limits": { 00:15:17.209 "rw_ios_per_sec": 0, 00:15:17.209 "rw_mbytes_per_sec": 0, 00:15:17.209 "r_mbytes_per_sec": 0, 00:15:17.209 "w_mbytes_per_sec": 0 00:15:17.209 }, 00:15:17.209 "claimed": true, 00:15:17.209 "claim_type": "exclusive_write", 00:15:17.209 "zoned": false, 00:15:17.209 "supported_io_types": { 00:15:17.209 "read": true, 00:15:17.209 "write": true, 00:15:17.209 "unmap": true, 00:15:17.209 "flush": true, 00:15:17.209 "reset": true, 00:15:17.209 "nvme_admin": false, 00:15:17.209 "nvme_io": false, 00:15:17.209 "nvme_io_md": false, 00:15:17.209 "write_zeroes": true, 00:15:17.209 "zcopy": true, 00:15:17.209 "get_zone_info": false, 00:15:17.209 "zone_management": false, 00:15:17.209 "zone_append": false, 00:15:17.209 "compare": false, 00:15:17.209 "compare_and_write": false, 00:15:17.209 "abort": true, 00:15:17.209 "seek_hole": false, 00:15:17.209 "seek_data": false, 00:15:17.209 "copy": true, 00:15:17.209 "nvme_iov_md": false 00:15:17.209 }, 00:15:17.209 "memory_domains": [ 00:15:17.209 { 00:15:17.209 "dma_device_id": "system", 00:15:17.209 "dma_device_type": 1 00:15:17.209 }, 00:15:17.209 { 00:15:17.209 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:17.209 "dma_device_type": 2 00:15:17.209 } 00:15:17.209 ], 00:15:17.209 "driver_specific": {} 00:15:17.209 } 00:15:17.209 ] 00:15:17.209 22:44:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:17.209 22:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:17.209 22:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:17.209 22:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:17.209 22:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:15:17.209 22:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:17.209 22:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:17.209 22:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:17.209 22:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:17.209 22:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:17.209 22:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:17.209 22:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:17.209 22:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:17.209 22:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.209 22:44:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:17.469 22:44:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:17.469 "name": "Existed_Raid", 00:15:17.469 "uuid": "d438caa6-53fb-4111-a257-3ccbfd21fc4d", 00:15:17.469 "strip_size_kb": 64, 00:15:17.469 "state": "online", 00:15:17.469 "raid_level": "concat", 00:15:17.469 "superblock": false, 00:15:17.469 "num_base_bdevs": 3, 00:15:17.469 "num_base_bdevs_discovered": 3, 00:15:17.469 "num_base_bdevs_operational": 3, 00:15:17.469 "base_bdevs_list": [ 00:15:17.469 { 00:15:17.469 "name": "BaseBdev1", 00:15:17.469 "uuid": "7772f8e2-9aaa-4f31-8020-440983344be7", 00:15:17.469 "is_configured": true, 00:15:17.469 "data_offset": 0, 00:15:17.469 "data_size": 65536 00:15:17.469 }, 00:15:17.469 { 00:15:17.469 "name": "BaseBdev2", 
00:15:17.469 "uuid": "bdffe747-30d5-4bf9-a3ef-4c33a9c8c0ad", 00:15:17.469 "is_configured": true, 00:15:17.469 "data_offset": 0, 00:15:17.469 "data_size": 65536 00:15:17.469 }, 00:15:17.469 { 00:15:17.469 "name": "BaseBdev3", 00:15:17.469 "uuid": "d7a4fe0e-6042-47be-9393-e46776aa0ae2", 00:15:17.469 "is_configured": true, 00:15:17.469 "data_offset": 0, 00:15:17.469 "data_size": 65536 00:15:17.469 } 00:15:17.469 ] 00:15:17.469 }' 00:15:17.469 22:44:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:17.469 22:44:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:18.036 22:44:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:18.037 22:44:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:18.037 22:44:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:18.037 22:44:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:18.037 22:44:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:18.037 22:44:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:18.037 22:44:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:18.037 22:44:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:18.296 [2024-07-15 22:44:03.043300] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:18.296 22:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:18.296 "name": "Existed_Raid", 00:15:18.296 "aliases": [ 00:15:18.296 "d438caa6-53fb-4111-a257-3ccbfd21fc4d" 00:15:18.296 ], 00:15:18.296 "product_name": 
"Raid Volume", 00:15:18.296 "block_size": 512, 00:15:18.296 "num_blocks": 196608, 00:15:18.296 "uuid": "d438caa6-53fb-4111-a257-3ccbfd21fc4d", 00:15:18.296 "assigned_rate_limits": { 00:15:18.296 "rw_ios_per_sec": 0, 00:15:18.296 "rw_mbytes_per_sec": 0, 00:15:18.296 "r_mbytes_per_sec": 0, 00:15:18.296 "w_mbytes_per_sec": 0 00:15:18.296 }, 00:15:18.296 "claimed": false, 00:15:18.296 "zoned": false, 00:15:18.296 "supported_io_types": { 00:15:18.296 "read": true, 00:15:18.296 "write": true, 00:15:18.296 "unmap": true, 00:15:18.296 "flush": true, 00:15:18.296 "reset": true, 00:15:18.296 "nvme_admin": false, 00:15:18.296 "nvme_io": false, 00:15:18.296 "nvme_io_md": false, 00:15:18.296 "write_zeroes": true, 00:15:18.296 "zcopy": false, 00:15:18.296 "get_zone_info": false, 00:15:18.296 "zone_management": false, 00:15:18.296 "zone_append": false, 00:15:18.296 "compare": false, 00:15:18.296 "compare_and_write": false, 00:15:18.296 "abort": false, 00:15:18.296 "seek_hole": false, 00:15:18.296 "seek_data": false, 00:15:18.296 "copy": false, 00:15:18.296 "nvme_iov_md": false 00:15:18.296 }, 00:15:18.296 "memory_domains": [ 00:15:18.296 { 00:15:18.296 "dma_device_id": "system", 00:15:18.296 "dma_device_type": 1 00:15:18.296 }, 00:15:18.296 { 00:15:18.296 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:18.296 "dma_device_type": 2 00:15:18.296 }, 00:15:18.296 { 00:15:18.296 "dma_device_id": "system", 00:15:18.296 "dma_device_type": 1 00:15:18.296 }, 00:15:18.296 { 00:15:18.296 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:18.296 "dma_device_type": 2 00:15:18.296 }, 00:15:18.296 { 00:15:18.296 "dma_device_id": "system", 00:15:18.296 "dma_device_type": 1 00:15:18.296 }, 00:15:18.296 { 00:15:18.296 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:18.296 "dma_device_type": 2 00:15:18.296 } 00:15:18.296 ], 00:15:18.296 "driver_specific": { 00:15:18.296 "raid": { 00:15:18.296 "uuid": "d438caa6-53fb-4111-a257-3ccbfd21fc4d", 00:15:18.296 "strip_size_kb": 64, 00:15:18.296 "state": 
"online", 00:15:18.296 "raid_level": "concat", 00:15:18.296 "superblock": false, 00:15:18.296 "num_base_bdevs": 3, 00:15:18.296 "num_base_bdevs_discovered": 3, 00:15:18.296 "num_base_bdevs_operational": 3, 00:15:18.296 "base_bdevs_list": [ 00:15:18.296 { 00:15:18.296 "name": "BaseBdev1", 00:15:18.296 "uuid": "7772f8e2-9aaa-4f31-8020-440983344be7", 00:15:18.296 "is_configured": true, 00:15:18.296 "data_offset": 0, 00:15:18.296 "data_size": 65536 00:15:18.296 }, 00:15:18.296 { 00:15:18.296 "name": "BaseBdev2", 00:15:18.296 "uuid": "bdffe747-30d5-4bf9-a3ef-4c33a9c8c0ad", 00:15:18.296 "is_configured": true, 00:15:18.296 "data_offset": 0, 00:15:18.296 "data_size": 65536 00:15:18.296 }, 00:15:18.296 { 00:15:18.296 "name": "BaseBdev3", 00:15:18.296 "uuid": "d7a4fe0e-6042-47be-9393-e46776aa0ae2", 00:15:18.296 "is_configured": true, 00:15:18.296 "data_offset": 0, 00:15:18.296 "data_size": 65536 00:15:18.296 } 00:15:18.296 ] 00:15:18.296 } 00:15:18.296 } 00:15:18.296 }' 00:15:18.296 22:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:18.296 22:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:18.296 BaseBdev2 00:15:18.296 BaseBdev3' 00:15:18.296 22:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:18.296 22:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:18.296 22:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:18.554 22:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:18.554 "name": "BaseBdev1", 00:15:18.554 "aliases": [ 00:15:18.554 "7772f8e2-9aaa-4f31-8020-440983344be7" 00:15:18.554 ], 00:15:18.554 "product_name": "Malloc 
disk", 00:15:18.554 "block_size": 512, 00:15:18.554 "num_blocks": 65536, 00:15:18.554 "uuid": "7772f8e2-9aaa-4f31-8020-440983344be7", 00:15:18.554 "assigned_rate_limits": { 00:15:18.554 "rw_ios_per_sec": 0, 00:15:18.554 "rw_mbytes_per_sec": 0, 00:15:18.554 "r_mbytes_per_sec": 0, 00:15:18.554 "w_mbytes_per_sec": 0 00:15:18.554 }, 00:15:18.554 "claimed": true, 00:15:18.554 "claim_type": "exclusive_write", 00:15:18.554 "zoned": false, 00:15:18.554 "supported_io_types": { 00:15:18.554 "read": true, 00:15:18.554 "write": true, 00:15:18.554 "unmap": true, 00:15:18.554 "flush": true, 00:15:18.554 "reset": true, 00:15:18.554 "nvme_admin": false, 00:15:18.554 "nvme_io": false, 00:15:18.554 "nvme_io_md": false, 00:15:18.554 "write_zeroes": true, 00:15:18.554 "zcopy": true, 00:15:18.554 "get_zone_info": false, 00:15:18.554 "zone_management": false, 00:15:18.554 "zone_append": false, 00:15:18.554 "compare": false, 00:15:18.554 "compare_and_write": false, 00:15:18.554 "abort": true, 00:15:18.554 "seek_hole": false, 00:15:18.554 "seek_data": false, 00:15:18.554 "copy": true, 00:15:18.554 "nvme_iov_md": false 00:15:18.554 }, 00:15:18.554 "memory_domains": [ 00:15:18.554 { 00:15:18.554 "dma_device_id": "system", 00:15:18.554 "dma_device_type": 1 00:15:18.554 }, 00:15:18.554 { 00:15:18.554 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:18.554 "dma_device_type": 2 00:15:18.554 } 00:15:18.554 ], 00:15:18.554 "driver_specific": {} 00:15:18.554 }' 00:15:18.554 22:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:18.554 22:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:18.554 22:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:18.554 22:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:18.812 22:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:18.812 22:44:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:18.812 22:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:18.812 22:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:18.812 22:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:18.812 22:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:18.812 22:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:18.812 22:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:18.812 22:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:18.812 22:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:18.812 22:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:19.071 22:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:19.071 "name": "BaseBdev2", 00:15:19.071 "aliases": [ 00:15:19.071 "bdffe747-30d5-4bf9-a3ef-4c33a9c8c0ad" 00:15:19.071 ], 00:15:19.071 "product_name": "Malloc disk", 00:15:19.071 "block_size": 512, 00:15:19.071 "num_blocks": 65536, 00:15:19.071 "uuid": "bdffe747-30d5-4bf9-a3ef-4c33a9c8c0ad", 00:15:19.071 "assigned_rate_limits": { 00:15:19.071 "rw_ios_per_sec": 0, 00:15:19.071 "rw_mbytes_per_sec": 0, 00:15:19.071 "r_mbytes_per_sec": 0, 00:15:19.071 "w_mbytes_per_sec": 0 00:15:19.071 }, 00:15:19.071 "claimed": true, 00:15:19.071 "claim_type": "exclusive_write", 00:15:19.071 "zoned": false, 00:15:19.071 "supported_io_types": { 00:15:19.071 "read": true, 00:15:19.071 "write": true, 00:15:19.071 "unmap": true, 00:15:19.071 "flush": true, 00:15:19.071 "reset": 
true, 00:15:19.071 "nvme_admin": false, 00:15:19.071 "nvme_io": false, 00:15:19.071 "nvme_io_md": false, 00:15:19.071 "write_zeroes": true, 00:15:19.071 "zcopy": true, 00:15:19.071 "get_zone_info": false, 00:15:19.071 "zone_management": false, 00:15:19.071 "zone_append": false, 00:15:19.071 "compare": false, 00:15:19.071 "compare_and_write": false, 00:15:19.071 "abort": true, 00:15:19.071 "seek_hole": false, 00:15:19.071 "seek_data": false, 00:15:19.071 "copy": true, 00:15:19.071 "nvme_iov_md": false 00:15:19.071 }, 00:15:19.071 "memory_domains": [ 00:15:19.071 { 00:15:19.071 "dma_device_id": "system", 00:15:19.071 "dma_device_type": 1 00:15:19.071 }, 00:15:19.071 { 00:15:19.071 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:19.071 "dma_device_type": 2 00:15:19.071 } 00:15:19.071 ], 00:15:19.071 "driver_specific": {} 00:15:19.071 }' 00:15:19.071 22:44:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:19.330 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:19.330 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:19.330 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:19.330 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:19.330 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:19.330 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:19.330 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:19.330 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:19.330 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:19.588 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:19.588 22:44:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:19.588 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:19.588 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:19.588 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:19.847 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:19.847 "name": "BaseBdev3", 00:15:19.848 "aliases": [ 00:15:19.848 "d7a4fe0e-6042-47be-9393-e46776aa0ae2" 00:15:19.848 ], 00:15:19.848 "product_name": "Malloc disk", 00:15:19.848 "block_size": 512, 00:15:19.848 "num_blocks": 65536, 00:15:19.848 "uuid": "d7a4fe0e-6042-47be-9393-e46776aa0ae2", 00:15:19.848 "assigned_rate_limits": { 00:15:19.848 "rw_ios_per_sec": 0, 00:15:19.848 "rw_mbytes_per_sec": 0, 00:15:19.848 "r_mbytes_per_sec": 0, 00:15:19.848 "w_mbytes_per_sec": 0 00:15:19.848 }, 00:15:19.848 "claimed": true, 00:15:19.848 "claim_type": "exclusive_write", 00:15:19.848 "zoned": false, 00:15:19.848 "supported_io_types": { 00:15:19.848 "read": true, 00:15:19.848 "write": true, 00:15:19.848 "unmap": true, 00:15:19.848 "flush": true, 00:15:19.848 "reset": true, 00:15:19.848 "nvme_admin": false, 00:15:19.848 "nvme_io": false, 00:15:19.848 "nvme_io_md": false, 00:15:19.848 "write_zeroes": true, 00:15:19.848 "zcopy": true, 00:15:19.848 "get_zone_info": false, 00:15:19.848 "zone_management": false, 00:15:19.848 "zone_append": false, 00:15:19.848 "compare": false, 00:15:19.848 "compare_and_write": false, 00:15:19.848 "abort": true, 00:15:19.848 "seek_hole": false, 00:15:19.848 "seek_data": false, 00:15:19.848 "copy": true, 00:15:19.848 "nvme_iov_md": false 00:15:19.848 }, 00:15:19.848 "memory_domains": [ 00:15:19.848 { 00:15:19.848 "dma_device_id": "system", 00:15:19.848 
"dma_device_type": 1 00:15:19.848 }, 00:15:19.848 { 00:15:19.848 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:19.848 "dma_device_type": 2 00:15:19.848 } 00:15:19.848 ], 00:15:19.848 "driver_specific": {} 00:15:19.848 }' 00:15:19.848 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:19.848 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:19.848 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:19.848 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:19.848 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:19.848 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:19.848 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:20.106 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:20.106 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:20.106 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:20.106 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:20.106 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:20.106 22:44:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:20.365 [2024-07-15 22:44:05.144623] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:20.365 [2024-07-15 22:44:05.144655] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:20.365 [2024-07-15 22:44:05.144696] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: 
raid_bdev_destruct 00:15:20.365 22:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:20.365 22:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:15:20.365 22:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:20.365 22:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:20.365 22:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:20.365 22:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:15:20.365 22:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:20.365 22:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:20.365 22:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:20.365 22:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:20.365 22:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:20.365 22:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:20.365 22:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:20.365 22:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:20.365 22:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:20.365 22:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:20.365 22:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:15:20.624 22:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:20.624 "name": "Existed_Raid", 00:15:20.624 "uuid": "d438caa6-53fb-4111-a257-3ccbfd21fc4d", 00:15:20.624 "strip_size_kb": 64, 00:15:20.625 "state": "offline", 00:15:20.625 "raid_level": "concat", 00:15:20.625 "superblock": false, 00:15:20.625 "num_base_bdevs": 3, 00:15:20.625 "num_base_bdevs_discovered": 2, 00:15:20.625 "num_base_bdevs_operational": 2, 00:15:20.625 "base_bdevs_list": [ 00:15:20.625 { 00:15:20.625 "name": null, 00:15:20.625 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:20.625 "is_configured": false, 00:15:20.625 "data_offset": 0, 00:15:20.625 "data_size": 65536 00:15:20.625 }, 00:15:20.625 { 00:15:20.625 "name": "BaseBdev2", 00:15:20.625 "uuid": "bdffe747-30d5-4bf9-a3ef-4c33a9c8c0ad", 00:15:20.625 "is_configured": true, 00:15:20.625 "data_offset": 0, 00:15:20.625 "data_size": 65536 00:15:20.625 }, 00:15:20.625 { 00:15:20.625 "name": "BaseBdev3", 00:15:20.625 "uuid": "d7a4fe0e-6042-47be-9393-e46776aa0ae2", 00:15:20.625 "is_configured": true, 00:15:20.625 "data_offset": 0, 00:15:20.625 "data_size": 65536 00:15:20.625 } 00:15:20.625 ] 00:15:20.625 }' 00:15:20.625 22:44:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:20.625 22:44:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:21.192 22:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:21.192 22:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:21.192 22:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:21.192 22:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:21.450 22:44:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:21.450 22:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:21.450 22:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:21.709 [2024-07-15 22:44:06.482111] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:21.709 22:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:21.709 22:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:21.709 22:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:21.709 22:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:21.969 22:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:21.969 22:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:21.969 22:44:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:22.228 [2024-07-15 22:44:06.987883] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:22.228 [2024-07-15 22:44:06.987936] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1dcf400 name Existed_Raid, state offline 00:15:22.228 22:44:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:22.228 22:44:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:22.228 22:44:07 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:22.228 22:44:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:22.486 22:44:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:22.486 22:44:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:22.486 22:44:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:22.486 22:44:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:22.487 22:44:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:22.487 22:44:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:22.745 BaseBdev2 00:15:22.745 22:44:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:22.745 22:44:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:22.745 22:44:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:22.745 22:44:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:22.745 22:44:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:22.745 22:44:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:22.745 22:44:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:23.004 22:44:07 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:23.263 [ 00:15:23.263 { 00:15:23.263 "name": "BaseBdev2", 00:15:23.263 "aliases": [ 00:15:23.263 "6b0b73ca-eea4-4033-b98c-418135603511" 00:15:23.263 ], 00:15:23.263 "product_name": "Malloc disk", 00:15:23.263 "block_size": 512, 00:15:23.263 "num_blocks": 65536, 00:15:23.263 "uuid": "6b0b73ca-eea4-4033-b98c-418135603511", 00:15:23.263 "assigned_rate_limits": { 00:15:23.263 "rw_ios_per_sec": 0, 00:15:23.263 "rw_mbytes_per_sec": 0, 00:15:23.263 "r_mbytes_per_sec": 0, 00:15:23.263 "w_mbytes_per_sec": 0 00:15:23.263 }, 00:15:23.263 "claimed": false, 00:15:23.263 "zoned": false, 00:15:23.263 "supported_io_types": { 00:15:23.263 "read": true, 00:15:23.263 "write": true, 00:15:23.263 "unmap": true, 00:15:23.263 "flush": true, 00:15:23.263 "reset": true, 00:15:23.263 "nvme_admin": false, 00:15:23.263 "nvme_io": false, 00:15:23.263 "nvme_io_md": false, 00:15:23.263 "write_zeroes": true, 00:15:23.263 "zcopy": true, 00:15:23.263 "get_zone_info": false, 00:15:23.263 "zone_management": false, 00:15:23.263 "zone_append": false, 00:15:23.263 "compare": false, 00:15:23.263 "compare_and_write": false, 00:15:23.263 "abort": true, 00:15:23.263 "seek_hole": false, 00:15:23.263 "seek_data": false, 00:15:23.263 "copy": true, 00:15:23.263 "nvme_iov_md": false 00:15:23.263 }, 00:15:23.263 "memory_domains": [ 00:15:23.263 { 00:15:23.263 "dma_device_id": "system", 00:15:23.263 "dma_device_type": 1 00:15:23.263 }, 00:15:23.263 { 00:15:23.263 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:23.263 "dma_device_type": 2 00:15:23.263 } 00:15:23.263 ], 00:15:23.263 "driver_specific": {} 00:15:23.263 } 00:15:23.263 ] 00:15:23.263 22:44:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:23.263 22:44:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:23.263 22:44:07 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:23.263 22:44:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:23.522 BaseBdev3 00:15:23.522 22:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:23.522 22:44:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:23.522 22:44:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:23.522 22:44:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:23.522 22:44:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:23.522 22:44:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:23.522 22:44:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:23.780 22:44:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:24.039 [ 00:15:24.039 { 00:15:24.039 "name": "BaseBdev3", 00:15:24.039 "aliases": [ 00:15:24.039 "5c30885b-f855-4d25-8a1f-0a7c11aecad3" 00:15:24.039 ], 00:15:24.039 "product_name": "Malloc disk", 00:15:24.039 "block_size": 512, 00:15:24.039 "num_blocks": 65536, 00:15:24.039 "uuid": "5c30885b-f855-4d25-8a1f-0a7c11aecad3", 00:15:24.039 "assigned_rate_limits": { 00:15:24.039 "rw_ios_per_sec": 0, 00:15:24.039 "rw_mbytes_per_sec": 0, 00:15:24.039 "r_mbytes_per_sec": 0, 00:15:24.039 "w_mbytes_per_sec": 0 00:15:24.039 }, 00:15:24.039 "claimed": false, 00:15:24.039 
"zoned": false, 00:15:24.039 "supported_io_types": { 00:15:24.039 "read": true, 00:15:24.039 "write": true, 00:15:24.039 "unmap": true, 00:15:24.039 "flush": true, 00:15:24.039 "reset": true, 00:15:24.039 "nvme_admin": false, 00:15:24.039 "nvme_io": false, 00:15:24.039 "nvme_io_md": false, 00:15:24.039 "write_zeroes": true, 00:15:24.039 "zcopy": true, 00:15:24.039 "get_zone_info": false, 00:15:24.039 "zone_management": false, 00:15:24.039 "zone_append": false, 00:15:24.039 "compare": false, 00:15:24.039 "compare_and_write": false, 00:15:24.039 "abort": true, 00:15:24.039 "seek_hole": false, 00:15:24.039 "seek_data": false, 00:15:24.039 "copy": true, 00:15:24.039 "nvme_iov_md": false 00:15:24.039 }, 00:15:24.039 "memory_domains": [ 00:15:24.039 { 00:15:24.039 "dma_device_id": "system", 00:15:24.039 "dma_device_type": 1 00:15:24.039 }, 00:15:24.039 { 00:15:24.039 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:24.039 "dma_device_type": 2 00:15:24.039 } 00:15:24.039 ], 00:15:24.039 "driver_specific": {} 00:15:24.039 } 00:15:24.039 ] 00:15:24.039 22:44:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:24.039 22:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:24.039 22:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:24.039 22:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:24.298 [2024-07-15 22:44:08.967720] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:24.298 [2024-07-15 22:44:08.967764] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:24.298 [2024-07-15 22:44:08.967782] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is 
claimed 00:15:24.298 [2024-07-15 22:44:08.969129] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:24.298 22:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:24.298 22:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:24.298 22:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:24.298 22:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:24.298 22:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:24.298 22:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:24.298 22:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:24.298 22:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:24.298 22:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:24.298 22:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:24.298 22:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.298 22:44:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:24.556 22:44:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:24.556 "name": "Existed_Raid", 00:15:24.556 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:24.556 "strip_size_kb": 64, 00:15:24.556 "state": "configuring", 00:15:24.556 "raid_level": "concat", 00:15:24.556 "superblock": false, 00:15:24.556 
"num_base_bdevs": 3, 00:15:24.556 "num_base_bdevs_discovered": 2, 00:15:24.556 "num_base_bdevs_operational": 3, 00:15:24.556 "base_bdevs_list": [ 00:15:24.556 { 00:15:24.556 "name": "BaseBdev1", 00:15:24.556 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:24.556 "is_configured": false, 00:15:24.556 "data_offset": 0, 00:15:24.556 "data_size": 0 00:15:24.556 }, 00:15:24.556 { 00:15:24.556 "name": "BaseBdev2", 00:15:24.556 "uuid": "6b0b73ca-eea4-4033-b98c-418135603511", 00:15:24.556 "is_configured": true, 00:15:24.556 "data_offset": 0, 00:15:24.556 "data_size": 65536 00:15:24.556 }, 00:15:24.556 { 00:15:24.556 "name": "BaseBdev3", 00:15:24.556 "uuid": "5c30885b-f855-4d25-8a1f-0a7c11aecad3", 00:15:24.556 "is_configured": true, 00:15:24.556 "data_offset": 0, 00:15:24.556 "data_size": 65536 00:15:24.556 } 00:15:24.556 ] 00:15:24.556 }' 00:15:24.556 22:44:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:24.556 22:44:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:25.122 22:44:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:25.380 [2024-07-15 22:44:10.046569] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:25.380 22:44:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:25.380 22:44:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:25.380 22:44:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:25.380 22:44:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:25.380 22:44:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:25.380 22:44:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:25.380 22:44:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:25.380 22:44:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:25.380 22:44:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:25.380 22:44:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:25.380 22:44:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:25.380 22:44:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:25.638 22:44:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:25.639 "name": "Existed_Raid", 00:15:25.639 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:25.639 "strip_size_kb": 64, 00:15:25.639 "state": "configuring", 00:15:25.639 "raid_level": "concat", 00:15:25.639 "superblock": false, 00:15:25.639 "num_base_bdevs": 3, 00:15:25.639 "num_base_bdevs_discovered": 1, 00:15:25.639 "num_base_bdevs_operational": 3, 00:15:25.639 "base_bdevs_list": [ 00:15:25.639 { 00:15:25.639 "name": "BaseBdev1", 00:15:25.639 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:25.639 "is_configured": false, 00:15:25.639 "data_offset": 0, 00:15:25.639 "data_size": 0 00:15:25.639 }, 00:15:25.639 { 00:15:25.639 "name": null, 00:15:25.639 "uuid": "6b0b73ca-eea4-4033-b98c-418135603511", 00:15:25.639 "is_configured": false, 00:15:25.639 "data_offset": 0, 00:15:25.639 "data_size": 65536 00:15:25.639 }, 00:15:25.639 { 00:15:25.639 "name": "BaseBdev3", 00:15:25.639 "uuid": "5c30885b-f855-4d25-8a1f-0a7c11aecad3", 00:15:25.639 "is_configured": true, 00:15:25.639 "data_offset": 0, 
00:15:25.639 "data_size": 65536 00:15:25.639 } 00:15:25.639 ] 00:15:25.639 }' 00:15:25.639 22:44:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:25.639 22:44:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:26.207 22:44:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.207 22:44:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:26.465 22:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:26.465 22:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:26.761 [2024-07-15 22:44:11.414791] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:26.761 BaseBdev1 00:15:26.761 22:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:26.761 22:44:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:26.761 22:44:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:26.761 22:44:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:26.761 22:44:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:26.761 22:44:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:26.761 22:44:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:27.021 22:44:11 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:27.021 [ 00:15:27.021 { 00:15:27.021 "name": "BaseBdev1", 00:15:27.021 "aliases": [ 00:15:27.021 "03dd3086-9b75-43b1-a55a-8da5a962bf0f" 00:15:27.021 ], 00:15:27.021 "product_name": "Malloc disk", 00:15:27.021 "block_size": 512, 00:15:27.021 "num_blocks": 65536, 00:15:27.021 "uuid": "03dd3086-9b75-43b1-a55a-8da5a962bf0f", 00:15:27.021 "assigned_rate_limits": { 00:15:27.021 "rw_ios_per_sec": 0, 00:15:27.021 "rw_mbytes_per_sec": 0, 00:15:27.021 "r_mbytes_per_sec": 0, 00:15:27.021 "w_mbytes_per_sec": 0 00:15:27.021 }, 00:15:27.021 "claimed": true, 00:15:27.021 "claim_type": "exclusive_write", 00:15:27.021 "zoned": false, 00:15:27.021 "supported_io_types": { 00:15:27.021 "read": true, 00:15:27.021 "write": true, 00:15:27.021 "unmap": true, 00:15:27.021 "flush": true, 00:15:27.021 "reset": true, 00:15:27.021 "nvme_admin": false, 00:15:27.021 "nvme_io": false, 00:15:27.021 "nvme_io_md": false, 00:15:27.021 "write_zeroes": true, 00:15:27.021 "zcopy": true, 00:15:27.021 "get_zone_info": false, 00:15:27.021 "zone_management": false, 00:15:27.021 "zone_append": false, 00:15:27.021 "compare": false, 00:15:27.021 "compare_and_write": false, 00:15:27.021 "abort": true, 00:15:27.021 "seek_hole": false, 00:15:27.021 "seek_data": false, 00:15:27.021 "copy": true, 00:15:27.021 "nvme_iov_md": false 00:15:27.021 }, 00:15:27.021 "memory_domains": [ 00:15:27.021 { 00:15:27.021 "dma_device_id": "system", 00:15:27.021 "dma_device_type": 1 00:15:27.021 }, 00:15:27.021 { 00:15:27.021 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:27.021 "dma_device_type": 2 00:15:27.021 } 00:15:27.021 ], 00:15:27.021 "driver_specific": {} 00:15:27.021 } 00:15:27.021 ] 00:15:27.280 22:44:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:27.280 22:44:11 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:27.280 22:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:27.280 22:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:27.280 22:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:27.280 22:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:27.280 22:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:27.280 22:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:27.280 22:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:27.280 22:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:27.280 22:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:27.280 22:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:27.280 22:44:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:27.539 22:44:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:27.539 "name": "Existed_Raid", 00:15:27.539 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:27.539 "strip_size_kb": 64, 00:15:27.539 "state": "configuring", 00:15:27.539 "raid_level": "concat", 00:15:27.539 "superblock": false, 00:15:27.539 "num_base_bdevs": 3, 00:15:27.539 "num_base_bdevs_discovered": 2, 00:15:27.539 "num_base_bdevs_operational": 3, 00:15:27.539 "base_bdevs_list": [ 00:15:27.539 { 
00:15:27.539 "name": "BaseBdev1", 00:15:27.539 "uuid": "03dd3086-9b75-43b1-a55a-8da5a962bf0f", 00:15:27.539 "is_configured": true, 00:15:27.539 "data_offset": 0, 00:15:27.539 "data_size": 65536 00:15:27.539 }, 00:15:27.539 { 00:15:27.539 "name": null, 00:15:27.539 "uuid": "6b0b73ca-eea4-4033-b98c-418135603511", 00:15:27.539 "is_configured": false, 00:15:27.539 "data_offset": 0, 00:15:27.539 "data_size": 65536 00:15:27.539 }, 00:15:27.539 { 00:15:27.539 "name": "BaseBdev3", 00:15:27.539 "uuid": "5c30885b-f855-4d25-8a1f-0a7c11aecad3", 00:15:27.539 "is_configured": true, 00:15:27.539 "data_offset": 0, 00:15:27.539 "data_size": 65536 00:15:27.539 } 00:15:27.539 ] 00:15:27.539 }' 00:15:27.539 22:44:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:27.539 22:44:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:28.104 22:44:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.104 22:44:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:28.363 22:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:28.363 22:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:28.363 [2024-07-15 22:44:13.247699] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:28.363 22:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:28.363 22:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:28.363 22:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 
-- # local expected_state=configuring 00:15:28.363 22:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:28.363 22:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:28.363 22:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:28.363 22:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:28.363 22:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:28.363 22:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:28.363 22:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:28.363 22:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.363 22:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:28.620 22:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:28.620 "name": "Existed_Raid", 00:15:28.620 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:28.620 "strip_size_kb": 64, 00:15:28.620 "state": "configuring", 00:15:28.620 "raid_level": "concat", 00:15:28.620 "superblock": false, 00:15:28.620 "num_base_bdevs": 3, 00:15:28.620 "num_base_bdevs_discovered": 1, 00:15:28.620 "num_base_bdevs_operational": 3, 00:15:28.620 "base_bdevs_list": [ 00:15:28.620 { 00:15:28.620 "name": "BaseBdev1", 00:15:28.620 "uuid": "03dd3086-9b75-43b1-a55a-8da5a962bf0f", 00:15:28.621 "is_configured": true, 00:15:28.621 "data_offset": 0, 00:15:28.621 "data_size": 65536 00:15:28.621 }, 00:15:28.621 { 00:15:28.621 "name": null, 00:15:28.621 "uuid": "6b0b73ca-eea4-4033-b98c-418135603511", 00:15:28.621 
"is_configured": false, 00:15:28.621 "data_offset": 0, 00:15:28.621 "data_size": 65536 00:15:28.621 }, 00:15:28.621 { 00:15:28.621 "name": null, 00:15:28.621 "uuid": "5c30885b-f855-4d25-8a1f-0a7c11aecad3", 00:15:28.621 "is_configured": false, 00:15:28.621 "data_offset": 0, 00:15:28.621 "data_size": 65536 00:15:28.621 } 00:15:28.621 ] 00:15:28.621 }' 00:15:28.621 22:44:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:28.621 22:44:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:29.555 22:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.555 22:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:29.555 22:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:29.555 22:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:29.817 [2024-07-15 22:44:14.571236] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:29.817 22:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:29.817 22:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:29.817 22:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:29.817 22:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:29.817 22:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:29.817 22:44:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:29.817 22:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:29.817 22:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:29.817 22:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:29.818 22:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:29.818 22:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.818 22:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:30.076 22:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:30.076 "name": "Existed_Raid", 00:15:30.076 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:30.076 "strip_size_kb": 64, 00:15:30.076 "state": "configuring", 00:15:30.076 "raid_level": "concat", 00:15:30.076 "superblock": false, 00:15:30.076 "num_base_bdevs": 3, 00:15:30.076 "num_base_bdevs_discovered": 2, 00:15:30.076 "num_base_bdevs_operational": 3, 00:15:30.076 "base_bdevs_list": [ 00:15:30.076 { 00:15:30.076 "name": "BaseBdev1", 00:15:30.076 "uuid": "03dd3086-9b75-43b1-a55a-8da5a962bf0f", 00:15:30.076 "is_configured": true, 00:15:30.076 "data_offset": 0, 00:15:30.076 "data_size": 65536 00:15:30.076 }, 00:15:30.076 { 00:15:30.077 "name": null, 00:15:30.077 "uuid": "6b0b73ca-eea4-4033-b98c-418135603511", 00:15:30.077 "is_configured": false, 00:15:30.077 "data_offset": 0, 00:15:30.077 "data_size": 65536 00:15:30.077 }, 00:15:30.077 { 00:15:30.077 "name": "BaseBdev3", 00:15:30.077 "uuid": "5c30885b-f855-4d25-8a1f-0a7c11aecad3", 00:15:30.077 "is_configured": true, 00:15:30.077 "data_offset": 0, 
00:15:30.077 "data_size": 65536 00:15:30.077 } 00:15:30.077 ] 00:15:30.077 }' 00:15:30.077 22:44:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:30.077 22:44:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:30.643 22:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:30.643 22:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:30.901 22:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:30.901 22:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:31.159 [2024-07-15 22:44:15.886743] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:31.159 22:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:31.159 22:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:31.159 22:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:31.159 22:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:31.159 22:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:31.159 22:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:31.159 22:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:31.159 22:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:31.159 
22:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:31.159 22:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:31.159 22:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:31.159 22:44:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:31.419 22:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:31.419 "name": "Existed_Raid", 00:15:31.419 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:31.419 "strip_size_kb": 64, 00:15:31.419 "state": "configuring", 00:15:31.419 "raid_level": "concat", 00:15:31.419 "superblock": false, 00:15:31.419 "num_base_bdevs": 3, 00:15:31.419 "num_base_bdevs_discovered": 1, 00:15:31.419 "num_base_bdevs_operational": 3, 00:15:31.419 "base_bdevs_list": [ 00:15:31.419 { 00:15:31.419 "name": null, 00:15:31.419 "uuid": "03dd3086-9b75-43b1-a55a-8da5a962bf0f", 00:15:31.419 "is_configured": false, 00:15:31.419 "data_offset": 0, 00:15:31.419 "data_size": 65536 00:15:31.419 }, 00:15:31.419 { 00:15:31.419 "name": null, 00:15:31.419 "uuid": "6b0b73ca-eea4-4033-b98c-418135603511", 00:15:31.419 "is_configured": false, 00:15:31.419 "data_offset": 0, 00:15:31.419 "data_size": 65536 00:15:31.419 }, 00:15:31.419 { 00:15:31.419 "name": "BaseBdev3", 00:15:31.419 "uuid": "5c30885b-f855-4d25-8a1f-0a7c11aecad3", 00:15:31.419 "is_configured": true, 00:15:31.419 "data_offset": 0, 00:15:31.419 "data_size": 65536 00:15:31.419 } 00:15:31.419 ] 00:15:31.419 }' 00:15:31.419 22:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:31.419 22:44:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:31.985 22:44:16 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:31.985 22:44:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:32.248 22:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:32.248 22:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:32.507 [2024-07-15 22:44:17.246780] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:32.507 22:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:32.507 22:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:32.507 22:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:32.507 22:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:32.507 22:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:32.507 22:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:32.507 22:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:32.507 22:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:32.507 22:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:32.507 22:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:32.507 22:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:32.507 22:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:32.766 22:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:32.766 "name": "Existed_Raid", 00:15:32.766 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:32.766 "strip_size_kb": 64, 00:15:32.766 "state": "configuring", 00:15:32.766 "raid_level": "concat", 00:15:32.766 "superblock": false, 00:15:32.766 "num_base_bdevs": 3, 00:15:32.766 "num_base_bdevs_discovered": 2, 00:15:32.766 "num_base_bdevs_operational": 3, 00:15:32.766 "base_bdevs_list": [ 00:15:32.766 { 00:15:32.766 "name": null, 00:15:32.766 "uuid": "03dd3086-9b75-43b1-a55a-8da5a962bf0f", 00:15:32.766 "is_configured": false, 00:15:32.766 "data_offset": 0, 00:15:32.766 "data_size": 65536 00:15:32.766 }, 00:15:32.766 { 00:15:32.766 "name": "BaseBdev2", 00:15:32.766 "uuid": "6b0b73ca-eea4-4033-b98c-418135603511", 00:15:32.766 "is_configured": true, 00:15:32.766 "data_offset": 0, 00:15:32.766 "data_size": 65536 00:15:32.766 }, 00:15:32.766 { 00:15:32.766 "name": "BaseBdev3", 00:15:32.766 "uuid": "5c30885b-f855-4d25-8a1f-0a7c11aecad3", 00:15:32.766 "is_configured": true, 00:15:32.766 "data_offset": 0, 00:15:32.766 "data_size": 65536 00:15:32.766 } 00:15:32.766 ] 00:15:32.766 }' 00:15:32.766 22:44:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:32.766 22:44:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:33.333 22:44:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:33.333 22:44:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:33.592 
22:44:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:33.592 22:44:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:33.592 22:44:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:33.852 22:44:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 03dd3086-9b75-43b1-a55a-8da5a962bf0f 00:15:34.111 [2024-07-15 22:44:18.847633] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:34.111 [2024-07-15 22:44:18.847675] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1dcd450 00:15:34.111 [2024-07-15 22:44:18.847683] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:15:34.111 [2024-07-15 22:44:18.847881] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1dceed0 00:15:34.111 [2024-07-15 22:44:18.848010] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1dcd450 00:15:34.111 [2024-07-15 22:44:18.848021] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1dcd450 00:15:34.111 [2024-07-15 22:44:18.848192] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:34.111 NewBaseBdev 00:15:34.111 22:44:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:34.111 22:44:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:34.111 22:44:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:34.111 22:44:18 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:15:34.111 22:44:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:34.111 22:44:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:34.111 22:44:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:34.370 22:44:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:34.629 [ 00:15:34.629 { 00:15:34.629 "name": "NewBaseBdev", 00:15:34.629 "aliases": [ 00:15:34.629 "03dd3086-9b75-43b1-a55a-8da5a962bf0f" 00:15:34.629 ], 00:15:34.629 "product_name": "Malloc disk", 00:15:34.629 "block_size": 512, 00:15:34.629 "num_blocks": 65536, 00:15:34.629 "uuid": "03dd3086-9b75-43b1-a55a-8da5a962bf0f", 00:15:34.629 "assigned_rate_limits": { 00:15:34.629 "rw_ios_per_sec": 0, 00:15:34.629 "rw_mbytes_per_sec": 0, 00:15:34.629 "r_mbytes_per_sec": 0, 00:15:34.629 "w_mbytes_per_sec": 0 00:15:34.629 }, 00:15:34.629 "claimed": true, 00:15:34.629 "claim_type": "exclusive_write", 00:15:34.629 "zoned": false, 00:15:34.629 "supported_io_types": { 00:15:34.629 "read": true, 00:15:34.629 "write": true, 00:15:34.629 "unmap": true, 00:15:34.629 "flush": true, 00:15:34.629 "reset": true, 00:15:34.629 "nvme_admin": false, 00:15:34.629 "nvme_io": false, 00:15:34.629 "nvme_io_md": false, 00:15:34.629 "write_zeroes": true, 00:15:34.629 "zcopy": true, 00:15:34.629 "get_zone_info": false, 00:15:34.629 "zone_management": false, 00:15:34.629 "zone_append": false, 00:15:34.629 "compare": false, 00:15:34.629 "compare_and_write": false, 00:15:34.629 "abort": true, 00:15:34.629 "seek_hole": false, 00:15:34.629 "seek_data": false, 00:15:34.629 "copy": true, 00:15:34.629 "nvme_iov_md": 
false 00:15:34.629 }, 00:15:34.629 "memory_domains": [ 00:15:34.629 { 00:15:34.629 "dma_device_id": "system", 00:15:34.629 "dma_device_type": 1 00:15:34.629 }, 00:15:34.629 { 00:15:34.629 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:34.629 "dma_device_type": 2 00:15:34.629 } 00:15:34.629 ], 00:15:34.629 "driver_specific": {} 00:15:34.629 } 00:15:34.629 ] 00:15:34.629 22:44:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:34.629 22:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:34.629 22:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:34.629 22:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:34.629 22:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:34.629 22:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:34.629 22:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:34.629 22:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:34.629 22:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:34.629 22:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:34.629 22:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:34.629 22:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.629 22:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:34.888 22:44:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:34.888 "name": "Existed_Raid", 00:15:34.888 "uuid": "275f913d-67cf-47cd-b2fa-5d85d2f088fa", 00:15:34.888 "strip_size_kb": 64, 00:15:34.888 "state": "online", 00:15:34.888 "raid_level": "concat", 00:15:34.888 "superblock": false, 00:15:34.888 "num_base_bdevs": 3, 00:15:34.888 "num_base_bdevs_discovered": 3, 00:15:34.888 "num_base_bdevs_operational": 3, 00:15:34.888 "base_bdevs_list": [ 00:15:34.888 { 00:15:34.888 "name": "NewBaseBdev", 00:15:34.888 "uuid": "03dd3086-9b75-43b1-a55a-8da5a962bf0f", 00:15:34.888 "is_configured": true, 00:15:34.888 "data_offset": 0, 00:15:34.888 "data_size": 65536 00:15:34.888 }, 00:15:34.888 { 00:15:34.888 "name": "BaseBdev2", 00:15:34.888 "uuid": "6b0b73ca-eea4-4033-b98c-418135603511", 00:15:34.888 "is_configured": true, 00:15:34.888 "data_offset": 0, 00:15:34.888 "data_size": 65536 00:15:34.888 }, 00:15:34.888 { 00:15:34.888 "name": "BaseBdev3", 00:15:34.888 "uuid": "5c30885b-f855-4d25-8a1f-0a7c11aecad3", 00:15:34.888 "is_configured": true, 00:15:34.888 "data_offset": 0, 00:15:34.888 "data_size": 65536 00:15:34.888 } 00:15:34.888 ] 00:15:34.888 }' 00:15:34.888 22:44:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:34.888 22:44:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:35.457 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:35.457 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:35.457 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:35.457 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:35.457 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:35.457 22:44:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:35.457 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:35.457 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:35.457 [2024-07-15 22:44:20.363957] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:35.716 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:35.716 "name": "Existed_Raid", 00:15:35.716 "aliases": [ 00:15:35.716 "275f913d-67cf-47cd-b2fa-5d85d2f088fa" 00:15:35.716 ], 00:15:35.716 "product_name": "Raid Volume", 00:15:35.716 "block_size": 512, 00:15:35.716 "num_blocks": 196608, 00:15:35.716 "uuid": "275f913d-67cf-47cd-b2fa-5d85d2f088fa", 00:15:35.716 "assigned_rate_limits": { 00:15:35.716 "rw_ios_per_sec": 0, 00:15:35.716 "rw_mbytes_per_sec": 0, 00:15:35.716 "r_mbytes_per_sec": 0, 00:15:35.716 "w_mbytes_per_sec": 0 00:15:35.716 }, 00:15:35.716 "claimed": false, 00:15:35.716 "zoned": false, 00:15:35.716 "supported_io_types": { 00:15:35.716 "read": true, 00:15:35.716 "write": true, 00:15:35.716 "unmap": true, 00:15:35.716 "flush": true, 00:15:35.716 "reset": true, 00:15:35.716 "nvme_admin": false, 00:15:35.716 "nvme_io": false, 00:15:35.716 "nvme_io_md": false, 00:15:35.716 "write_zeroes": true, 00:15:35.716 "zcopy": false, 00:15:35.716 "get_zone_info": false, 00:15:35.716 "zone_management": false, 00:15:35.716 "zone_append": false, 00:15:35.716 "compare": false, 00:15:35.716 "compare_and_write": false, 00:15:35.716 "abort": false, 00:15:35.716 "seek_hole": false, 00:15:35.716 "seek_data": false, 00:15:35.716 "copy": false, 00:15:35.716 "nvme_iov_md": false 00:15:35.716 }, 00:15:35.716 "memory_domains": [ 00:15:35.716 { 00:15:35.716 "dma_device_id": "system", 00:15:35.716 "dma_device_type": 1 00:15:35.716 }, 
00:15:35.716 { 00:15:35.716 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:35.716 "dma_device_type": 2 00:15:35.716 }, 00:15:35.716 { 00:15:35.716 "dma_device_id": "system", 00:15:35.716 "dma_device_type": 1 00:15:35.716 }, 00:15:35.716 { 00:15:35.716 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:35.716 "dma_device_type": 2 00:15:35.716 }, 00:15:35.716 { 00:15:35.716 "dma_device_id": "system", 00:15:35.716 "dma_device_type": 1 00:15:35.716 }, 00:15:35.716 { 00:15:35.716 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:35.716 "dma_device_type": 2 00:15:35.716 } 00:15:35.716 ], 00:15:35.716 "driver_specific": { 00:15:35.717 "raid": { 00:15:35.717 "uuid": "275f913d-67cf-47cd-b2fa-5d85d2f088fa", 00:15:35.717 "strip_size_kb": 64, 00:15:35.717 "state": "online", 00:15:35.717 "raid_level": "concat", 00:15:35.717 "superblock": false, 00:15:35.717 "num_base_bdevs": 3, 00:15:35.717 "num_base_bdevs_discovered": 3, 00:15:35.717 "num_base_bdevs_operational": 3, 00:15:35.717 "base_bdevs_list": [ 00:15:35.717 { 00:15:35.717 "name": "NewBaseBdev", 00:15:35.717 "uuid": "03dd3086-9b75-43b1-a55a-8da5a962bf0f", 00:15:35.717 "is_configured": true, 00:15:35.717 "data_offset": 0, 00:15:35.717 "data_size": 65536 00:15:35.717 }, 00:15:35.717 { 00:15:35.717 "name": "BaseBdev2", 00:15:35.717 "uuid": "6b0b73ca-eea4-4033-b98c-418135603511", 00:15:35.717 "is_configured": true, 00:15:35.717 "data_offset": 0, 00:15:35.717 "data_size": 65536 00:15:35.717 }, 00:15:35.717 { 00:15:35.717 "name": "BaseBdev3", 00:15:35.717 "uuid": "5c30885b-f855-4d25-8a1f-0a7c11aecad3", 00:15:35.717 "is_configured": true, 00:15:35.717 "data_offset": 0, 00:15:35.717 "data_size": 65536 00:15:35.717 } 00:15:35.717 ] 00:15:35.717 } 00:15:35.717 } 00:15:35.717 }' 00:15:35.717 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:35.717 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:15:35.717 BaseBdev2 00:15:35.717 BaseBdev3' 00:15:35.717 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:35.717 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:35.717 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:35.976 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:35.976 "name": "NewBaseBdev", 00:15:35.976 "aliases": [ 00:15:35.976 "03dd3086-9b75-43b1-a55a-8da5a962bf0f" 00:15:35.976 ], 00:15:35.976 "product_name": "Malloc disk", 00:15:35.976 "block_size": 512, 00:15:35.976 "num_blocks": 65536, 00:15:35.976 "uuid": "03dd3086-9b75-43b1-a55a-8da5a962bf0f", 00:15:35.976 "assigned_rate_limits": { 00:15:35.976 "rw_ios_per_sec": 0, 00:15:35.976 "rw_mbytes_per_sec": 0, 00:15:35.976 "r_mbytes_per_sec": 0, 00:15:35.976 "w_mbytes_per_sec": 0 00:15:35.976 }, 00:15:35.976 "claimed": true, 00:15:35.976 "claim_type": "exclusive_write", 00:15:35.976 "zoned": false, 00:15:35.976 "supported_io_types": { 00:15:35.976 "read": true, 00:15:35.976 "write": true, 00:15:35.976 "unmap": true, 00:15:35.976 "flush": true, 00:15:35.976 "reset": true, 00:15:35.976 "nvme_admin": false, 00:15:35.976 "nvme_io": false, 00:15:35.976 "nvme_io_md": false, 00:15:35.976 "write_zeroes": true, 00:15:35.976 "zcopy": true, 00:15:35.976 "get_zone_info": false, 00:15:35.976 "zone_management": false, 00:15:35.976 "zone_append": false, 00:15:35.976 "compare": false, 00:15:35.976 "compare_and_write": false, 00:15:35.976 "abort": true, 00:15:35.976 "seek_hole": false, 00:15:35.976 "seek_data": false, 00:15:35.976 "copy": true, 00:15:35.976 "nvme_iov_md": false 00:15:35.976 }, 00:15:35.976 "memory_domains": [ 00:15:35.976 { 00:15:35.976 "dma_device_id": "system", 00:15:35.976 
"dma_device_type": 1 00:15:35.976 }, 00:15:35.976 { 00:15:35.976 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:35.976 "dma_device_type": 2 00:15:35.976 } 00:15:35.976 ], 00:15:35.976 "driver_specific": {} 00:15:35.976 }' 00:15:35.976 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:35.976 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:35.976 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:35.976 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:35.976 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:35.976 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:35.976 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:36.235 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:36.235 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:36.235 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:36.235 22:44:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:36.235 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:36.235 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:36.235 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:36.235 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:36.494 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:36.494 "name": 
"BaseBdev2", 00:15:36.494 "aliases": [ 00:15:36.494 "6b0b73ca-eea4-4033-b98c-418135603511" 00:15:36.494 ], 00:15:36.494 "product_name": "Malloc disk", 00:15:36.494 "block_size": 512, 00:15:36.494 "num_blocks": 65536, 00:15:36.494 "uuid": "6b0b73ca-eea4-4033-b98c-418135603511", 00:15:36.494 "assigned_rate_limits": { 00:15:36.494 "rw_ios_per_sec": 0, 00:15:36.494 "rw_mbytes_per_sec": 0, 00:15:36.494 "r_mbytes_per_sec": 0, 00:15:36.494 "w_mbytes_per_sec": 0 00:15:36.494 }, 00:15:36.494 "claimed": true, 00:15:36.494 "claim_type": "exclusive_write", 00:15:36.494 "zoned": false, 00:15:36.494 "supported_io_types": { 00:15:36.494 "read": true, 00:15:36.494 "write": true, 00:15:36.494 "unmap": true, 00:15:36.494 "flush": true, 00:15:36.494 "reset": true, 00:15:36.494 "nvme_admin": false, 00:15:36.494 "nvme_io": false, 00:15:36.494 "nvme_io_md": false, 00:15:36.494 "write_zeroes": true, 00:15:36.494 "zcopy": true, 00:15:36.494 "get_zone_info": false, 00:15:36.494 "zone_management": false, 00:15:36.494 "zone_append": false, 00:15:36.494 "compare": false, 00:15:36.494 "compare_and_write": false, 00:15:36.494 "abort": true, 00:15:36.494 "seek_hole": false, 00:15:36.494 "seek_data": false, 00:15:36.494 "copy": true, 00:15:36.494 "nvme_iov_md": false 00:15:36.494 }, 00:15:36.494 "memory_domains": [ 00:15:36.494 { 00:15:36.494 "dma_device_id": "system", 00:15:36.494 "dma_device_type": 1 00:15:36.494 }, 00:15:36.494 { 00:15:36.494 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:36.494 "dma_device_type": 2 00:15:36.494 } 00:15:36.494 ], 00:15:36.494 "driver_specific": {} 00:15:36.494 }' 00:15:36.494 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:36.494 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:36.494 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:36.494 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:15:36.753 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:36.753 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:36.753 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:36.753 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:36.753 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:36.753 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:36.753 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:36.753 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:36.753 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:36.753 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:36.753 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:37.012 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:37.012 "name": "BaseBdev3", 00:15:37.012 "aliases": [ 00:15:37.012 "5c30885b-f855-4d25-8a1f-0a7c11aecad3" 00:15:37.012 ], 00:15:37.012 "product_name": "Malloc disk", 00:15:37.012 "block_size": 512, 00:15:37.012 "num_blocks": 65536, 00:15:37.012 "uuid": "5c30885b-f855-4d25-8a1f-0a7c11aecad3", 00:15:37.012 "assigned_rate_limits": { 00:15:37.012 "rw_ios_per_sec": 0, 00:15:37.012 "rw_mbytes_per_sec": 0, 00:15:37.012 "r_mbytes_per_sec": 0, 00:15:37.012 "w_mbytes_per_sec": 0 00:15:37.012 }, 00:15:37.012 "claimed": true, 00:15:37.012 "claim_type": "exclusive_write", 00:15:37.012 "zoned": false, 00:15:37.012 "supported_io_types": { 
00:15:37.012 "read": true, 00:15:37.012 "write": true, 00:15:37.012 "unmap": true, 00:15:37.012 "flush": true, 00:15:37.012 "reset": true, 00:15:37.012 "nvme_admin": false, 00:15:37.012 "nvme_io": false, 00:15:37.012 "nvme_io_md": false, 00:15:37.012 "write_zeroes": true, 00:15:37.012 "zcopy": true, 00:15:37.012 "get_zone_info": false, 00:15:37.012 "zone_management": false, 00:15:37.012 "zone_append": false, 00:15:37.012 "compare": false, 00:15:37.012 "compare_and_write": false, 00:15:37.012 "abort": true, 00:15:37.012 "seek_hole": false, 00:15:37.012 "seek_data": false, 00:15:37.012 "copy": true, 00:15:37.012 "nvme_iov_md": false 00:15:37.012 }, 00:15:37.012 "memory_domains": [ 00:15:37.012 { 00:15:37.012 "dma_device_id": "system", 00:15:37.012 "dma_device_type": 1 00:15:37.012 }, 00:15:37.012 { 00:15:37.012 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:37.012 "dma_device_type": 2 00:15:37.012 } 00:15:37.012 ], 00:15:37.012 "driver_specific": {} 00:15:37.012 }' 00:15:37.012 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:37.270 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:37.270 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:37.270 22:44:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:37.270 22:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:37.270 22:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:37.270 22:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:37.270 22:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:37.270 22:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:37.270 22:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:15:37.529 22:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:37.529 22:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:37.529 22:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:37.789 [2024-07-15 22:44:22.465238] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:37.789 [2024-07-15 22:44:22.465263] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:37.789 [2024-07-15 22:44:22.465315] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:37.789 [2024-07-15 22:44:22.465367] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:37.789 [2024-07-15 22:44:22.465379] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1dcd450 name Existed_Raid, state offline 00:15:37.789 22:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2732540 00:15:37.789 22:44:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2732540 ']' 00:15:37.789 22:44:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2732540 00:15:37.789 22:44:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:15:37.789 22:44:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:37.789 22:44:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2732540 00:15:37.789 22:44:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:37.789 22:44:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:15:37.789 22:44:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2732540' 00:15:37.789 killing process with pid 2732540 00:15:37.789 22:44:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2732540 00:15:37.789 [2024-07-15 22:44:22.539451] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:37.789 22:44:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2732540 00:15:37.789 [2024-07-15 22:44:22.567195] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:15:38.049 00:15:38.049 real 0m28.507s 00:15:38.049 user 0m52.221s 00:15:38.049 sys 0m5.178s 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:38.049 ************************************ 00:15:38.049 END TEST raid_state_function_test 00:15:38.049 ************************************ 00:15:38.049 22:44:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:38.049 22:44:22 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:15:38.049 22:44:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:38.049 22:44:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:38.049 22:44:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:38.049 ************************************ 00:15:38.049 START TEST raid_state_function_test_sb 00:15:38.049 ************************************ 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 true 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2736830 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2736830' 00:15:38.049 Process raid pid: 2736830 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2736830 /var/tmp/spdk-raid.sock 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2736830 ']' 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:15:38.049 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:38.049 22:44:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:38.049 [2024-07-15 22:44:22.952631] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:15:38.049 [2024-07-15 22:44:22.952712] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:38.309 [2024-07-15 22:44:23.087566] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:38.309 [2024-07-15 22:44:23.193199] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:38.567 [2024-07-15 22:44:23.253807] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:38.567 [2024-07-15 22:44:23.253835] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:39.134 22:44:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:39.134 22:44:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:15:39.134 22:44:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:39.392 [2024-07-15 22:44:24.044150] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:39.392 [2024-07-15 22:44:24.044196] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:39.392 [2024-07-15 22:44:24.044207] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:39.392 [2024-07-15 22:44:24.044219] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:39.392 [2024-07-15 22:44:24.044228] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:39.392 [2024-07-15 22:44:24.044239] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:39.392 22:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:39.392 22:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:39.392 22:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:39.392 22:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:39.392 22:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:39.392 22:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:39.392 22:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:39.392 22:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:39.392 22:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:39.393 22:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:39.393 22:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:39.393 22:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:15:39.651 22:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:39.651 "name": "Existed_Raid", 00:15:39.651 "uuid": "b3cfcbcc-f5d7-42eb-b797-3639fc47095a", 00:15:39.651 "strip_size_kb": 64, 00:15:39.651 "state": "configuring", 00:15:39.651 "raid_level": "concat", 00:15:39.651 "superblock": true, 00:15:39.651 "num_base_bdevs": 3, 00:15:39.651 "num_base_bdevs_discovered": 0, 00:15:39.651 "num_base_bdevs_operational": 3, 00:15:39.651 "base_bdevs_list": [ 00:15:39.651 { 00:15:39.651 "name": "BaseBdev1", 00:15:39.651 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:39.651 "is_configured": false, 00:15:39.651 "data_offset": 0, 00:15:39.651 "data_size": 0 00:15:39.651 }, 00:15:39.651 { 00:15:39.651 "name": "BaseBdev2", 00:15:39.651 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:39.651 "is_configured": false, 00:15:39.651 "data_offset": 0, 00:15:39.651 "data_size": 0 00:15:39.651 }, 00:15:39.651 { 00:15:39.651 "name": "BaseBdev3", 00:15:39.651 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:39.651 "is_configured": false, 00:15:39.651 "data_offset": 0, 00:15:39.651 "data_size": 0 00:15:39.651 } 00:15:39.651 ] 00:15:39.651 }' 00:15:39.651 22:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:39.651 22:44:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:40.217 22:44:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:40.475 [2024-07-15 22:44:25.146895] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:40.475 [2024-07-15 22:44:25.146933] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcb6a80 name Existed_Raid, state configuring 00:15:40.475 22:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:40.475 [2024-07-15 22:44:25.375531] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:40.475 [2024-07-15 22:44:25.375560] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:40.475 [2024-07-15 22:44:25.375570] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:40.475 [2024-07-15 22:44:25.375581] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:40.475 [2024-07-15 22:44:25.375590] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:40.475 [2024-07-15 22:44:25.375601] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:40.747 22:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:40.747 [2024-07-15 22:44:25.626398] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:40.747 BaseBdev1 00:15:41.026 22:44:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:41.027 22:44:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:41.027 22:44:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:41.027 22:44:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:41.027 22:44:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:41.027 22:44:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:15:41.027 22:44:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:41.027 22:44:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:41.284 [ 00:15:41.284 { 00:15:41.284 "name": "BaseBdev1", 00:15:41.284 "aliases": [ 00:15:41.284 "f2bd8c86-28cb-47bc-9175-b997b0bcab05" 00:15:41.284 ], 00:15:41.284 "product_name": "Malloc disk", 00:15:41.284 "block_size": 512, 00:15:41.284 "num_blocks": 65536, 00:15:41.284 "uuid": "f2bd8c86-28cb-47bc-9175-b997b0bcab05", 00:15:41.284 "assigned_rate_limits": { 00:15:41.284 "rw_ios_per_sec": 0, 00:15:41.284 "rw_mbytes_per_sec": 0, 00:15:41.284 "r_mbytes_per_sec": 0, 00:15:41.284 "w_mbytes_per_sec": 0 00:15:41.284 }, 00:15:41.284 "claimed": true, 00:15:41.284 "claim_type": "exclusive_write", 00:15:41.284 "zoned": false, 00:15:41.284 "supported_io_types": { 00:15:41.284 "read": true, 00:15:41.284 "write": true, 00:15:41.284 "unmap": true, 00:15:41.284 "flush": true, 00:15:41.284 "reset": true, 00:15:41.284 "nvme_admin": false, 00:15:41.284 "nvme_io": false, 00:15:41.284 "nvme_io_md": false, 00:15:41.284 "write_zeroes": true, 00:15:41.284 "zcopy": true, 00:15:41.284 "get_zone_info": false, 00:15:41.284 "zone_management": false, 00:15:41.284 "zone_append": false, 00:15:41.284 "compare": false, 00:15:41.284 "compare_and_write": false, 00:15:41.284 "abort": true, 00:15:41.284 "seek_hole": false, 00:15:41.284 "seek_data": false, 00:15:41.284 "copy": true, 00:15:41.284 "nvme_iov_md": false 00:15:41.284 }, 00:15:41.284 "memory_domains": [ 00:15:41.284 { 00:15:41.284 "dma_device_id": "system", 00:15:41.284 "dma_device_type": 1 00:15:41.284 }, 00:15:41.284 { 00:15:41.284 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.284 
"dma_device_type": 2 00:15:41.284 } 00:15:41.284 ], 00:15:41.284 "driver_specific": {} 00:15:41.284 } 00:15:41.284 ] 00:15:41.284 22:44:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:41.284 22:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:41.284 22:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:41.284 22:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:41.284 22:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:41.284 22:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:41.284 22:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:41.284 22:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:41.284 22:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:41.284 22:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:41.284 22:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:41.284 22:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.284 22:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:41.583 22:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:41.583 "name": "Existed_Raid", 00:15:41.583 "uuid": "55216f4a-f06d-4bd8-bcab-b714f3d3382c", 00:15:41.583 "strip_size_kb": 64, 
00:15:41.583 "state": "configuring", 00:15:41.583 "raid_level": "concat", 00:15:41.583 "superblock": true, 00:15:41.583 "num_base_bdevs": 3, 00:15:41.583 "num_base_bdevs_discovered": 1, 00:15:41.583 "num_base_bdevs_operational": 3, 00:15:41.583 "base_bdevs_list": [ 00:15:41.583 { 00:15:41.583 "name": "BaseBdev1", 00:15:41.583 "uuid": "f2bd8c86-28cb-47bc-9175-b997b0bcab05", 00:15:41.583 "is_configured": true, 00:15:41.583 "data_offset": 2048, 00:15:41.583 "data_size": 63488 00:15:41.583 }, 00:15:41.583 { 00:15:41.583 "name": "BaseBdev2", 00:15:41.583 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:41.583 "is_configured": false, 00:15:41.583 "data_offset": 0, 00:15:41.583 "data_size": 0 00:15:41.583 }, 00:15:41.583 { 00:15:41.583 "name": "BaseBdev3", 00:15:41.583 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:41.583 "is_configured": false, 00:15:41.583 "data_offset": 0, 00:15:41.583 "data_size": 0 00:15:41.583 } 00:15:41.583 ] 00:15:41.583 }' 00:15:41.583 22:44:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:41.583 22:44:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:42.150 22:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:42.409 [2024-07-15 22:44:27.234628] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:42.409 [2024-07-15 22:44:27.234674] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcb6310 name Existed_Raid, state configuring 00:15:42.409 22:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:42.668 [2024-07-15 22:44:27.479335] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:42.668 [2024-07-15 22:44:27.480835] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:42.668 [2024-07-15 22:44:27.480870] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:42.668 [2024-07-15 22:44:27.480880] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:42.668 [2024-07-15 22:44:27.480892] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:42.668 22:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:42.668 22:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:42.668 22:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:42.668 22:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:42.668 22:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:42.668 22:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:42.668 22:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:42.668 22:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:42.668 22:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:42.668 22:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:42.668 22:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:42.668 22:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 
-- # local tmp 00:15:42.668 22:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:42.668 22:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:42.927 22:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:42.927 "name": "Existed_Raid", 00:15:42.927 "uuid": "040c1e71-e869-4e81-98ee-4432799e8f94", 00:15:42.927 "strip_size_kb": 64, 00:15:42.927 "state": "configuring", 00:15:42.927 "raid_level": "concat", 00:15:42.927 "superblock": true, 00:15:42.927 "num_base_bdevs": 3, 00:15:42.927 "num_base_bdevs_discovered": 1, 00:15:42.927 "num_base_bdevs_operational": 3, 00:15:42.927 "base_bdevs_list": [ 00:15:42.927 { 00:15:42.927 "name": "BaseBdev1", 00:15:42.927 "uuid": "f2bd8c86-28cb-47bc-9175-b997b0bcab05", 00:15:42.927 "is_configured": true, 00:15:42.927 "data_offset": 2048, 00:15:42.927 "data_size": 63488 00:15:42.927 }, 00:15:42.927 { 00:15:42.927 "name": "BaseBdev2", 00:15:42.927 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:42.927 "is_configured": false, 00:15:42.927 "data_offset": 0, 00:15:42.927 "data_size": 0 00:15:42.927 }, 00:15:42.927 { 00:15:42.927 "name": "BaseBdev3", 00:15:42.927 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:42.927 "is_configured": false, 00:15:42.927 "data_offset": 0, 00:15:42.927 "data_size": 0 00:15:42.927 } 00:15:42.927 ] 00:15:42.927 }' 00:15:42.927 22:44:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:42.927 22:44:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:43.495 22:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 
00:15:43.754 [2024-07-15 22:44:28.513542] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:43.754 BaseBdev2 00:15:43.754 22:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:43.755 22:44:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:43.755 22:44:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:43.755 22:44:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:43.755 22:44:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:43.755 22:44:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:43.755 22:44:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:44.014 22:44:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:44.014 [ 00:15:44.014 { 00:15:44.014 "name": "BaseBdev2", 00:15:44.014 "aliases": [ 00:15:44.014 "d2cd6c98-76c3-484f-a4b5-19d545175236" 00:15:44.014 ], 00:15:44.014 "product_name": "Malloc disk", 00:15:44.014 "block_size": 512, 00:15:44.014 "num_blocks": 65536, 00:15:44.014 "uuid": "d2cd6c98-76c3-484f-a4b5-19d545175236", 00:15:44.014 "assigned_rate_limits": { 00:15:44.014 "rw_ios_per_sec": 0, 00:15:44.014 "rw_mbytes_per_sec": 0, 00:15:44.014 "r_mbytes_per_sec": 0, 00:15:44.014 "w_mbytes_per_sec": 0 00:15:44.014 }, 00:15:44.014 "claimed": true, 00:15:44.014 "claim_type": "exclusive_write", 00:15:44.014 "zoned": false, 00:15:44.014 "supported_io_types": { 00:15:44.014 "read": true, 00:15:44.014 "write": true, 
00:15:44.014 "unmap": true, 00:15:44.014 "flush": true, 00:15:44.014 "reset": true, 00:15:44.014 "nvme_admin": false, 00:15:44.014 "nvme_io": false, 00:15:44.014 "nvme_io_md": false, 00:15:44.014 "write_zeroes": true, 00:15:44.014 "zcopy": true, 00:15:44.014 "get_zone_info": false, 00:15:44.014 "zone_management": false, 00:15:44.014 "zone_append": false, 00:15:44.014 "compare": false, 00:15:44.014 "compare_and_write": false, 00:15:44.014 "abort": true, 00:15:44.014 "seek_hole": false, 00:15:44.014 "seek_data": false, 00:15:44.014 "copy": true, 00:15:44.014 "nvme_iov_md": false 00:15:44.014 }, 00:15:44.014 "memory_domains": [ 00:15:44.014 { 00:15:44.014 "dma_device_id": "system", 00:15:44.014 "dma_device_type": 1 00:15:44.014 }, 00:15:44.014 { 00:15:44.014 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:44.014 "dma_device_type": 2 00:15:44.014 } 00:15:44.014 ], 00:15:44.014 "driver_specific": {} 00:15:44.014 } 00:15:44.014 ] 00:15:44.014 22:44:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:44.014 22:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:44.014 22:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:44.014 22:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:44.014 22:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:44.014 22:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:44.014 22:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:44.014 22:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:44.014 22:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:15:44.014 22:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:44.014 22:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:44.014 22:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:44.014 22:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:44.014 22:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:44.014 22:44:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:44.273 22:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:44.273 "name": "Existed_Raid", 00:15:44.273 "uuid": "040c1e71-e869-4e81-98ee-4432799e8f94", 00:15:44.273 "strip_size_kb": 64, 00:15:44.273 "state": "configuring", 00:15:44.273 "raid_level": "concat", 00:15:44.273 "superblock": true, 00:15:44.273 "num_base_bdevs": 3, 00:15:44.273 "num_base_bdevs_discovered": 2, 00:15:44.273 "num_base_bdevs_operational": 3, 00:15:44.273 "base_bdevs_list": [ 00:15:44.273 { 00:15:44.273 "name": "BaseBdev1", 00:15:44.273 "uuid": "f2bd8c86-28cb-47bc-9175-b997b0bcab05", 00:15:44.273 "is_configured": true, 00:15:44.273 "data_offset": 2048, 00:15:44.273 "data_size": 63488 00:15:44.273 }, 00:15:44.273 { 00:15:44.273 "name": "BaseBdev2", 00:15:44.273 "uuid": "d2cd6c98-76c3-484f-a4b5-19d545175236", 00:15:44.273 "is_configured": true, 00:15:44.273 "data_offset": 2048, 00:15:44.273 "data_size": 63488 00:15:44.273 }, 00:15:44.273 { 00:15:44.273 "name": "BaseBdev3", 00:15:44.273 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.273 "is_configured": false, 00:15:44.273 "data_offset": 0, 00:15:44.273 "data_size": 0 00:15:44.273 } 
00:15:44.273 ] 00:15:44.273 }' 00:15:44.273 22:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:44.273 22:44:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:44.841 22:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:45.100 [2024-07-15 22:44:29.752336] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:45.100 [2024-07-15 22:44:29.752498] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcb7400 00:15:45.100 [2024-07-15 22:44:29.752512] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:45.100 [2024-07-15 22:44:29.752686] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcb6ef0 00:15:45.100 [2024-07-15 22:44:29.752803] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcb7400 00:15:45.100 [2024-07-15 22:44:29.752813] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xcb7400 00:15:45.100 [2024-07-15 22:44:29.752904] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:45.100 BaseBdev3 00:15:45.100 22:44:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:45.100 22:44:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:45.100 22:44:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:45.100 22:44:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:45.100 22:44:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:45.100 22:44:29 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:45.100 22:44:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:45.359 22:44:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:45.359 [ 00:15:45.359 { 00:15:45.359 "name": "BaseBdev3", 00:15:45.359 "aliases": [ 00:15:45.359 "4767fdab-9c24-4dc7-8b30-39cba2d44342" 00:15:45.359 ], 00:15:45.359 "product_name": "Malloc disk", 00:15:45.359 "block_size": 512, 00:15:45.359 "num_blocks": 65536, 00:15:45.359 "uuid": "4767fdab-9c24-4dc7-8b30-39cba2d44342", 00:15:45.359 "assigned_rate_limits": { 00:15:45.359 "rw_ios_per_sec": 0, 00:15:45.359 "rw_mbytes_per_sec": 0, 00:15:45.359 "r_mbytes_per_sec": 0, 00:15:45.359 "w_mbytes_per_sec": 0 00:15:45.359 }, 00:15:45.359 "claimed": true, 00:15:45.359 "claim_type": "exclusive_write", 00:15:45.359 "zoned": false, 00:15:45.359 "supported_io_types": { 00:15:45.359 "read": true, 00:15:45.359 "write": true, 00:15:45.359 "unmap": true, 00:15:45.359 "flush": true, 00:15:45.359 "reset": true, 00:15:45.359 "nvme_admin": false, 00:15:45.359 "nvme_io": false, 00:15:45.359 "nvme_io_md": false, 00:15:45.359 "write_zeroes": true, 00:15:45.359 "zcopy": true, 00:15:45.359 "get_zone_info": false, 00:15:45.359 "zone_management": false, 00:15:45.359 "zone_append": false, 00:15:45.360 "compare": false, 00:15:45.360 "compare_and_write": false, 00:15:45.360 "abort": true, 00:15:45.360 "seek_hole": false, 00:15:45.360 "seek_data": false, 00:15:45.360 "copy": true, 00:15:45.360 "nvme_iov_md": false 00:15:45.360 }, 00:15:45.360 "memory_domains": [ 00:15:45.360 { 00:15:45.360 "dma_device_id": "system", 00:15:45.360 "dma_device_type": 1 00:15:45.360 }, 00:15:45.360 { 00:15:45.360 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:15:45.360 "dma_device_type": 2 00:15:45.360 } 00:15:45.360 ], 00:15:45.360 "driver_specific": {} 00:15:45.360 } 00:15:45.360 ] 00:15:45.619 22:44:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:45.619 22:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:45.619 22:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:45.619 22:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:45.619 22:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:45.619 22:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:45.619 22:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:45.619 22:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:45.619 22:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:45.619 22:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:45.619 22:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:45.619 22:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:45.619 22:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:45.619 22:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:45.619 22:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:15:45.619 22:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:45.619 "name": "Existed_Raid", 00:15:45.619 "uuid": "040c1e71-e869-4e81-98ee-4432799e8f94", 00:15:45.619 "strip_size_kb": 64, 00:15:45.619 "state": "online", 00:15:45.619 "raid_level": "concat", 00:15:45.619 "superblock": true, 00:15:45.619 "num_base_bdevs": 3, 00:15:45.619 "num_base_bdevs_discovered": 3, 00:15:45.619 "num_base_bdevs_operational": 3, 00:15:45.619 "base_bdevs_list": [ 00:15:45.619 { 00:15:45.619 "name": "BaseBdev1", 00:15:45.619 "uuid": "f2bd8c86-28cb-47bc-9175-b997b0bcab05", 00:15:45.619 "is_configured": true, 00:15:45.619 "data_offset": 2048, 00:15:45.619 "data_size": 63488 00:15:45.619 }, 00:15:45.619 { 00:15:45.619 "name": "BaseBdev2", 00:15:45.619 "uuid": "d2cd6c98-76c3-484f-a4b5-19d545175236", 00:15:45.619 "is_configured": true, 00:15:45.619 "data_offset": 2048, 00:15:45.619 "data_size": 63488 00:15:45.619 }, 00:15:45.619 { 00:15:45.619 "name": "BaseBdev3", 00:15:45.619 "uuid": "4767fdab-9c24-4dc7-8b30-39cba2d44342", 00:15:45.619 "is_configured": true, 00:15:45.619 "data_offset": 2048, 00:15:45.619 "data_size": 63488 00:15:45.619 } 00:15:45.619 ] 00:15:45.619 }' 00:15:45.619 22:44:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:45.619 22:44:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:46.186 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:46.187 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:46.187 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:46.187 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:46.187 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:15:46.187 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:46.187 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:46.187 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:46.445 [2024-07-15 22:44:31.300748] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:46.445 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:46.445 "name": "Existed_Raid", 00:15:46.445 "aliases": [ 00:15:46.446 "040c1e71-e869-4e81-98ee-4432799e8f94" 00:15:46.446 ], 00:15:46.446 "product_name": "Raid Volume", 00:15:46.446 "block_size": 512, 00:15:46.446 "num_blocks": 190464, 00:15:46.446 "uuid": "040c1e71-e869-4e81-98ee-4432799e8f94", 00:15:46.446 "assigned_rate_limits": { 00:15:46.446 "rw_ios_per_sec": 0, 00:15:46.446 "rw_mbytes_per_sec": 0, 00:15:46.446 "r_mbytes_per_sec": 0, 00:15:46.446 "w_mbytes_per_sec": 0 00:15:46.446 }, 00:15:46.446 "claimed": false, 00:15:46.446 "zoned": false, 00:15:46.446 "supported_io_types": { 00:15:46.446 "read": true, 00:15:46.446 "write": true, 00:15:46.446 "unmap": true, 00:15:46.446 "flush": true, 00:15:46.446 "reset": true, 00:15:46.446 "nvme_admin": false, 00:15:46.446 "nvme_io": false, 00:15:46.446 "nvme_io_md": false, 00:15:46.446 "write_zeroes": true, 00:15:46.446 "zcopy": false, 00:15:46.446 "get_zone_info": false, 00:15:46.446 "zone_management": false, 00:15:46.446 "zone_append": false, 00:15:46.446 "compare": false, 00:15:46.446 "compare_and_write": false, 00:15:46.446 "abort": false, 00:15:46.446 "seek_hole": false, 00:15:46.446 "seek_data": false, 00:15:46.446 "copy": false, 00:15:46.446 "nvme_iov_md": false 00:15:46.446 }, 00:15:46.446 "memory_domains": [ 00:15:46.446 { 00:15:46.446 "dma_device_id": "system", 
00:15:46.446 "dma_device_type": 1 00:15:46.446 }, 00:15:46.446 { 00:15:46.446 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:46.446 "dma_device_type": 2 00:15:46.446 }, 00:15:46.446 { 00:15:46.446 "dma_device_id": "system", 00:15:46.446 "dma_device_type": 1 00:15:46.446 }, 00:15:46.446 { 00:15:46.446 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:46.446 "dma_device_type": 2 00:15:46.446 }, 00:15:46.446 { 00:15:46.446 "dma_device_id": "system", 00:15:46.446 "dma_device_type": 1 00:15:46.446 }, 00:15:46.446 { 00:15:46.446 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:46.446 "dma_device_type": 2 00:15:46.446 } 00:15:46.446 ], 00:15:46.446 "driver_specific": { 00:15:46.446 "raid": { 00:15:46.446 "uuid": "040c1e71-e869-4e81-98ee-4432799e8f94", 00:15:46.446 "strip_size_kb": 64, 00:15:46.446 "state": "online", 00:15:46.446 "raid_level": "concat", 00:15:46.446 "superblock": true, 00:15:46.446 "num_base_bdevs": 3, 00:15:46.446 "num_base_bdevs_discovered": 3, 00:15:46.446 "num_base_bdevs_operational": 3, 00:15:46.446 "base_bdevs_list": [ 00:15:46.446 { 00:15:46.446 "name": "BaseBdev1", 00:15:46.446 "uuid": "f2bd8c86-28cb-47bc-9175-b997b0bcab05", 00:15:46.446 "is_configured": true, 00:15:46.446 "data_offset": 2048, 00:15:46.446 "data_size": 63488 00:15:46.446 }, 00:15:46.446 { 00:15:46.446 "name": "BaseBdev2", 00:15:46.446 "uuid": "d2cd6c98-76c3-484f-a4b5-19d545175236", 00:15:46.446 "is_configured": true, 00:15:46.446 "data_offset": 2048, 00:15:46.446 "data_size": 63488 00:15:46.446 }, 00:15:46.446 { 00:15:46.446 "name": "BaseBdev3", 00:15:46.446 "uuid": "4767fdab-9c24-4dc7-8b30-39cba2d44342", 00:15:46.446 "is_configured": true, 00:15:46.446 "data_offset": 2048, 00:15:46.446 "data_size": 63488 00:15:46.446 } 00:15:46.446 ] 00:15:46.446 } 00:15:46.446 } 00:15:46.446 }' 00:15:46.446 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:46.704 22:44:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:46.704 BaseBdev2 00:15:46.704 BaseBdev3' 00:15:46.704 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:46.704 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:46.704 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:46.963 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:46.963 "name": "BaseBdev1", 00:15:46.963 "aliases": [ 00:15:46.963 "f2bd8c86-28cb-47bc-9175-b997b0bcab05" 00:15:46.963 ], 00:15:46.963 "product_name": "Malloc disk", 00:15:46.963 "block_size": 512, 00:15:46.963 "num_blocks": 65536, 00:15:46.963 "uuid": "f2bd8c86-28cb-47bc-9175-b997b0bcab05", 00:15:46.963 "assigned_rate_limits": { 00:15:46.963 "rw_ios_per_sec": 0, 00:15:46.963 "rw_mbytes_per_sec": 0, 00:15:46.963 "r_mbytes_per_sec": 0, 00:15:46.963 "w_mbytes_per_sec": 0 00:15:46.963 }, 00:15:46.963 "claimed": true, 00:15:46.963 "claim_type": "exclusive_write", 00:15:46.963 "zoned": false, 00:15:46.963 "supported_io_types": { 00:15:46.963 "read": true, 00:15:46.963 "write": true, 00:15:46.963 "unmap": true, 00:15:46.963 "flush": true, 00:15:46.963 "reset": true, 00:15:46.963 "nvme_admin": false, 00:15:46.963 "nvme_io": false, 00:15:46.963 "nvme_io_md": false, 00:15:46.963 "write_zeroes": true, 00:15:46.963 "zcopy": true, 00:15:46.963 "get_zone_info": false, 00:15:46.963 "zone_management": false, 00:15:46.963 "zone_append": false, 00:15:46.963 "compare": false, 00:15:46.963 "compare_and_write": false, 00:15:46.963 "abort": true, 00:15:46.963 "seek_hole": false, 00:15:46.963 "seek_data": false, 00:15:46.963 "copy": true, 00:15:46.963 "nvme_iov_md": false 00:15:46.963 }, 00:15:46.963 "memory_domains": 
[ 00:15:46.963 { 00:15:46.963 "dma_device_id": "system", 00:15:46.963 "dma_device_type": 1 00:15:46.963 }, 00:15:46.963 { 00:15:46.963 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:46.963 "dma_device_type": 2 00:15:46.963 } 00:15:46.963 ], 00:15:46.963 "driver_specific": {} 00:15:46.963 }' 00:15:46.963 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:46.963 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:46.963 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:46.963 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:46.963 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:46.963 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:46.963 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:46.963 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:47.221 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:47.221 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:47.221 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:47.221 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:47.221 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:47.221 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:47.221 22:44:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
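The trace above fetches each base bdev with `bdev_get_bdevs -b <name>` and then checks, via jq at bdev_raid.sh@205-208, that `block_size` is 512 and that `md_size`, `md_interleave`, and `dif_type` are all absent (jq prints `null` for a missing key). A minimal Python sketch of those same checks, run against a fragment trimmed from the BaseBdev1 dump in the log (not part of the test suite itself):

```python
import json

# JSON fragment trimmed from the BaseBdev1 dump in the trace above.
base_bdev_info = json.loads("""{
  "name": "BaseBdev1",
  "product_name": "Malloc disk",
  "block_size": 512,
  "num_blocks": 65536,
  "uuid": "f2bd8c86-28cb-47bc-9175-b997b0bcab05"
}""")

def check_base_bdev(info, expected_block_size=512):
    # jq '.md_size' on a missing key yields null; dict.get mirrors that here.
    assert info["block_size"] == expected_block_size
    assert info.get("md_size") is None
    assert info.get("md_interleave") is None
    assert info.get("dif_type") is None
    return True

print(check_base_bdev(base_bdev_info))  # True when every property check passes
```

The script repeats the same four checks for BaseBdev2 and BaseBdev3 later in the trace; only the `-b <name>` argument changes.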
00:15:47.480 22:44:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:47.480 "name": "BaseBdev2", 00:15:47.480 "aliases": [ 00:15:47.480 "d2cd6c98-76c3-484f-a4b5-19d545175236" 00:15:47.480 ], 00:15:47.480 "product_name": "Malloc disk", 00:15:47.480 "block_size": 512, 00:15:47.480 "num_blocks": 65536, 00:15:47.480 "uuid": "d2cd6c98-76c3-484f-a4b5-19d545175236", 00:15:47.480 "assigned_rate_limits": { 00:15:47.480 "rw_ios_per_sec": 0, 00:15:47.480 "rw_mbytes_per_sec": 0, 00:15:47.480 "r_mbytes_per_sec": 0, 00:15:47.480 "w_mbytes_per_sec": 0 00:15:47.480 }, 00:15:47.480 "claimed": true, 00:15:47.480 "claim_type": "exclusive_write", 00:15:47.480 "zoned": false, 00:15:47.480 "supported_io_types": { 00:15:47.480 "read": true, 00:15:47.480 "write": true, 00:15:47.480 "unmap": true, 00:15:47.480 "flush": true, 00:15:47.480 "reset": true, 00:15:47.480 "nvme_admin": false, 00:15:47.480 "nvme_io": false, 00:15:47.480 "nvme_io_md": false, 00:15:47.480 "write_zeroes": true, 00:15:47.480 "zcopy": true, 00:15:47.480 "get_zone_info": false, 00:15:47.480 "zone_management": false, 00:15:47.480 "zone_append": false, 00:15:47.480 "compare": false, 00:15:47.480 "compare_and_write": false, 00:15:47.480 "abort": true, 00:15:47.480 "seek_hole": false, 00:15:47.480 "seek_data": false, 00:15:47.480 "copy": true, 00:15:47.480 "nvme_iov_md": false 00:15:47.480 }, 00:15:47.480 "memory_domains": [ 00:15:47.480 { 00:15:47.480 "dma_device_id": "system", 00:15:47.480 "dma_device_type": 1 00:15:47.480 }, 00:15:47.480 { 00:15:47.480 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:47.480 "dma_device_type": 2 00:15:47.480 } 00:15:47.480 ], 00:15:47.480 "driver_specific": {} 00:15:47.480 }' 00:15:47.480 22:44:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:47.480 22:44:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:47.480 22:44:32 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:47.480 22:44:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:47.481 22:44:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:47.739 22:44:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:47.739 22:44:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:47.740 22:44:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:47.740 22:44:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:47.740 22:44:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:47.740 22:44:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:47.740 22:44:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:47.740 22:44:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:47.740 22:44:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:47.740 22:44:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:47.999 22:44:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:47.999 "name": "BaseBdev3", 00:15:47.999 "aliases": [ 00:15:47.999 "4767fdab-9c24-4dc7-8b30-39cba2d44342" 00:15:47.999 ], 00:15:47.999 "product_name": "Malloc disk", 00:15:47.999 "block_size": 512, 00:15:47.999 "num_blocks": 65536, 00:15:47.999 "uuid": "4767fdab-9c24-4dc7-8b30-39cba2d44342", 00:15:47.999 "assigned_rate_limits": { 00:15:47.999 "rw_ios_per_sec": 0, 00:15:47.999 "rw_mbytes_per_sec": 0, 00:15:47.999 "r_mbytes_per_sec": 0, 00:15:47.999 
"w_mbytes_per_sec": 0 00:15:47.999 }, 00:15:47.999 "claimed": true, 00:15:47.999 "claim_type": "exclusive_write", 00:15:47.999 "zoned": false, 00:15:47.999 "supported_io_types": { 00:15:47.999 "read": true, 00:15:47.999 "write": true, 00:15:47.999 "unmap": true, 00:15:47.999 "flush": true, 00:15:47.999 "reset": true, 00:15:47.999 "nvme_admin": false, 00:15:47.999 "nvme_io": false, 00:15:47.999 "nvme_io_md": false, 00:15:47.999 "write_zeroes": true, 00:15:47.999 "zcopy": true, 00:15:47.999 "get_zone_info": false, 00:15:47.999 "zone_management": false, 00:15:47.999 "zone_append": false, 00:15:47.999 "compare": false, 00:15:47.999 "compare_and_write": false, 00:15:47.999 "abort": true, 00:15:47.999 "seek_hole": false, 00:15:47.999 "seek_data": false, 00:15:47.999 "copy": true, 00:15:47.999 "nvme_iov_md": false 00:15:47.999 }, 00:15:47.999 "memory_domains": [ 00:15:47.999 { 00:15:47.999 "dma_device_id": "system", 00:15:47.999 "dma_device_type": 1 00:15:47.999 }, 00:15:47.999 { 00:15:47.999 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:47.999 "dma_device_type": 2 00:15:47.999 } 00:15:47.999 ], 00:15:47.999 "driver_specific": {} 00:15:47.999 }' 00:15:47.999 22:44:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:47.999 22:44:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:48.258 22:44:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:48.258 22:44:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:48.258 22:44:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:48.258 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:48.258 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:48.258 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:15:48.258 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:48.258 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:48.258 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:48.516 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:48.516 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:48.516 [2024-07-15 22:44:33.414099] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:48.516 [2024-07-15 22:44:33.414130] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:48.516 [2024-07-15 22:44:33.414172] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:48.775 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:48.775 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:15:48.775 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:48.775 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:15:48.775 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:48.775 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:15:48.775 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:48.775 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:48.775 22:44:33 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:48.775 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:48.775 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:48.775 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:48.775 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:48.775 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:48.775 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:48.775 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.775 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:48.775 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:48.775 "name": "Existed_Raid", 00:15:48.775 "uuid": "040c1e71-e869-4e81-98ee-4432799e8f94", 00:15:48.775 "strip_size_kb": 64, 00:15:48.775 "state": "offline", 00:15:48.775 "raid_level": "concat", 00:15:48.775 "superblock": true, 00:15:48.775 "num_base_bdevs": 3, 00:15:48.775 "num_base_bdevs_discovered": 2, 00:15:48.775 "num_base_bdevs_operational": 2, 00:15:48.775 "base_bdevs_list": [ 00:15:48.775 { 00:15:48.775 "name": null, 00:15:48.775 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:48.775 "is_configured": false, 00:15:48.775 "data_offset": 2048, 00:15:48.775 "data_size": 63488 00:15:48.775 }, 00:15:48.775 { 00:15:48.775 "name": "BaseBdev2", 00:15:48.775 "uuid": "d2cd6c98-76c3-484f-a4b5-19d545175236", 00:15:48.775 "is_configured": true, 00:15:48.775 "data_offset": 2048, 00:15:48.775 "data_size": 
63488 00:15:48.775 }, 00:15:48.775 { 00:15:48.775 "name": "BaseBdev3", 00:15:48.775 "uuid": "4767fdab-9c24-4dc7-8b30-39cba2d44342", 00:15:48.775 "is_configured": true, 00:15:48.775 "data_offset": 2048, 00:15:48.775 "data_size": 63488 00:15:48.775 } 00:15:48.775 ] 00:15:48.775 }' 00:15:48.775 22:44:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:48.775 22:44:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:49.710 22:44:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:49.710 22:44:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:49.710 22:44:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:49.710 22:44:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:49.711 22:44:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:49.711 22:44:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:49.711 22:44:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:49.969 [2024-07-15 22:44:34.759341] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:49.969 22:44:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:49.969 22:44:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:49.969 22:44:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
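After deleting BaseBdev1, the trace calls `has_redundancy concat` (bdev_raid.sh@276), which returns 1, so the test sets `expected_state=offline` before verifying the raid bdev. A small sketch of that decision; the exact set of levels the script treats as redundant is an assumption here (the log only demonstrates that `concat` is non-redundant):

```python
# Assumed set of fault-tolerant levels; the trace only proves that "concat"
# is NOT in it (has_redundancy returned 1 and expected_state became offline).
REDUNDANT_LEVELS = {"raid1", "raid5f"}

def expected_state_after_removal(raid_level: str) -> str:
    # Mirrors bdev_raid.sh@276-277: a redundant array survives losing one
    # base bdev and stays online; anything else is expected to go offline.
    return "online" if raid_level in REDUNDANT_LEVELS else "offline"

print(expected_state_after_removal("concat"))  # offline, matching the trace
```

This is why the subsequent `verify_raid_bdev_state Existed_Raid offline concat 64 2` call in the log passes once the raid info reports `"state": "offline"`.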
00:15:49.969 22:44:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:50.227 22:44:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:50.227 22:44:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:50.227 22:44:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:50.486 [2024-07-15 22:44:35.283318] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:50.486 [2024-07-15 22:44:35.283366] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcb7400 name Existed_Raid, state offline 00:15:50.486 22:44:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:50.486 22:44:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:50.486 22:44:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:50.486 22:44:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:50.761 22:44:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:50.761 22:44:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:50.761 22:44:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:50.761 22:44:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:50.761 22:44:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:50.761 22:44:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:51.020 BaseBdev2 00:15:51.020 22:44:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:51.020 22:44:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:51.020 22:44:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:51.020 22:44:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:51.020 22:44:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:51.020 22:44:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:51.020 22:44:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:51.277 22:44:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:51.535 [ 00:15:51.535 { 00:15:51.535 "name": "BaseBdev2", 00:15:51.535 "aliases": [ 00:15:51.535 "ddb8fdc8-e559-49d6-b386-dda0eced8080" 00:15:51.535 ], 00:15:51.535 "product_name": "Malloc disk", 00:15:51.535 "block_size": 512, 00:15:51.535 "num_blocks": 65536, 00:15:51.535 "uuid": "ddb8fdc8-e559-49d6-b386-dda0eced8080", 00:15:51.535 "assigned_rate_limits": { 00:15:51.535 "rw_ios_per_sec": 0, 00:15:51.535 "rw_mbytes_per_sec": 0, 00:15:51.535 "r_mbytes_per_sec": 0, 00:15:51.535 "w_mbytes_per_sec": 0 00:15:51.535 }, 00:15:51.535 "claimed": false, 00:15:51.535 "zoned": false, 00:15:51.535 "supported_io_types": { 00:15:51.535 "read": true, 00:15:51.535 "write": true, 00:15:51.535 "unmap": true, 00:15:51.535 "flush": 
true, 00:15:51.535 "reset": true, 00:15:51.535 "nvme_admin": false, 00:15:51.535 "nvme_io": false, 00:15:51.535 "nvme_io_md": false, 00:15:51.535 "write_zeroes": true, 00:15:51.535 "zcopy": true, 00:15:51.535 "get_zone_info": false, 00:15:51.535 "zone_management": false, 00:15:51.535 "zone_append": false, 00:15:51.535 "compare": false, 00:15:51.535 "compare_and_write": false, 00:15:51.535 "abort": true, 00:15:51.535 "seek_hole": false, 00:15:51.535 "seek_data": false, 00:15:51.535 "copy": true, 00:15:51.535 "nvme_iov_md": false 00:15:51.535 }, 00:15:51.535 "memory_domains": [ 00:15:51.535 { 00:15:51.535 "dma_device_id": "system", 00:15:51.535 "dma_device_type": 1 00:15:51.535 }, 00:15:51.535 { 00:15:51.535 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.535 "dma_device_type": 2 00:15:51.535 } 00:15:51.535 ], 00:15:51.535 "driver_specific": {} 00:15:51.535 } 00:15:51.535 ] 00:15:51.535 22:44:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:51.535 22:44:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:51.535 22:44:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:51.535 22:44:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:51.793 BaseBdev3 00:15:51.793 22:44:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:51.793 22:44:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:51.793 22:44:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:51.793 22:44:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:51.793 22:44:36 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:51.793 22:44:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:51.793 22:44:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:52.051 22:44:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:52.310 [ 00:15:52.310 { 00:15:52.310 "name": "BaseBdev3", 00:15:52.310 "aliases": [ 00:15:52.310 "d8e262d2-e222-465b-a232-f3e206623f79" 00:15:52.310 ], 00:15:52.310 "product_name": "Malloc disk", 00:15:52.310 "block_size": 512, 00:15:52.310 "num_blocks": 65536, 00:15:52.310 "uuid": "d8e262d2-e222-465b-a232-f3e206623f79", 00:15:52.310 "assigned_rate_limits": { 00:15:52.310 "rw_ios_per_sec": 0, 00:15:52.310 "rw_mbytes_per_sec": 0, 00:15:52.310 "r_mbytes_per_sec": 0, 00:15:52.310 "w_mbytes_per_sec": 0 00:15:52.310 }, 00:15:52.310 "claimed": false, 00:15:52.310 "zoned": false, 00:15:52.310 "supported_io_types": { 00:15:52.310 "read": true, 00:15:52.310 "write": true, 00:15:52.310 "unmap": true, 00:15:52.310 "flush": true, 00:15:52.310 "reset": true, 00:15:52.310 "nvme_admin": false, 00:15:52.310 "nvme_io": false, 00:15:52.310 "nvme_io_md": false, 00:15:52.310 "write_zeroes": true, 00:15:52.310 "zcopy": true, 00:15:52.310 "get_zone_info": false, 00:15:52.310 "zone_management": false, 00:15:52.310 "zone_append": false, 00:15:52.310 "compare": false, 00:15:52.310 "compare_and_write": false, 00:15:52.310 "abort": true, 00:15:52.310 "seek_hole": false, 00:15:52.310 "seek_data": false, 00:15:52.310 "copy": true, 00:15:52.310 "nvme_iov_md": false 00:15:52.310 }, 00:15:52.310 "memory_domains": [ 00:15:52.310 { 00:15:52.310 "dma_device_id": "system", 00:15:52.310 "dma_device_type": 1 
00:15:52.310 }, 00:15:52.310 { 00:15:52.310 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:52.310 "dma_device_type": 2 00:15:52.310 } 00:15:52.310 ], 00:15:52.310 "driver_specific": {} 00:15:52.310 } 00:15:52.310 ] 00:15:52.310 22:44:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:52.310 22:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:52.310 22:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:52.310 22:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:52.568 [2024-07-15 22:44:37.257960] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:52.568 [2024-07-15 22:44:37.258006] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:52.568 [2024-07-15 22:44:37.258025] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:52.568 [2024-07-15 22:44:37.259408] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:52.568 22:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:52.568 22:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:52.569 22:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:52.569 22:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:52.569 22:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:52.569 22:44:37 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:52.569 22:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:52.569 22:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:52.569 22:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:52.569 22:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:52.569 22:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:52.569 22:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:52.827 22:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:52.827 "name": "Existed_Raid", 00:15:52.827 "uuid": "01ba8c37-99f4-41a0-bde4-0fd7fd2d0d8d", 00:15:52.827 "strip_size_kb": 64, 00:15:52.827 "state": "configuring", 00:15:52.827 "raid_level": "concat", 00:15:52.827 "superblock": true, 00:15:52.827 "num_base_bdevs": 3, 00:15:52.827 "num_base_bdevs_discovered": 2, 00:15:52.827 "num_base_bdevs_operational": 3, 00:15:52.827 "base_bdevs_list": [ 00:15:52.827 { 00:15:52.827 "name": "BaseBdev1", 00:15:52.827 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:52.827 "is_configured": false, 00:15:52.827 "data_offset": 0, 00:15:52.827 "data_size": 0 00:15:52.827 }, 00:15:52.827 { 00:15:52.827 "name": "BaseBdev2", 00:15:52.827 "uuid": "ddb8fdc8-e559-49d6-b386-dda0eced8080", 00:15:52.827 "is_configured": true, 00:15:52.827 "data_offset": 2048, 00:15:52.827 "data_size": 63488 00:15:52.827 }, 00:15:52.827 { 00:15:52.827 "name": "BaseBdev3", 00:15:52.827 "uuid": "d8e262d2-e222-465b-a232-f3e206623f79", 00:15:52.827 "is_configured": true, 00:15:52.827 "data_offset": 2048, 00:15:52.827 
"data_size": 63488 00:15:52.827 } 00:15:52.827 ] 00:15:52.827 }' 00:15:52.827 22:44:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:52.827 22:44:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:53.394 22:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:53.652 [2024-07-15 22:44:38.340796] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:53.652 22:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:53.652 22:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:53.652 22:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:53.652 22:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:53.652 22:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:53.652 22:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:53.652 22:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:53.652 22:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:53.652 22:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:53.652 22:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:53.652 22:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
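After `bdev_raid_remove_base_bdev BaseBdev2`, the raid info in the trace reports `"num_base_bdevs_discovered": 1` while `num_base_bdevs_operational` stays 3. A sketch (not from the test scripts) of the relationship the verification relies on: each `base_bdevs_list` entry with `"is_configured": true` counts as discovered. The list mirrors the post-removal state in the log, where only BaseBdev3 remains configured:

```python
import json

# base_bdevs_list as reported after removing BaseBdev2 in the trace:
# BaseBdev1 was never re-added, BaseBdev2's slot is now a null placeholder,
# and only BaseBdev3 is still configured.
base_bdevs_list = json.loads("""[
  {"name": "BaseBdev1", "is_configured": false},
  {"name": null, "is_configured": false},
  {"name": "BaseBdev3", "is_configured": true}
]""")

discovered = sum(1 for b in base_bdevs_list if b["is_configured"])
print(discovered)  # 1, matching "num_base_bdevs_discovered": 1 in the trace
```

The jq check at bdev_raid.sh@310 (`.[0].base_bdevs_list[1].is_configured`) then confirms the removed slot reads `false`, which is exactly the `[[ false == \f\a\l\s\e ]]` comparison in the log.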
00:15:53.652 22:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:53.911 22:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:53.911 "name": "Existed_Raid", 00:15:53.911 "uuid": "01ba8c37-99f4-41a0-bde4-0fd7fd2d0d8d", 00:15:53.911 "strip_size_kb": 64, 00:15:53.911 "state": "configuring", 00:15:53.911 "raid_level": "concat", 00:15:53.911 "superblock": true, 00:15:53.911 "num_base_bdevs": 3, 00:15:53.911 "num_base_bdevs_discovered": 1, 00:15:53.911 "num_base_bdevs_operational": 3, 00:15:53.911 "base_bdevs_list": [ 00:15:53.911 { 00:15:53.911 "name": "BaseBdev1", 00:15:53.911 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:53.911 "is_configured": false, 00:15:53.911 "data_offset": 0, 00:15:53.911 "data_size": 0 00:15:53.911 }, 00:15:53.911 { 00:15:53.911 "name": null, 00:15:53.911 "uuid": "ddb8fdc8-e559-49d6-b386-dda0eced8080", 00:15:53.911 "is_configured": false, 00:15:53.911 "data_offset": 2048, 00:15:53.911 "data_size": 63488 00:15:53.911 }, 00:15:53.911 { 00:15:53.911 "name": "BaseBdev3", 00:15:53.911 "uuid": "d8e262d2-e222-465b-a232-f3e206623f79", 00:15:53.911 "is_configured": true, 00:15:53.911 "data_offset": 2048, 00:15:53.911 "data_size": 63488 00:15:53.911 } 00:15:53.911 ] 00:15:53.911 }' 00:15:53.911 22:44:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:53.911 22:44:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:54.478 22:44:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:54.478 22:44:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:54.737 22:44:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == 
\f\a\l\s\e ]] 00:15:54.737 22:44:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:54.996 [2024-07-15 22:44:39.688956] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:54.996 BaseBdev1 00:15:54.996 22:44:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:54.996 22:44:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:54.996 22:44:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:54.996 22:44:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:54.996 22:44:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:54.996 22:44:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:54.996 22:44:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:55.297 22:44:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:55.297 [ 00:15:55.297 { 00:15:55.297 "name": "BaseBdev1", 00:15:55.297 "aliases": [ 00:15:55.297 "bd069afb-1928-4b77-9bf1-a50763428d99" 00:15:55.297 ], 00:15:55.297 "product_name": "Malloc disk", 00:15:55.297 "block_size": 512, 00:15:55.297 "num_blocks": 65536, 00:15:55.297 "uuid": "bd069afb-1928-4b77-9bf1-a50763428d99", 00:15:55.297 "assigned_rate_limits": { 00:15:55.297 "rw_ios_per_sec": 0, 00:15:55.297 "rw_mbytes_per_sec": 0, 00:15:55.297 "r_mbytes_per_sec": 0, 00:15:55.297 
"w_mbytes_per_sec": 0 00:15:55.297 }, 00:15:55.297 "claimed": true, 00:15:55.297 "claim_type": "exclusive_write", 00:15:55.297 "zoned": false, 00:15:55.297 "supported_io_types": { 00:15:55.297 "read": true, 00:15:55.297 "write": true, 00:15:55.297 "unmap": true, 00:15:55.297 "flush": true, 00:15:55.297 "reset": true, 00:15:55.297 "nvme_admin": false, 00:15:55.297 "nvme_io": false, 00:15:55.297 "nvme_io_md": false, 00:15:55.297 "write_zeroes": true, 00:15:55.297 "zcopy": true, 00:15:55.297 "get_zone_info": false, 00:15:55.297 "zone_management": false, 00:15:55.297 "zone_append": false, 00:15:55.297 "compare": false, 00:15:55.297 "compare_and_write": false, 00:15:55.297 "abort": true, 00:15:55.297 "seek_hole": false, 00:15:55.297 "seek_data": false, 00:15:55.297 "copy": true, 00:15:55.297 "nvme_iov_md": false 00:15:55.297 }, 00:15:55.297 "memory_domains": [ 00:15:55.297 { 00:15:55.297 "dma_device_id": "system", 00:15:55.297 "dma_device_type": 1 00:15:55.297 }, 00:15:55.297 { 00:15:55.297 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:55.297 "dma_device_type": 2 00:15:55.297 } 00:15:55.297 ], 00:15:55.297 "driver_specific": {} 00:15:55.297 } 00:15:55.297 ] 00:15:55.297 22:44:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:55.297 22:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:55.297 22:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:55.297 22:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:55.297 22:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:55.297 22:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:55.297 22:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:15:55.297 22:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:55.297 22:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:55.297 22:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:55.297 22:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:55.297 22:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:55.297 22:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:55.556 22:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:55.556 "name": "Existed_Raid", 00:15:55.556 "uuid": "01ba8c37-99f4-41a0-bde4-0fd7fd2d0d8d", 00:15:55.556 "strip_size_kb": 64, 00:15:55.556 "state": "configuring", 00:15:55.556 "raid_level": "concat", 00:15:55.556 "superblock": true, 00:15:55.556 "num_base_bdevs": 3, 00:15:55.556 "num_base_bdevs_discovered": 2, 00:15:55.556 "num_base_bdevs_operational": 3, 00:15:55.556 "base_bdevs_list": [ 00:15:55.556 { 00:15:55.556 "name": "BaseBdev1", 00:15:55.556 "uuid": "bd069afb-1928-4b77-9bf1-a50763428d99", 00:15:55.556 "is_configured": true, 00:15:55.556 "data_offset": 2048, 00:15:55.556 "data_size": 63488 00:15:55.556 }, 00:15:55.556 { 00:15:55.556 "name": null, 00:15:55.556 "uuid": "ddb8fdc8-e559-49d6-b386-dda0eced8080", 00:15:55.556 "is_configured": false, 00:15:55.556 "data_offset": 2048, 00:15:55.556 "data_size": 63488 00:15:55.556 }, 00:15:55.556 { 00:15:55.556 "name": "BaseBdev3", 00:15:55.556 "uuid": "d8e262d2-e222-465b-a232-f3e206623f79", 00:15:55.556 "is_configured": true, 00:15:55.556 "data_offset": 2048, 00:15:55.556 "data_size": 63488 00:15:55.556 } 
00:15:55.556 ] 00:15:55.556 }' 00:15:55.556 22:44:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:55.556 22:44:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:56.151 22:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:56.151 22:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:56.409 22:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:56.409 22:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:56.978 [2024-07-15 22:44:41.766485] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:56.978 22:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:56.978 22:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:56.978 22:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:56.978 22:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:56.978 22:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:56.978 22:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:56.978 22:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:56.978 22:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:56.978 
22:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:56.978 22:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:56.978 22:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:56.978 22:44:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:57.245 22:44:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:57.245 "name": "Existed_Raid", 00:15:57.245 "uuid": "01ba8c37-99f4-41a0-bde4-0fd7fd2d0d8d", 00:15:57.245 "strip_size_kb": 64, 00:15:57.245 "state": "configuring", 00:15:57.245 "raid_level": "concat", 00:15:57.245 "superblock": true, 00:15:57.245 "num_base_bdevs": 3, 00:15:57.245 "num_base_bdevs_discovered": 1, 00:15:57.245 "num_base_bdevs_operational": 3, 00:15:57.245 "base_bdevs_list": [ 00:15:57.245 { 00:15:57.245 "name": "BaseBdev1", 00:15:57.245 "uuid": "bd069afb-1928-4b77-9bf1-a50763428d99", 00:15:57.245 "is_configured": true, 00:15:57.245 "data_offset": 2048, 00:15:57.245 "data_size": 63488 00:15:57.245 }, 00:15:57.245 { 00:15:57.245 "name": null, 00:15:57.245 "uuid": "ddb8fdc8-e559-49d6-b386-dda0eced8080", 00:15:57.245 "is_configured": false, 00:15:57.245 "data_offset": 2048, 00:15:57.245 "data_size": 63488 00:15:57.245 }, 00:15:57.245 { 00:15:57.245 "name": null, 00:15:57.245 "uuid": "d8e262d2-e222-465b-a232-f3e206623f79", 00:15:57.245 "is_configured": false, 00:15:57.245 "data_offset": 2048, 00:15:57.245 "data_size": 63488 00:15:57.245 } 00:15:57.245 ] 00:15:57.245 }' 00:15:57.245 22:44:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:57.245 22:44:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:57.811 22:44:42 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:57.811 22:44:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:58.070 22:44:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:58.070 22:44:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:58.636 [2024-07-15 22:44:43.386804] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:58.636 22:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:58.636 22:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:58.636 22:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:58.636 22:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:58.636 22:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:58.636 22:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:58.636 22:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:58.636 22:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:58.636 22:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:58.636 22:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:58.636 22:44:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:58.636 22:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:58.894 22:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:58.894 "name": "Existed_Raid", 00:15:58.894 "uuid": "01ba8c37-99f4-41a0-bde4-0fd7fd2d0d8d", 00:15:58.894 "strip_size_kb": 64, 00:15:58.894 "state": "configuring", 00:15:58.894 "raid_level": "concat", 00:15:58.894 "superblock": true, 00:15:58.894 "num_base_bdevs": 3, 00:15:58.894 "num_base_bdevs_discovered": 2, 00:15:58.894 "num_base_bdevs_operational": 3, 00:15:58.894 "base_bdevs_list": [ 00:15:58.894 { 00:15:58.894 "name": "BaseBdev1", 00:15:58.894 "uuid": "bd069afb-1928-4b77-9bf1-a50763428d99", 00:15:58.894 "is_configured": true, 00:15:58.894 "data_offset": 2048, 00:15:58.894 "data_size": 63488 00:15:58.894 }, 00:15:58.894 { 00:15:58.894 "name": null, 00:15:58.894 "uuid": "ddb8fdc8-e559-49d6-b386-dda0eced8080", 00:15:58.894 "is_configured": false, 00:15:58.894 "data_offset": 2048, 00:15:58.894 "data_size": 63488 00:15:58.894 }, 00:15:58.894 { 00:15:58.894 "name": "BaseBdev3", 00:15:58.894 "uuid": "d8e262d2-e222-465b-a232-f3e206623f79", 00:15:58.894 "is_configured": true, 00:15:58.894 "data_offset": 2048, 00:15:58.894 "data_size": 63488 00:15:58.894 } 00:15:58.894 ] 00:15:58.894 }' 00:15:58.894 22:44:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:58.894 22:44:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:59.461 22:44:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:59.461 22:44:44 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:00.027 22:44:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:00.027 22:44:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:00.594 [2024-07-15 22:44:45.251781] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:00.594 22:44:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:00.594 22:44:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:00.594 22:44:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:00.594 22:44:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:00.594 22:44:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:00.594 22:44:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:00.594 22:44:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:00.594 22:44:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:00.594 22:44:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:00.594 22:44:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:00.594 22:44:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:00.594 22:44:45 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:00.853 22:44:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:00.853 "name": "Existed_Raid", 00:16:00.853 "uuid": "01ba8c37-99f4-41a0-bde4-0fd7fd2d0d8d", 00:16:00.853 "strip_size_kb": 64, 00:16:00.853 "state": "configuring", 00:16:00.853 "raid_level": "concat", 00:16:00.853 "superblock": true, 00:16:00.853 "num_base_bdevs": 3, 00:16:00.853 "num_base_bdevs_discovered": 1, 00:16:00.853 "num_base_bdevs_operational": 3, 00:16:00.853 "base_bdevs_list": [ 00:16:00.853 { 00:16:00.853 "name": null, 00:16:00.853 "uuid": "bd069afb-1928-4b77-9bf1-a50763428d99", 00:16:00.853 "is_configured": false, 00:16:00.853 "data_offset": 2048, 00:16:00.853 "data_size": 63488 00:16:00.853 }, 00:16:00.853 { 00:16:00.853 "name": null, 00:16:00.853 "uuid": "ddb8fdc8-e559-49d6-b386-dda0eced8080", 00:16:00.853 "is_configured": false, 00:16:00.853 "data_offset": 2048, 00:16:00.853 "data_size": 63488 00:16:00.853 }, 00:16:00.853 { 00:16:00.853 "name": "BaseBdev3", 00:16:00.853 "uuid": "d8e262d2-e222-465b-a232-f3e206623f79", 00:16:00.853 "is_configured": true, 00:16:00.853 "data_offset": 2048, 00:16:00.853 "data_size": 63488 00:16:00.853 } 00:16:00.853 ] 00:16:00.853 }' 00:16:00.853 22:44:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:00.853 22:44:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:01.420 22:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:01.420 22:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:01.678 22:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:01.678 22:44:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:02.242 [2024-07-15 22:44:46.858566] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:02.242 22:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:02.242 22:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:02.242 22:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:02.242 22:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:02.242 22:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:02.242 22:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:02.242 22:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:02.242 22:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:02.242 22:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:02.242 22:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:02.242 22:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.242 22:44:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:02.242 22:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:02.242 "name": 
"Existed_Raid", 00:16:02.242 "uuid": "01ba8c37-99f4-41a0-bde4-0fd7fd2d0d8d", 00:16:02.242 "strip_size_kb": 64, 00:16:02.242 "state": "configuring", 00:16:02.242 "raid_level": "concat", 00:16:02.242 "superblock": true, 00:16:02.242 "num_base_bdevs": 3, 00:16:02.242 "num_base_bdevs_discovered": 2, 00:16:02.242 "num_base_bdevs_operational": 3, 00:16:02.242 "base_bdevs_list": [ 00:16:02.242 { 00:16:02.242 "name": null, 00:16:02.242 "uuid": "bd069afb-1928-4b77-9bf1-a50763428d99", 00:16:02.242 "is_configured": false, 00:16:02.242 "data_offset": 2048, 00:16:02.242 "data_size": 63488 00:16:02.242 }, 00:16:02.242 { 00:16:02.242 "name": "BaseBdev2", 00:16:02.242 "uuid": "ddb8fdc8-e559-49d6-b386-dda0eced8080", 00:16:02.242 "is_configured": true, 00:16:02.242 "data_offset": 2048, 00:16:02.242 "data_size": 63488 00:16:02.242 }, 00:16:02.242 { 00:16:02.242 "name": "BaseBdev3", 00:16:02.242 "uuid": "d8e262d2-e222-465b-a232-f3e206623f79", 00:16:02.242 "is_configured": true, 00:16:02.242 "data_offset": 2048, 00:16:02.242 "data_size": 63488 00:16:02.242 } 00:16:02.242 ] 00:16:02.242 }' 00:16:02.242 22:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:02.242 22:44:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:03.175 22:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.175 22:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:03.175 22:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:03.175 22:44:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.175 22:44:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:03.433 22:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u bd069afb-1928-4b77-9bf1-a50763428d99 00:16:03.692 [2024-07-15 22:44:48.459331] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:03.692 [2024-07-15 22:44:48.459494] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcb5f50 00:16:03.692 [2024-07-15 22:44:48.459508] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:03.692 [2024-07-15 22:44:48.459686] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9bc940 00:16:03.692 [2024-07-15 22:44:48.459802] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcb5f50 00:16:03.692 [2024-07-15 22:44:48.459812] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xcb5f50 00:16:03.692 [2024-07-15 22:44:48.459906] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:03.692 NewBaseBdev 00:16:03.692 22:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:03.692 22:44:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:03.692 22:44:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:03.692 22:44:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:03.692 22:44:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:03.692 22:44:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:03.692 22:44:48 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:03.950 22:44:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:04.208 [ 00:16:04.208 { 00:16:04.208 "name": "NewBaseBdev", 00:16:04.208 "aliases": [ 00:16:04.208 "bd069afb-1928-4b77-9bf1-a50763428d99" 00:16:04.208 ], 00:16:04.208 "product_name": "Malloc disk", 00:16:04.208 "block_size": 512, 00:16:04.208 "num_blocks": 65536, 00:16:04.208 "uuid": "bd069afb-1928-4b77-9bf1-a50763428d99", 00:16:04.208 "assigned_rate_limits": { 00:16:04.208 "rw_ios_per_sec": 0, 00:16:04.208 "rw_mbytes_per_sec": 0, 00:16:04.208 "r_mbytes_per_sec": 0, 00:16:04.208 "w_mbytes_per_sec": 0 00:16:04.208 }, 00:16:04.208 "claimed": true, 00:16:04.208 "claim_type": "exclusive_write", 00:16:04.208 "zoned": false, 00:16:04.208 "supported_io_types": { 00:16:04.208 "read": true, 00:16:04.208 "write": true, 00:16:04.208 "unmap": true, 00:16:04.208 "flush": true, 00:16:04.208 "reset": true, 00:16:04.208 "nvme_admin": false, 00:16:04.208 "nvme_io": false, 00:16:04.208 "nvme_io_md": false, 00:16:04.208 "write_zeroes": true, 00:16:04.208 "zcopy": true, 00:16:04.208 "get_zone_info": false, 00:16:04.208 "zone_management": false, 00:16:04.208 "zone_append": false, 00:16:04.208 "compare": false, 00:16:04.208 "compare_and_write": false, 00:16:04.208 "abort": true, 00:16:04.208 "seek_hole": false, 00:16:04.208 "seek_data": false, 00:16:04.208 "copy": true, 00:16:04.208 "nvme_iov_md": false 00:16:04.208 }, 00:16:04.208 "memory_domains": [ 00:16:04.208 { 00:16:04.208 "dma_device_id": "system", 00:16:04.208 "dma_device_type": 1 00:16:04.208 }, 00:16:04.208 { 00:16:04.208 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:04.208 "dma_device_type": 2 00:16:04.208 } 
00:16:04.208 ], 00:16:04.208 "driver_specific": {} 00:16:04.208 } 00:16:04.208 ] 00:16:04.208 22:44:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:04.208 22:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:16:04.208 22:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:04.208 22:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:04.208 22:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:04.208 22:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:04.208 22:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:04.208 22:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:04.208 22:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:04.209 22:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:04.209 22:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:04.209 22:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:04.209 22:44:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:04.467 22:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:04.467 "name": "Existed_Raid", 00:16:04.467 "uuid": "01ba8c37-99f4-41a0-bde4-0fd7fd2d0d8d", 00:16:04.467 "strip_size_kb": 64, 00:16:04.467 "state": "online", 00:16:04.467 
"raid_level": "concat", 00:16:04.467 "superblock": true, 00:16:04.467 "num_base_bdevs": 3, 00:16:04.467 "num_base_bdevs_discovered": 3, 00:16:04.467 "num_base_bdevs_operational": 3, 00:16:04.467 "base_bdevs_list": [ 00:16:04.467 { 00:16:04.467 "name": "NewBaseBdev", 00:16:04.467 "uuid": "bd069afb-1928-4b77-9bf1-a50763428d99", 00:16:04.467 "is_configured": true, 00:16:04.467 "data_offset": 2048, 00:16:04.467 "data_size": 63488 00:16:04.467 }, 00:16:04.467 { 00:16:04.467 "name": "BaseBdev2", 00:16:04.467 "uuid": "ddb8fdc8-e559-49d6-b386-dda0eced8080", 00:16:04.467 "is_configured": true, 00:16:04.467 "data_offset": 2048, 00:16:04.467 "data_size": 63488 00:16:04.467 }, 00:16:04.467 { 00:16:04.467 "name": "BaseBdev3", 00:16:04.467 "uuid": "d8e262d2-e222-465b-a232-f3e206623f79", 00:16:04.467 "is_configured": true, 00:16:04.467 "data_offset": 2048, 00:16:04.467 "data_size": 63488 00:16:04.467 } 00:16:04.467 ] 00:16:04.467 }' 00:16:04.467 22:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:04.467 22:44:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:05.034 22:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:05.034 22:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:05.034 22:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:05.034 22:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:05.034 22:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:05.034 22:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:05.034 22:44:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:05.034 22:44:49 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:05.292 [2024-07-15 22:44:50.055884] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:05.292 22:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:05.292 "name": "Existed_Raid", 00:16:05.292 "aliases": [ 00:16:05.292 "01ba8c37-99f4-41a0-bde4-0fd7fd2d0d8d" 00:16:05.292 ], 00:16:05.292 "product_name": "Raid Volume", 00:16:05.292 "block_size": 512, 00:16:05.292 "num_blocks": 190464, 00:16:05.292 "uuid": "01ba8c37-99f4-41a0-bde4-0fd7fd2d0d8d", 00:16:05.292 "assigned_rate_limits": { 00:16:05.292 "rw_ios_per_sec": 0, 00:16:05.292 "rw_mbytes_per_sec": 0, 00:16:05.292 "r_mbytes_per_sec": 0, 00:16:05.292 "w_mbytes_per_sec": 0 00:16:05.292 }, 00:16:05.293 "claimed": false, 00:16:05.293 "zoned": false, 00:16:05.293 "supported_io_types": { 00:16:05.293 "read": true, 00:16:05.293 "write": true, 00:16:05.293 "unmap": true, 00:16:05.293 "flush": true, 00:16:05.293 "reset": true, 00:16:05.293 "nvme_admin": false, 00:16:05.293 "nvme_io": false, 00:16:05.293 "nvme_io_md": false, 00:16:05.293 "write_zeroes": true, 00:16:05.293 "zcopy": false, 00:16:05.293 "get_zone_info": false, 00:16:05.293 "zone_management": false, 00:16:05.293 "zone_append": false, 00:16:05.293 "compare": false, 00:16:05.293 "compare_and_write": false, 00:16:05.293 "abort": false, 00:16:05.293 "seek_hole": false, 00:16:05.293 "seek_data": false, 00:16:05.293 "copy": false, 00:16:05.293 "nvme_iov_md": false 00:16:05.293 }, 00:16:05.293 "memory_domains": [ 00:16:05.293 { 00:16:05.293 "dma_device_id": "system", 00:16:05.293 "dma_device_type": 1 00:16:05.293 }, 00:16:05.293 { 00:16:05.293 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.293 "dma_device_type": 2 00:16:05.293 }, 00:16:05.293 { 00:16:05.293 "dma_device_id": "system", 00:16:05.293 "dma_device_type": 1 00:16:05.293 }, 
00:16:05.293 { 00:16:05.293 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.293 "dma_device_type": 2 00:16:05.293 }, 00:16:05.293 { 00:16:05.293 "dma_device_id": "system", 00:16:05.293 "dma_device_type": 1 00:16:05.293 }, 00:16:05.293 { 00:16:05.293 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.293 "dma_device_type": 2 00:16:05.293 } 00:16:05.293 ], 00:16:05.293 "driver_specific": { 00:16:05.293 "raid": { 00:16:05.293 "uuid": "01ba8c37-99f4-41a0-bde4-0fd7fd2d0d8d", 00:16:05.293 "strip_size_kb": 64, 00:16:05.293 "state": "online", 00:16:05.293 "raid_level": "concat", 00:16:05.293 "superblock": true, 00:16:05.293 "num_base_bdevs": 3, 00:16:05.293 "num_base_bdevs_discovered": 3, 00:16:05.293 "num_base_bdevs_operational": 3, 00:16:05.293 "base_bdevs_list": [ 00:16:05.293 { 00:16:05.293 "name": "NewBaseBdev", 00:16:05.293 "uuid": "bd069afb-1928-4b77-9bf1-a50763428d99", 00:16:05.293 "is_configured": true, 00:16:05.293 "data_offset": 2048, 00:16:05.293 "data_size": 63488 00:16:05.293 }, 00:16:05.293 { 00:16:05.293 "name": "BaseBdev2", 00:16:05.293 "uuid": "ddb8fdc8-e559-49d6-b386-dda0eced8080", 00:16:05.293 "is_configured": true, 00:16:05.293 "data_offset": 2048, 00:16:05.293 "data_size": 63488 00:16:05.293 }, 00:16:05.293 { 00:16:05.293 "name": "BaseBdev3", 00:16:05.293 "uuid": "d8e262d2-e222-465b-a232-f3e206623f79", 00:16:05.293 "is_configured": true, 00:16:05.293 "data_offset": 2048, 00:16:05.293 "data_size": 63488 00:16:05.293 } 00:16:05.293 ] 00:16:05.293 } 00:16:05.293 } 00:16:05.293 }' 00:16:05.293 22:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:05.293 22:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:05.293 BaseBdev2 00:16:05.293 BaseBdev3' 00:16:05.293 22:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:05.293 
22:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:05.293 22:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:05.551 22:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:05.551 "name": "NewBaseBdev", 00:16:05.551 "aliases": [ 00:16:05.551 "bd069afb-1928-4b77-9bf1-a50763428d99" 00:16:05.551 ], 00:16:05.551 "product_name": "Malloc disk", 00:16:05.551 "block_size": 512, 00:16:05.551 "num_blocks": 65536, 00:16:05.551 "uuid": "bd069afb-1928-4b77-9bf1-a50763428d99", 00:16:05.551 "assigned_rate_limits": { 00:16:05.551 "rw_ios_per_sec": 0, 00:16:05.551 "rw_mbytes_per_sec": 0, 00:16:05.551 "r_mbytes_per_sec": 0, 00:16:05.551 "w_mbytes_per_sec": 0 00:16:05.551 }, 00:16:05.551 "claimed": true, 00:16:05.551 "claim_type": "exclusive_write", 00:16:05.551 "zoned": false, 00:16:05.551 "supported_io_types": { 00:16:05.551 "read": true, 00:16:05.551 "write": true, 00:16:05.551 "unmap": true, 00:16:05.551 "flush": true, 00:16:05.551 "reset": true, 00:16:05.551 "nvme_admin": false, 00:16:05.551 "nvme_io": false, 00:16:05.551 "nvme_io_md": false, 00:16:05.551 "write_zeroes": true, 00:16:05.551 "zcopy": true, 00:16:05.551 "get_zone_info": false, 00:16:05.551 "zone_management": false, 00:16:05.551 "zone_append": false, 00:16:05.551 "compare": false, 00:16:05.551 "compare_and_write": false, 00:16:05.551 "abort": true, 00:16:05.551 "seek_hole": false, 00:16:05.551 "seek_data": false, 00:16:05.551 "copy": true, 00:16:05.551 "nvme_iov_md": false 00:16:05.551 }, 00:16:05.551 "memory_domains": [ 00:16:05.551 { 00:16:05.551 "dma_device_id": "system", 00:16:05.551 "dma_device_type": 1 00:16:05.551 }, 00:16:05.551 { 00:16:05.551 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.551 "dma_device_type": 2 00:16:05.551 } 00:16:05.551 ], 00:16:05.551 
"driver_specific": {} 00:16:05.551 }' 00:16:05.551 22:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:05.551 22:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:05.810 22:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:05.810 22:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:05.810 22:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:05.810 22:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:05.810 22:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:05.810 22:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:05.810 22:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:05.810 22:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:05.810 22:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:06.068 22:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:06.068 22:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:06.068 22:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:06.068 22:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:06.327 22:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:06.327 "name": "BaseBdev2", 00:16:06.327 "aliases": [ 00:16:06.327 "ddb8fdc8-e559-49d6-b386-dda0eced8080" 00:16:06.327 ], 00:16:06.327 "product_name": 
"Malloc disk", 00:16:06.327 "block_size": 512, 00:16:06.327 "num_blocks": 65536, 00:16:06.327 "uuid": "ddb8fdc8-e559-49d6-b386-dda0eced8080", 00:16:06.327 "assigned_rate_limits": { 00:16:06.327 "rw_ios_per_sec": 0, 00:16:06.327 "rw_mbytes_per_sec": 0, 00:16:06.327 "r_mbytes_per_sec": 0, 00:16:06.327 "w_mbytes_per_sec": 0 00:16:06.327 }, 00:16:06.327 "claimed": true, 00:16:06.327 "claim_type": "exclusive_write", 00:16:06.327 "zoned": false, 00:16:06.327 "supported_io_types": { 00:16:06.327 "read": true, 00:16:06.327 "write": true, 00:16:06.327 "unmap": true, 00:16:06.327 "flush": true, 00:16:06.327 "reset": true, 00:16:06.327 "nvme_admin": false, 00:16:06.327 "nvme_io": false, 00:16:06.327 "nvme_io_md": false, 00:16:06.327 "write_zeroes": true, 00:16:06.327 "zcopy": true, 00:16:06.327 "get_zone_info": false, 00:16:06.327 "zone_management": false, 00:16:06.327 "zone_append": false, 00:16:06.327 "compare": false, 00:16:06.327 "compare_and_write": false, 00:16:06.327 "abort": true, 00:16:06.327 "seek_hole": false, 00:16:06.327 "seek_data": false, 00:16:06.327 "copy": true, 00:16:06.327 "nvme_iov_md": false 00:16:06.327 }, 00:16:06.327 "memory_domains": [ 00:16:06.327 { 00:16:06.327 "dma_device_id": "system", 00:16:06.327 "dma_device_type": 1 00:16:06.327 }, 00:16:06.327 { 00:16:06.327 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.327 "dma_device_type": 2 00:16:06.327 } 00:16:06.327 ], 00:16:06.327 "driver_specific": {} 00:16:06.327 }' 00:16:06.327 22:44:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:06.327 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:06.327 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:06.327 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:06.327 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:06.327 
22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:06.327 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:06.327 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:06.585 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:06.585 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:06.585 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:06.585 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:06.585 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:06.585 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:06.585 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:06.843 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:06.844 "name": "BaseBdev3", 00:16:06.844 "aliases": [ 00:16:06.844 "d8e262d2-e222-465b-a232-f3e206623f79" 00:16:06.844 ], 00:16:06.844 "product_name": "Malloc disk", 00:16:06.844 "block_size": 512, 00:16:06.844 "num_blocks": 65536, 00:16:06.844 "uuid": "d8e262d2-e222-465b-a232-f3e206623f79", 00:16:06.844 "assigned_rate_limits": { 00:16:06.844 "rw_ios_per_sec": 0, 00:16:06.844 "rw_mbytes_per_sec": 0, 00:16:06.844 "r_mbytes_per_sec": 0, 00:16:06.844 "w_mbytes_per_sec": 0 00:16:06.844 }, 00:16:06.844 "claimed": true, 00:16:06.844 "claim_type": "exclusive_write", 00:16:06.844 "zoned": false, 00:16:06.844 "supported_io_types": { 00:16:06.844 "read": true, 00:16:06.844 "write": true, 00:16:06.844 "unmap": true, 
00:16:06.844 "flush": true, 00:16:06.844 "reset": true, 00:16:06.844 "nvme_admin": false, 00:16:06.844 "nvme_io": false, 00:16:06.844 "nvme_io_md": false, 00:16:06.844 "write_zeroes": true, 00:16:06.844 "zcopy": true, 00:16:06.844 "get_zone_info": false, 00:16:06.844 "zone_management": false, 00:16:06.844 "zone_append": false, 00:16:06.844 "compare": false, 00:16:06.844 "compare_and_write": false, 00:16:06.844 "abort": true, 00:16:06.844 "seek_hole": false, 00:16:06.844 "seek_data": false, 00:16:06.844 "copy": true, 00:16:06.844 "nvme_iov_md": false 00:16:06.844 }, 00:16:06.844 "memory_domains": [ 00:16:06.844 { 00:16:06.844 "dma_device_id": "system", 00:16:06.844 "dma_device_type": 1 00:16:06.844 }, 00:16:06.844 { 00:16:06.844 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.844 "dma_device_type": 2 00:16:06.844 } 00:16:06.844 ], 00:16:06.844 "driver_specific": {} 00:16:06.844 }' 00:16:06.844 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:06.844 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:06.844 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:06.844 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:06.844 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:06.844 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:06.844 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.103 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.103 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:07.103 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:07.103 22:44:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:07.103 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:07.103 22:44:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:07.362 [2024-07-15 22:44:52.133093] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:07.362 [2024-07-15 22:44:52.133122] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:07.362 [2024-07-15 22:44:52.133174] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:07.362 [2024-07-15 22:44:52.133227] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:07.362 [2024-07-15 22:44:52.133240] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcb5f50 name Existed_Raid, state offline 00:16:07.362 22:44:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2736830 00:16:07.362 22:44:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2736830 ']' 00:16:07.362 22:44:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2736830 00:16:07.362 22:44:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:16:07.362 22:44:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:07.362 22:44:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2736830 00:16:07.362 22:44:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:07.362 22:44:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:16:07.362 22:44:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2736830' 00:16:07.362 killing process with pid 2736830 00:16:07.362 22:44:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2736830 00:16:07.362 [2024-07-15 22:44:52.207229] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:07.362 22:44:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2736830 00:16:07.362 [2024-07-15 22:44:52.234653] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:07.621 22:44:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:16:07.621 00:16:07.621 real 0m29.575s 00:16:07.621 user 0m54.291s 00:16:07.621 sys 0m5.230s 00:16:07.621 22:44:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:07.621 22:44:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:07.621 ************************************ 00:16:07.621 END TEST raid_state_function_test_sb 00:16:07.621 ************************************ 00:16:07.621 22:44:52 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:07.621 22:44:52 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:16:07.621 22:44:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:16:07.621 22:44:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:07.621 22:44:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:07.882 ************************************ 00:16:07.882 START TEST raid_superblock_test 00:16:07.882 ************************************ 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 3 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # 
local raid_level=concat 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2741275 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 
2741275 /var/tmp/spdk-raid.sock 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2741275 ']' 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:07.882 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:07.882 22:44:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:07.882 [2024-07-15 22:44:52.598354] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:16:07.882 [2024-07-15 22:44:52.598423] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2741275 ] 00:16:07.882 [2024-07-15 22:44:52.729412] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:08.141 [2024-07-15 22:44:52.828474] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:08.400 [2024-07-15 22:44:53.198370] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:08.400 [2024-07-15 22:44:53.198417] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:08.658 22:44:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:08.658 22:44:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:16:08.658 22:44:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:16:08.658 22:44:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:08.659 22:44:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:16:08.659 22:44:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:16:08.659 22:44:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:08.659 22:44:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:08.659 22:44:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:08.659 22:44:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:08.659 22:44:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:16:08.916 malloc1 00:16:08.916 22:44:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:09.174 [2024-07-15 22:44:53.958668] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:09.174 [2024-07-15 22:44:53.958718] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:09.174 [2024-07-15 22:44:53.958737] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b9a570 00:16:09.174 [2024-07-15 22:44:53.958750] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:09.174 [2024-07-15 22:44:53.960348] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:09.174 [2024-07-15 22:44:53.960379] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:09.174 pt1 00:16:09.174 22:44:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:09.174 22:44:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:09.174 22:44:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:16:09.174 22:44:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:16:09.174 22:44:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:09.174 22:44:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:09.174 22:44:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:09.174 22:44:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:09.174 22:44:53 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:09.434 malloc2 00:16:09.434 22:44:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:09.753 [2024-07-15 22:44:54.464814] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:09.753 [2024-07-15 22:44:54.464865] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:09.753 [2024-07-15 22:44:54.464883] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b9b970 00:16:09.753 [2024-07-15 22:44:54.464896] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:09.753 [2024-07-15 22:44:54.466462] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:09.753 [2024-07-15 22:44:54.466492] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:09.753 pt2 00:16:09.753 22:44:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:09.753 22:44:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:09.753 22:44:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:16:09.753 22:44:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:16:09.753 22:44:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:09.753 22:44:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:09.753 22:44:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:09.753 22:44:54 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:09.753 22:44:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:16:10.011 malloc3 00:16:10.011 22:44:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:10.289 [2024-07-15 22:44:54.970893] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:10.289 [2024-07-15 22:44:54.970951] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:10.289 [2024-07-15 22:44:54.970969] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d32340 00:16:10.289 [2024-07-15 22:44:54.970982] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:10.289 [2024-07-15 22:44:54.972395] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:10.289 [2024-07-15 22:44:54.972424] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:10.289 pt3 00:16:10.289 22:44:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:10.289 22:44:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:10.289 22:44:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:16:10.548 [2024-07-15 22:44:55.219567] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:10.548 [2024-07-15 22:44:55.220763] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:16:10.548 [2024-07-15 22:44:55.220817] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:10.548 [2024-07-15 22:44:55.220975] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b92ea0 00:16:10.548 [2024-07-15 22:44:55.220988] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:10.548 [2024-07-15 22:44:55.221175] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b9a240 00:16:10.548 [2024-07-15 22:44:55.221315] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b92ea0 00:16:10.548 [2024-07-15 22:44:55.221325] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b92ea0 00:16:10.548 [2024-07-15 22:44:55.221419] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:10.548 22:44:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:10.548 22:44:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:10.548 22:44:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:10.548 22:44:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:10.548 22:44:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:10.548 22:44:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:10.548 22:44:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:10.548 22:44:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:10.548 22:44:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:10.548 22:44:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:10.548 
22:44:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.548 22:44:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:10.806 22:44:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:10.806 "name": "raid_bdev1", 00:16:10.806 "uuid": "48218300-cc23-4787-8205-c62d9336497e", 00:16:10.806 "strip_size_kb": 64, 00:16:10.806 "state": "online", 00:16:10.806 "raid_level": "concat", 00:16:10.806 "superblock": true, 00:16:10.806 "num_base_bdevs": 3, 00:16:10.806 "num_base_bdevs_discovered": 3, 00:16:10.806 "num_base_bdevs_operational": 3, 00:16:10.807 "base_bdevs_list": [ 00:16:10.807 { 00:16:10.807 "name": "pt1", 00:16:10.807 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:10.807 "is_configured": true, 00:16:10.807 "data_offset": 2048, 00:16:10.807 "data_size": 63488 00:16:10.807 }, 00:16:10.807 { 00:16:10.807 "name": "pt2", 00:16:10.807 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:10.807 "is_configured": true, 00:16:10.807 "data_offset": 2048, 00:16:10.807 "data_size": 63488 00:16:10.807 }, 00:16:10.807 { 00:16:10.807 "name": "pt3", 00:16:10.807 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:10.807 "is_configured": true, 00:16:10.807 "data_offset": 2048, 00:16:10.807 "data_size": 63488 00:16:10.807 } 00:16:10.807 ] 00:16:10.807 }' 00:16:10.807 22:44:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:10.807 22:44:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:11.374 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:16:11.374 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:11.374 22:44:56 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:11.374 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:11.374 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:11.374 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:11.374 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:11.374 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:11.374 [2024-07-15 22:44:56.262607] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:11.633 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:11.633 "name": "raid_bdev1", 00:16:11.633 "aliases": [ 00:16:11.633 "48218300-cc23-4787-8205-c62d9336497e" 00:16:11.633 ], 00:16:11.633 "product_name": "Raid Volume", 00:16:11.633 "block_size": 512, 00:16:11.633 "num_blocks": 190464, 00:16:11.633 "uuid": "48218300-cc23-4787-8205-c62d9336497e", 00:16:11.633 "assigned_rate_limits": { 00:16:11.633 "rw_ios_per_sec": 0, 00:16:11.633 "rw_mbytes_per_sec": 0, 00:16:11.633 "r_mbytes_per_sec": 0, 00:16:11.633 "w_mbytes_per_sec": 0 00:16:11.633 }, 00:16:11.633 "claimed": false, 00:16:11.633 "zoned": false, 00:16:11.633 "supported_io_types": { 00:16:11.633 "read": true, 00:16:11.633 "write": true, 00:16:11.633 "unmap": true, 00:16:11.633 "flush": true, 00:16:11.633 "reset": true, 00:16:11.633 "nvme_admin": false, 00:16:11.633 "nvme_io": false, 00:16:11.633 "nvme_io_md": false, 00:16:11.633 "write_zeroes": true, 00:16:11.633 "zcopy": false, 00:16:11.633 "get_zone_info": false, 00:16:11.633 "zone_management": false, 00:16:11.633 "zone_append": false, 00:16:11.633 "compare": false, 00:16:11.633 "compare_and_write": false, 00:16:11.633 "abort": false, 00:16:11.633 
"seek_hole": false, 00:16:11.633 "seek_data": false, 00:16:11.633 "copy": false, 00:16:11.633 "nvme_iov_md": false 00:16:11.633 }, 00:16:11.633 "memory_domains": [ 00:16:11.633 { 00:16:11.633 "dma_device_id": "system", 00:16:11.633 "dma_device_type": 1 00:16:11.633 }, 00:16:11.633 { 00:16:11.634 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.634 "dma_device_type": 2 00:16:11.634 }, 00:16:11.634 { 00:16:11.634 "dma_device_id": "system", 00:16:11.634 "dma_device_type": 1 00:16:11.634 }, 00:16:11.634 { 00:16:11.634 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.634 "dma_device_type": 2 00:16:11.634 }, 00:16:11.634 { 00:16:11.634 "dma_device_id": "system", 00:16:11.634 "dma_device_type": 1 00:16:11.634 }, 00:16:11.634 { 00:16:11.634 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.634 "dma_device_type": 2 00:16:11.634 } 00:16:11.634 ], 00:16:11.634 "driver_specific": { 00:16:11.634 "raid": { 00:16:11.634 "uuid": "48218300-cc23-4787-8205-c62d9336497e", 00:16:11.634 "strip_size_kb": 64, 00:16:11.634 "state": "online", 00:16:11.634 "raid_level": "concat", 00:16:11.634 "superblock": true, 00:16:11.634 "num_base_bdevs": 3, 00:16:11.634 "num_base_bdevs_discovered": 3, 00:16:11.634 "num_base_bdevs_operational": 3, 00:16:11.634 "base_bdevs_list": [ 00:16:11.634 { 00:16:11.634 "name": "pt1", 00:16:11.634 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:11.634 "is_configured": true, 00:16:11.634 "data_offset": 2048, 00:16:11.634 "data_size": 63488 00:16:11.634 }, 00:16:11.634 { 00:16:11.634 "name": "pt2", 00:16:11.634 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:11.634 "is_configured": true, 00:16:11.634 "data_offset": 2048, 00:16:11.634 "data_size": 63488 00:16:11.634 }, 00:16:11.634 { 00:16:11.634 "name": "pt3", 00:16:11.634 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:11.634 "is_configured": true, 00:16:11.634 "data_offset": 2048, 00:16:11.634 "data_size": 63488 00:16:11.634 } 00:16:11.634 ] 00:16:11.634 } 00:16:11.634 } 00:16:11.634 }' 
00:16:11.634 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:11.634 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:11.634 pt2 00:16:11.634 pt3' 00:16:11.634 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:11.634 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:11.634 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:11.893 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:11.893 "name": "pt1", 00:16:11.893 "aliases": [ 00:16:11.893 "00000000-0000-0000-0000-000000000001" 00:16:11.893 ], 00:16:11.893 "product_name": "passthru", 00:16:11.893 "block_size": 512, 00:16:11.893 "num_blocks": 65536, 00:16:11.893 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:11.893 "assigned_rate_limits": { 00:16:11.893 "rw_ios_per_sec": 0, 00:16:11.893 "rw_mbytes_per_sec": 0, 00:16:11.893 "r_mbytes_per_sec": 0, 00:16:11.893 "w_mbytes_per_sec": 0 00:16:11.893 }, 00:16:11.893 "claimed": true, 00:16:11.893 "claim_type": "exclusive_write", 00:16:11.893 "zoned": false, 00:16:11.893 "supported_io_types": { 00:16:11.893 "read": true, 00:16:11.893 "write": true, 00:16:11.893 "unmap": true, 00:16:11.893 "flush": true, 00:16:11.893 "reset": true, 00:16:11.893 "nvme_admin": false, 00:16:11.893 "nvme_io": false, 00:16:11.893 "nvme_io_md": false, 00:16:11.893 "write_zeroes": true, 00:16:11.893 "zcopy": true, 00:16:11.893 "get_zone_info": false, 00:16:11.893 "zone_management": false, 00:16:11.893 "zone_append": false, 00:16:11.893 "compare": false, 00:16:11.893 "compare_and_write": false, 00:16:11.893 "abort": true, 00:16:11.893 "seek_hole": false, 00:16:11.893 
"seek_data": false, 00:16:11.893 "copy": true, 00:16:11.893 "nvme_iov_md": false 00:16:11.893 }, 00:16:11.893 "memory_domains": [ 00:16:11.893 { 00:16:11.893 "dma_device_id": "system", 00:16:11.893 "dma_device_type": 1 00:16:11.893 }, 00:16:11.893 { 00:16:11.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.893 "dma_device_type": 2 00:16:11.893 } 00:16:11.893 ], 00:16:11.893 "driver_specific": { 00:16:11.893 "passthru": { 00:16:11.893 "name": "pt1", 00:16:11.893 "base_bdev_name": "malloc1" 00:16:11.893 } 00:16:11.893 } 00:16:11.893 }' 00:16:11.893 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.893 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:11.893 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:11.893 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.893 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:11.893 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:11.893 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:12.151 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:12.151 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:12.152 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:12.152 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:12.152 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:12.152 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:12.152 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:12.152 22:44:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:12.410 22:44:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:12.410 "name": "pt2", 00:16:12.410 "aliases": [ 00:16:12.410 "00000000-0000-0000-0000-000000000002" 00:16:12.410 ], 00:16:12.410 "product_name": "passthru", 00:16:12.410 "block_size": 512, 00:16:12.411 "num_blocks": 65536, 00:16:12.411 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:12.411 "assigned_rate_limits": { 00:16:12.411 "rw_ios_per_sec": 0, 00:16:12.411 "rw_mbytes_per_sec": 0, 00:16:12.411 "r_mbytes_per_sec": 0, 00:16:12.411 "w_mbytes_per_sec": 0 00:16:12.411 }, 00:16:12.411 "claimed": true, 00:16:12.411 "claim_type": "exclusive_write", 00:16:12.411 "zoned": false, 00:16:12.411 "supported_io_types": { 00:16:12.411 "read": true, 00:16:12.411 "write": true, 00:16:12.411 "unmap": true, 00:16:12.411 "flush": true, 00:16:12.411 "reset": true, 00:16:12.411 "nvme_admin": false, 00:16:12.411 "nvme_io": false, 00:16:12.411 "nvme_io_md": false, 00:16:12.411 "write_zeroes": true, 00:16:12.411 "zcopy": true, 00:16:12.411 "get_zone_info": false, 00:16:12.411 "zone_management": false, 00:16:12.411 "zone_append": false, 00:16:12.411 "compare": false, 00:16:12.411 "compare_and_write": false, 00:16:12.411 "abort": true, 00:16:12.411 "seek_hole": false, 00:16:12.411 "seek_data": false, 00:16:12.411 "copy": true, 00:16:12.411 "nvme_iov_md": false 00:16:12.411 }, 00:16:12.411 "memory_domains": [ 00:16:12.411 { 00:16:12.411 "dma_device_id": "system", 00:16:12.411 "dma_device_type": 1 00:16:12.411 }, 00:16:12.411 { 00:16:12.411 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:12.411 "dma_device_type": 2 00:16:12.411 } 00:16:12.411 ], 00:16:12.411 "driver_specific": { 00:16:12.411 "passthru": { 00:16:12.411 "name": "pt2", 00:16:12.411 "base_bdev_name": "malloc2" 00:16:12.411 } 00:16:12.411 } 00:16:12.411 }' 00:16:12.411 22:44:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:12.411 22:44:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:12.411 22:44:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:12.411 22:44:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:12.669 22:44:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:12.669 22:44:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:12.669 22:44:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:12.669 22:44:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:12.669 22:44:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:12.669 22:44:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:12.669 22:44:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:12.669 22:44:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:12.669 22:44:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:12.669 22:44:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:12.669 22:44:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:12.926 22:44:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:12.926 "name": "pt3", 00:16:12.926 "aliases": [ 00:16:12.926 "00000000-0000-0000-0000-000000000003" 00:16:12.926 ], 00:16:12.926 "product_name": "passthru", 00:16:12.926 "block_size": 512, 00:16:12.926 "num_blocks": 65536, 00:16:12.926 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:12.926 "assigned_rate_limits": { 
00:16:12.926 "rw_ios_per_sec": 0, 00:16:12.926 "rw_mbytes_per_sec": 0, 00:16:12.926 "r_mbytes_per_sec": 0, 00:16:12.926 "w_mbytes_per_sec": 0 00:16:12.926 }, 00:16:12.926 "claimed": true, 00:16:12.926 "claim_type": "exclusive_write", 00:16:12.926 "zoned": false, 00:16:12.926 "supported_io_types": { 00:16:12.926 "read": true, 00:16:12.926 "write": true, 00:16:12.926 "unmap": true, 00:16:12.926 "flush": true, 00:16:12.926 "reset": true, 00:16:12.926 "nvme_admin": false, 00:16:12.926 "nvme_io": false, 00:16:12.926 "nvme_io_md": false, 00:16:12.926 "write_zeroes": true, 00:16:12.926 "zcopy": true, 00:16:12.926 "get_zone_info": false, 00:16:12.926 "zone_management": false, 00:16:12.926 "zone_append": false, 00:16:12.926 "compare": false, 00:16:12.926 "compare_and_write": false, 00:16:12.926 "abort": true, 00:16:12.926 "seek_hole": false, 00:16:12.926 "seek_data": false, 00:16:12.926 "copy": true, 00:16:12.926 "nvme_iov_md": false 00:16:12.926 }, 00:16:12.926 "memory_domains": [ 00:16:12.926 { 00:16:12.926 "dma_device_id": "system", 00:16:12.926 "dma_device_type": 1 00:16:12.926 }, 00:16:12.926 { 00:16:12.926 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:12.926 "dma_device_type": 2 00:16:12.926 } 00:16:12.926 ], 00:16:12.927 "driver_specific": { 00:16:12.927 "passthru": { 00:16:12.927 "name": "pt3", 00:16:12.927 "base_bdev_name": "malloc3" 00:16:12.927 } 00:16:12.927 } 00:16:12.927 }' 00:16:12.927 22:44:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:12.927 22:44:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:13.185 22:44:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:13.185 22:44:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:13.185 22:44:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:13.185 22:44:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:16:13.185 22:44:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:13.185 22:44:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:13.185 22:44:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:13.185 22:44:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:13.185 22:44:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:13.443 22:44:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:13.443 22:44:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:13.443 22:44:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:16:13.701 [2024-07-15 22:44:58.360165] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:13.701 22:44:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=48218300-cc23-4787-8205-c62d9336497e 00:16:13.701 22:44:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 48218300-cc23-4787-8205-c62d9336497e ']' 00:16:13.701 22:44:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:13.960 [2024-07-15 22:44:58.612548] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:13.960 [2024-07-15 22:44:58.612571] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:13.960 [2024-07-15 22:44:58.612628] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:13.960 [2024-07-15 22:44:58.612684] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:16:13.960 [2024-07-15 22:44:58.612695] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b92ea0 name raid_bdev1, state offline 00:16:13.960 22:44:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.960 22:44:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:16:14.218 22:44:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:16:14.218 22:44:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:16:14.218 22:44:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:14.218 22:44:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:16:14.218 22:44:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:14.218 22:44:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:14.478 22:44:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:14.478 22:44:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:14.737 22:44:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:16:14.737 22:44:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:16:14.995 22:44:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:16:14.995 22:44:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:14.995 22:44:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:16:14.995 22:44:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:14.995 22:44:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:14.995 22:44:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:14.995 22:44:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:14.995 22:44:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:14.995 22:44:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:14.995 22:44:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:14.995 22:44:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:14.995 22:44:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:16:14.995 22:44:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:15.254 [2024-07-15 22:45:00.068369] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:16:15.254 [2024-07-15 22:45:00.069741] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:16:15.254 [2024-07-15 22:45:00.069785] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:16:15.254 [2024-07-15 22:45:00.069832] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:16:15.254 [2024-07-15 22:45:00.069874] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:16:15.254 [2024-07-15 22:45:00.069896] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:16:15.254 [2024-07-15 22:45:00.069915] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:15.254 [2024-07-15 22:45:00.069931] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d3dff0 name raid_bdev1, state configuring 00:16:15.254 request: 00:16:15.254 { 00:16:15.254 "name": "raid_bdev1", 00:16:15.254 "raid_level": "concat", 00:16:15.254 "base_bdevs": [ 00:16:15.254 "malloc1", 00:16:15.254 "malloc2", 00:16:15.254 "malloc3" 00:16:15.254 ], 00:16:15.254 "strip_size_kb": 64, 00:16:15.254 "superblock": false, 00:16:15.254 "method": "bdev_raid_create", 00:16:15.254 "req_id": 1 00:16:15.254 } 00:16:15.254 Got JSON-RPC error response 00:16:15.254 response: 00:16:15.254 { 00:16:15.254 "code": -17, 00:16:15.254 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:16:15.254 } 00:16:15.254 22:45:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:16:15.254 22:45:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:16:15.254 22:45:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:16:15.254 22:45:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:16:15.254 22:45:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.254 22:45:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:16:15.513 22:45:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:16:15.513 22:45:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:16:15.513 22:45:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:15.772 [2024-07-15 22:45:00.573617] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:15.772 [2024-07-15 22:45:00.573663] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:15.772 [2024-07-15 22:45:00.573684] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b9a7a0 00:16:15.772 [2024-07-15 22:45:00.573697] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:15.772 [2024-07-15 22:45:00.575308] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:15.772 [2024-07-15 22:45:00.575338] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:15.772 [2024-07-15 22:45:00.575406] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:16:15.772 [2024-07-15 22:45:00.575433] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:15.772 pt1 00:16:15.772 22:45:00 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:16:15.772 22:45:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:15.772 22:45:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:15.772 22:45:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:15.772 22:45:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:15.772 22:45:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:15.772 22:45:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:15.772 22:45:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:15.773 22:45:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:15.773 22:45:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:15.773 22:45:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.773 22:45:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:16.031 22:45:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:16.031 "name": "raid_bdev1", 00:16:16.031 "uuid": "48218300-cc23-4787-8205-c62d9336497e", 00:16:16.031 "strip_size_kb": 64, 00:16:16.031 "state": "configuring", 00:16:16.031 "raid_level": "concat", 00:16:16.031 "superblock": true, 00:16:16.031 "num_base_bdevs": 3, 00:16:16.031 "num_base_bdevs_discovered": 1, 00:16:16.031 "num_base_bdevs_operational": 3, 00:16:16.031 "base_bdevs_list": [ 00:16:16.031 { 00:16:16.031 "name": "pt1", 00:16:16.031 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:16.031 
"is_configured": true, 00:16:16.031 "data_offset": 2048, 00:16:16.031 "data_size": 63488 00:16:16.031 }, 00:16:16.031 { 00:16:16.031 "name": null, 00:16:16.031 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:16.031 "is_configured": false, 00:16:16.031 "data_offset": 2048, 00:16:16.031 "data_size": 63488 00:16:16.031 }, 00:16:16.031 { 00:16:16.031 "name": null, 00:16:16.031 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:16.031 "is_configured": false, 00:16:16.031 "data_offset": 2048, 00:16:16.031 "data_size": 63488 00:16:16.031 } 00:16:16.031 ] 00:16:16.031 }' 00:16:16.031 22:45:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:16.031 22:45:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:16.599 22:45:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:16:16.599 22:45:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:16.858 [2024-07-15 22:45:01.668653] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:16.858 [2024-07-15 22:45:01.668702] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:16.858 [2024-07-15 22:45:01.668721] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b91c70 00:16:16.858 [2024-07-15 22:45:01.668733] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:16.858 [2024-07-15 22:45:01.669099] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:16.858 [2024-07-15 22:45:01.669119] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:16.858 [2024-07-15 22:45:01.669185] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:16.858 [2024-07-15 
22:45:01.669204] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:16.858 pt2 00:16:16.858 22:45:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:17.117 [2024-07-15 22:45:01.917317] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:16:17.117 22:45:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:16:17.117 22:45:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:17.117 22:45:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:17.117 22:45:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:17.117 22:45:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:17.117 22:45:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:17.117 22:45:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:17.117 22:45:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:17.117 22:45:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:17.117 22:45:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:17.117 22:45:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:17.117 22:45:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:17.376 22:45:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:17.376 "name": "raid_bdev1", 00:16:17.376 
"uuid": "48218300-cc23-4787-8205-c62d9336497e", 00:16:17.376 "strip_size_kb": 64, 00:16:17.376 "state": "configuring", 00:16:17.376 "raid_level": "concat", 00:16:17.376 "superblock": true, 00:16:17.376 "num_base_bdevs": 3, 00:16:17.376 "num_base_bdevs_discovered": 1, 00:16:17.376 "num_base_bdevs_operational": 3, 00:16:17.376 "base_bdevs_list": [ 00:16:17.376 { 00:16:17.376 "name": "pt1", 00:16:17.377 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:17.377 "is_configured": true, 00:16:17.377 "data_offset": 2048, 00:16:17.377 "data_size": 63488 00:16:17.377 }, 00:16:17.377 { 00:16:17.377 "name": null, 00:16:17.377 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:17.377 "is_configured": false, 00:16:17.377 "data_offset": 2048, 00:16:17.377 "data_size": 63488 00:16:17.377 }, 00:16:17.377 { 00:16:17.377 "name": null, 00:16:17.377 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:17.377 "is_configured": false, 00:16:17.377 "data_offset": 2048, 00:16:17.377 "data_size": 63488 00:16:17.377 } 00:16:17.377 ] 00:16:17.377 }' 00:16:17.377 22:45:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:17.377 22:45:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:18.314 22:45:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:16:18.314 22:45:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:18.314 22:45:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:18.314 [2024-07-15 22:45:03.092427] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:18.314 [2024-07-15 22:45:03.092475] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:18.314 [2024-07-15 22:45:03.092497] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b9aa10 00:16:18.314 [2024-07-15 22:45:03.092509] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:18.314 [2024-07-15 22:45:03.092854] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:18.314 [2024-07-15 22:45:03.092871] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:18.314 [2024-07-15 22:45:03.092946] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:18.314 [2024-07-15 22:45:03.092965] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:18.314 pt2 00:16:18.314 22:45:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:18.314 22:45:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:18.314 22:45:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:18.572 [2024-07-15 22:45:03.421305] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:18.572 [2024-07-15 22:45:03.421344] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:18.572 [2024-07-15 22:45:03.421361] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d34740 00:16:18.572 [2024-07-15 22:45:03.421373] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:18.572 [2024-07-15 22:45:03.421697] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:18.572 [2024-07-15 22:45:03.421714] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:18.572 [2024-07-15 22:45:03.421770] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:16:18.572 
[2024-07-15 22:45:03.421789] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:18.572 [2024-07-15 22:45:03.421896] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d34c00 00:16:18.572 [2024-07-15 22:45:03.421906] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:18.572 [2024-07-15 22:45:03.422089] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b99a40 00:16:18.572 [2024-07-15 22:45:03.422213] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d34c00 00:16:18.572 [2024-07-15 22:45:03.422223] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d34c00 00:16:18.572 [2024-07-15 22:45:03.422320] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:18.572 pt3 00:16:18.572 22:45:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:18.572 22:45:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:18.572 22:45:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:18.572 22:45:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:18.572 22:45:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:18.572 22:45:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:18.572 22:45:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:18.572 22:45:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:18.572 22:45:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:18.572 22:45:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:18.572 
22:45:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:18.572 22:45:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:18.572 22:45:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:18.572 22:45:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:18.831 22:45:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:18.831 "name": "raid_bdev1", 00:16:18.831 "uuid": "48218300-cc23-4787-8205-c62d9336497e", 00:16:18.831 "strip_size_kb": 64, 00:16:18.831 "state": "online", 00:16:18.831 "raid_level": "concat", 00:16:18.831 "superblock": true, 00:16:18.831 "num_base_bdevs": 3, 00:16:18.831 "num_base_bdevs_discovered": 3, 00:16:18.831 "num_base_bdevs_operational": 3, 00:16:18.831 "base_bdevs_list": [ 00:16:18.831 { 00:16:18.831 "name": "pt1", 00:16:18.831 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:18.831 "is_configured": true, 00:16:18.831 "data_offset": 2048, 00:16:18.831 "data_size": 63488 00:16:18.831 }, 00:16:18.831 { 00:16:18.831 "name": "pt2", 00:16:18.831 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:18.831 "is_configured": true, 00:16:18.831 "data_offset": 2048, 00:16:18.831 "data_size": 63488 00:16:18.831 }, 00:16:18.831 { 00:16:18.831 "name": "pt3", 00:16:18.831 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:18.831 "is_configured": true, 00:16:18.831 "data_offset": 2048, 00:16:18.831 "data_size": 63488 00:16:18.831 } 00:16:18.831 ] 00:16:18.831 }' 00:16:18.831 22:45:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:18.831 22:45:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:19.415 22:45:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # 
verify_raid_bdev_properties raid_bdev1 00:16:19.415 22:45:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:19.415 22:45:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:19.415 22:45:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:19.673 22:45:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:19.673 22:45:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:19.673 22:45:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:19.673 22:45:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:19.673 [2024-07-15 22:45:04.556630] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:19.930 22:45:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:19.930 "name": "raid_bdev1", 00:16:19.930 "aliases": [ 00:16:19.930 "48218300-cc23-4787-8205-c62d9336497e" 00:16:19.930 ], 00:16:19.930 "product_name": "Raid Volume", 00:16:19.930 "block_size": 512, 00:16:19.930 "num_blocks": 190464, 00:16:19.930 "uuid": "48218300-cc23-4787-8205-c62d9336497e", 00:16:19.930 "assigned_rate_limits": { 00:16:19.930 "rw_ios_per_sec": 0, 00:16:19.930 "rw_mbytes_per_sec": 0, 00:16:19.930 "r_mbytes_per_sec": 0, 00:16:19.930 "w_mbytes_per_sec": 0 00:16:19.930 }, 00:16:19.930 "claimed": false, 00:16:19.930 "zoned": false, 00:16:19.930 "supported_io_types": { 00:16:19.930 "read": true, 00:16:19.930 "write": true, 00:16:19.930 "unmap": true, 00:16:19.930 "flush": true, 00:16:19.930 "reset": true, 00:16:19.930 "nvme_admin": false, 00:16:19.930 "nvme_io": false, 00:16:19.930 "nvme_io_md": false, 00:16:19.930 "write_zeroes": true, 00:16:19.930 "zcopy": false, 00:16:19.930 
"get_zone_info": false, 00:16:19.930 "zone_management": false, 00:16:19.930 "zone_append": false, 00:16:19.930 "compare": false, 00:16:19.930 "compare_and_write": false, 00:16:19.930 "abort": false, 00:16:19.930 "seek_hole": false, 00:16:19.930 "seek_data": false, 00:16:19.930 "copy": false, 00:16:19.930 "nvme_iov_md": false 00:16:19.930 }, 00:16:19.930 "memory_domains": [ 00:16:19.930 { 00:16:19.930 "dma_device_id": "system", 00:16:19.930 "dma_device_type": 1 00:16:19.930 }, 00:16:19.930 { 00:16:19.930 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:19.930 "dma_device_type": 2 00:16:19.930 }, 00:16:19.930 { 00:16:19.930 "dma_device_id": "system", 00:16:19.930 "dma_device_type": 1 00:16:19.930 }, 00:16:19.930 { 00:16:19.930 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:19.930 "dma_device_type": 2 00:16:19.930 }, 00:16:19.930 { 00:16:19.930 "dma_device_id": "system", 00:16:19.930 "dma_device_type": 1 00:16:19.930 }, 00:16:19.930 { 00:16:19.930 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:19.930 "dma_device_type": 2 00:16:19.930 } 00:16:19.930 ], 00:16:19.930 "driver_specific": { 00:16:19.930 "raid": { 00:16:19.930 "uuid": "48218300-cc23-4787-8205-c62d9336497e", 00:16:19.930 "strip_size_kb": 64, 00:16:19.930 "state": "online", 00:16:19.930 "raid_level": "concat", 00:16:19.930 "superblock": true, 00:16:19.930 "num_base_bdevs": 3, 00:16:19.930 "num_base_bdevs_discovered": 3, 00:16:19.930 "num_base_bdevs_operational": 3, 00:16:19.930 "base_bdevs_list": [ 00:16:19.931 { 00:16:19.931 "name": "pt1", 00:16:19.931 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:19.931 "is_configured": true, 00:16:19.931 "data_offset": 2048, 00:16:19.931 "data_size": 63488 00:16:19.931 }, 00:16:19.931 { 00:16:19.931 "name": "pt2", 00:16:19.931 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:19.931 "is_configured": true, 00:16:19.931 "data_offset": 2048, 00:16:19.931 "data_size": 63488 00:16:19.931 }, 00:16:19.931 { 00:16:19.931 "name": "pt3", 00:16:19.931 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:16:19.931 "is_configured": true, 00:16:19.931 "data_offset": 2048, 00:16:19.931 "data_size": 63488 00:16:19.931 } 00:16:19.931 ] 00:16:19.931 } 00:16:19.931 } 00:16:19.931 }' 00:16:19.931 22:45:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:19.931 22:45:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:19.931 pt2 00:16:19.931 pt3' 00:16:19.931 22:45:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:19.931 22:45:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:19.931 22:45:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:20.189 22:45:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:20.189 "name": "pt1", 00:16:20.189 "aliases": [ 00:16:20.189 "00000000-0000-0000-0000-000000000001" 00:16:20.189 ], 00:16:20.189 "product_name": "passthru", 00:16:20.189 "block_size": 512, 00:16:20.189 "num_blocks": 65536, 00:16:20.189 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:20.189 "assigned_rate_limits": { 00:16:20.189 "rw_ios_per_sec": 0, 00:16:20.189 "rw_mbytes_per_sec": 0, 00:16:20.189 "r_mbytes_per_sec": 0, 00:16:20.189 "w_mbytes_per_sec": 0 00:16:20.189 }, 00:16:20.189 "claimed": true, 00:16:20.189 "claim_type": "exclusive_write", 00:16:20.189 "zoned": false, 00:16:20.189 "supported_io_types": { 00:16:20.189 "read": true, 00:16:20.189 "write": true, 00:16:20.189 "unmap": true, 00:16:20.189 "flush": true, 00:16:20.189 "reset": true, 00:16:20.189 "nvme_admin": false, 00:16:20.189 "nvme_io": false, 00:16:20.189 "nvme_io_md": false, 00:16:20.189 "write_zeroes": true, 00:16:20.189 "zcopy": true, 00:16:20.189 "get_zone_info": false, 
00:16:20.189 "zone_management": false, 00:16:20.189 "zone_append": false, 00:16:20.189 "compare": false, 00:16:20.189 "compare_and_write": false, 00:16:20.189 "abort": true, 00:16:20.189 "seek_hole": false, 00:16:20.189 "seek_data": false, 00:16:20.189 "copy": true, 00:16:20.189 "nvme_iov_md": false 00:16:20.189 }, 00:16:20.189 "memory_domains": [ 00:16:20.189 { 00:16:20.189 "dma_device_id": "system", 00:16:20.189 "dma_device_type": 1 00:16:20.189 }, 00:16:20.189 { 00:16:20.189 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:20.189 "dma_device_type": 2 00:16:20.189 } 00:16:20.189 ], 00:16:20.189 "driver_specific": { 00:16:20.189 "passthru": { 00:16:20.189 "name": "pt1", 00:16:20.189 "base_bdev_name": "malloc1" 00:16:20.189 } 00:16:20.189 } 00:16:20.189 }' 00:16:20.189 22:45:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:20.189 22:45:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:20.189 22:45:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:20.189 22:45:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:20.189 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:20.189 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:20.189 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:20.189 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:20.447 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:20.447 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:20.447 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:20.447 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:20.447 22:45:05 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:20.447 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:20.447 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:20.705 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:20.705 "name": "pt2", 00:16:20.705 "aliases": [ 00:16:20.705 "00000000-0000-0000-0000-000000000002" 00:16:20.705 ], 00:16:20.705 "product_name": "passthru", 00:16:20.705 "block_size": 512, 00:16:20.705 "num_blocks": 65536, 00:16:20.705 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:20.705 "assigned_rate_limits": { 00:16:20.705 "rw_ios_per_sec": 0, 00:16:20.705 "rw_mbytes_per_sec": 0, 00:16:20.705 "r_mbytes_per_sec": 0, 00:16:20.705 "w_mbytes_per_sec": 0 00:16:20.705 }, 00:16:20.705 "claimed": true, 00:16:20.705 "claim_type": "exclusive_write", 00:16:20.705 "zoned": false, 00:16:20.705 "supported_io_types": { 00:16:20.705 "read": true, 00:16:20.705 "write": true, 00:16:20.705 "unmap": true, 00:16:20.705 "flush": true, 00:16:20.705 "reset": true, 00:16:20.705 "nvme_admin": false, 00:16:20.705 "nvme_io": false, 00:16:20.705 "nvme_io_md": false, 00:16:20.705 "write_zeroes": true, 00:16:20.705 "zcopy": true, 00:16:20.705 "get_zone_info": false, 00:16:20.705 "zone_management": false, 00:16:20.705 "zone_append": false, 00:16:20.705 "compare": false, 00:16:20.705 "compare_and_write": false, 00:16:20.705 "abort": true, 00:16:20.705 "seek_hole": false, 00:16:20.705 "seek_data": false, 00:16:20.705 "copy": true, 00:16:20.705 "nvme_iov_md": false 00:16:20.705 }, 00:16:20.705 "memory_domains": [ 00:16:20.705 { 00:16:20.705 "dma_device_id": "system", 00:16:20.705 "dma_device_type": 1 00:16:20.705 }, 00:16:20.705 { 00:16:20.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:20.705 
"dma_device_type": 2 00:16:20.705 } 00:16:20.705 ], 00:16:20.705 "driver_specific": { 00:16:20.705 "passthru": { 00:16:20.705 "name": "pt2", 00:16:20.705 "base_bdev_name": "malloc2" 00:16:20.705 } 00:16:20.705 } 00:16:20.705 }' 00:16:20.705 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:20.705 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:20.705 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:20.705 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:20.705 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:20.705 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:20.705 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:20.963 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:20.963 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:20.963 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:20.963 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:20.963 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:20.963 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:20.963 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:20.963 22:45:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:21.220 22:45:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:21.220 "name": "pt3", 00:16:21.220 "aliases": [ 00:16:21.220 
"00000000-0000-0000-0000-000000000003" 00:16:21.220 ], 00:16:21.220 "product_name": "passthru", 00:16:21.220 "block_size": 512, 00:16:21.220 "num_blocks": 65536, 00:16:21.220 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:21.220 "assigned_rate_limits": { 00:16:21.220 "rw_ios_per_sec": 0, 00:16:21.220 "rw_mbytes_per_sec": 0, 00:16:21.220 "r_mbytes_per_sec": 0, 00:16:21.220 "w_mbytes_per_sec": 0 00:16:21.220 }, 00:16:21.220 "claimed": true, 00:16:21.220 "claim_type": "exclusive_write", 00:16:21.220 "zoned": false, 00:16:21.220 "supported_io_types": { 00:16:21.220 "read": true, 00:16:21.220 "write": true, 00:16:21.221 "unmap": true, 00:16:21.221 "flush": true, 00:16:21.221 "reset": true, 00:16:21.221 "nvme_admin": false, 00:16:21.221 "nvme_io": false, 00:16:21.221 "nvme_io_md": false, 00:16:21.221 "write_zeroes": true, 00:16:21.221 "zcopy": true, 00:16:21.221 "get_zone_info": false, 00:16:21.221 "zone_management": false, 00:16:21.221 "zone_append": false, 00:16:21.221 "compare": false, 00:16:21.221 "compare_and_write": false, 00:16:21.221 "abort": true, 00:16:21.221 "seek_hole": false, 00:16:21.221 "seek_data": false, 00:16:21.221 "copy": true, 00:16:21.221 "nvme_iov_md": false 00:16:21.221 }, 00:16:21.221 "memory_domains": [ 00:16:21.221 { 00:16:21.221 "dma_device_id": "system", 00:16:21.221 "dma_device_type": 1 00:16:21.221 }, 00:16:21.221 { 00:16:21.221 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:21.221 "dma_device_type": 2 00:16:21.221 } 00:16:21.221 ], 00:16:21.221 "driver_specific": { 00:16:21.221 "passthru": { 00:16:21.221 "name": "pt3", 00:16:21.221 "base_bdev_name": "malloc3" 00:16:21.221 } 00:16:21.221 } 00:16:21.221 }' 00:16:21.221 22:45:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:21.221 22:45:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:21.221 22:45:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:21.221 22:45:06 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:21.479 22:45:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:21.479 22:45:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:21.479 22:45:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:21.479 22:45:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:21.479 22:45:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:21.479 22:45:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:21.479 22:45:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:21.479 22:45:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:21.479 22:45:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:21.479 22:45:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:16:21.736 [2024-07-15 22:45:06.602058] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:21.736 22:45:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 48218300-cc23-4787-8205-c62d9336497e '!=' 48218300-cc23-4787-8205-c62d9336497e ']' 00:16:21.736 22:45:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:16:21.736 22:45:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:21.737 22:45:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:21.737 22:45:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2741275 00:16:21.737 22:45:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2741275 ']' 00:16:21.737 22:45:06 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2741275 00:16:21.737 22:45:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:16:21.737 22:45:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:21.737 22:45:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2741275 00:16:21.994 22:45:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:21.994 22:45:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:21.994 22:45:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2741275' 00:16:21.994 killing process with pid 2741275 00:16:21.994 22:45:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2741275 00:16:21.994 [2024-07-15 22:45:06.671453] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:21.994 [2024-07-15 22:45:06.671510] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:21.994 [2024-07-15 22:45:06.671570] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:21.994 [2024-07-15 22:45:06.671582] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d34c00 name raid_bdev1, state offline 00:16:21.994 22:45:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2741275 00:16:21.994 [2024-07-15 22:45:06.698339] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:22.251 22:45:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:16:22.251 00:16:22.251 real 0m14.367s 00:16:22.251 user 0m25.482s 00:16:22.251 sys 0m2.662s 00:16:22.251 22:45:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:22.251 22:45:06 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:22.251 ************************************ 00:16:22.251 END TEST raid_superblock_test 00:16:22.251 ************************************ 00:16:22.251 22:45:06 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:22.251 22:45:06 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:16:22.251 22:45:06 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:22.251 22:45:06 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:22.251 22:45:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:22.251 ************************************ 00:16:22.251 START TEST raid_read_error_test 00:16:22.251 ************************************ 00:16:22.251 22:45:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 read 00:16:22.251 22:45:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:16:22.251 22:45:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:16:22.251 22:45:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:16:22.251 22:45:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:22.251 22:45:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:22.251 22:45:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:22.251 22:45:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:22.251 22:45:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:22.251 22:45:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:22.251 22:45:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:22.251 22:45:06 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:22.251 22:45:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:22.251 22:45:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:22.251 22:45:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:22.251 22:45:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:22.251 22:45:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:22.251 22:45:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:22.251 22:45:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:22.251 22:45:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:22.251 22:45:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:22.251 22:45:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:22.251 22:45:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:16:22.251 22:45:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:16:22.251 22:45:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:16:22.251 22:45:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:22.251 22:45:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.oJaBszWp88 00:16:22.251 22:45:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2743994 00:16:22.251 22:45:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2743994 /var/tmp/spdk-raid.sock 00:16:22.251 22:45:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:22.251 22:45:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2743994 ']' 00:16:22.251 22:45:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:22.251 22:45:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:22.251 22:45:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:22.251 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:22.251 22:45:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:22.251 22:45:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:22.251 [2024-07-15 22:45:07.067389] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:16:22.251 [2024-07-15 22:45:07.067454] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2743994 ] 00:16:22.509 [2024-07-15 22:45:07.197349] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:22.509 [2024-07-15 22:45:07.303303] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:22.509 [2024-07-15 22:45:07.370718] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:22.509 [2024-07-15 22:45:07.370763] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:23.451 22:45:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:23.451 22:45:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:16:23.451 22:45:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:23.451 22:45:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:23.754 BaseBdev1_malloc 00:16:23.754 22:45:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:24.051 true 00:16:24.051 22:45:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:24.051 [2024-07-15 22:45:08.954423] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:24.051 [2024-07-15 22:45:08.954466] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:16:24.051 [2024-07-15 22:45:08.954486] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15450d0 00:16:24.051 [2024-07-15 22:45:08.954498] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:24.051 [2024-07-15 22:45:08.956353] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:24.051 [2024-07-15 22:45:08.956382] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:24.051 BaseBdev1 00:16:24.309 22:45:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:24.309 22:45:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:24.309 BaseBdev2_malloc 00:16:24.666 22:45:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:24.666 true 00:16:24.666 22:45:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:24.925 [2024-07-15 22:45:09.694202] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:24.925 [2024-07-15 22:45:09.694246] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:24.925 [2024-07-15 22:45:09.694267] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1549910 00:16:24.925 [2024-07-15 22:45:09.694280] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:24.925 [2024-07-15 22:45:09.695856] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:24.925 [2024-07-15 22:45:09.695883] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:24.925 BaseBdev2 00:16:24.925 22:45:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:24.925 22:45:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:25.184 BaseBdev3_malloc 00:16:25.184 22:45:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:25.443 true 00:16:25.443 22:45:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:25.702 [2024-07-15 22:45:10.420662] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:25.702 [2024-07-15 22:45:10.420705] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:25.702 [2024-07-15 22:45:10.420725] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x154bbd0 00:16:25.702 [2024-07-15 22:45:10.420739] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:25.702 [2024-07-15 22:45:10.422307] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:25.702 [2024-07-15 22:45:10.422335] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:25.702 BaseBdev3 00:16:25.702 22:45:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:16:25.962 [2024-07-15 22:45:10.665350] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:25.962 [2024-07-15 22:45:10.666756] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:25.962 [2024-07-15 22:45:10.666828] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:25.962 [2024-07-15 22:45:10.667051] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x154d280 00:16:25.962 [2024-07-15 22:45:10.667063] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:25.962 [2024-07-15 22:45:10.667272] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x154ce20 00:16:25.962 [2024-07-15 22:45:10.667419] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x154d280 00:16:25.962 [2024-07-15 22:45:10.667429] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x154d280 00:16:25.962 [2024-07-15 22:45:10.667536] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:25.962 22:45:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:25.962 22:45:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:25.962 22:45:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:25.962 22:45:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:25.962 22:45:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:25.962 22:45:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:25.962 22:45:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:25.962 22:45:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:16:25.962 22:45:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:25.962 22:45:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:25.962 22:45:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:25.962 22:45:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:26.221 22:45:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:26.221 "name": "raid_bdev1", 00:16:26.221 "uuid": "0e0e4482-91d9-4a6a-97a5-613da6da3b53", 00:16:26.221 "strip_size_kb": 64, 00:16:26.221 "state": "online", 00:16:26.221 "raid_level": "concat", 00:16:26.221 "superblock": true, 00:16:26.221 "num_base_bdevs": 3, 00:16:26.221 "num_base_bdevs_discovered": 3, 00:16:26.221 "num_base_bdevs_operational": 3, 00:16:26.221 "base_bdevs_list": [ 00:16:26.221 { 00:16:26.221 "name": "BaseBdev1", 00:16:26.221 "uuid": "740d5e82-7445-5cee-8b10-45ccf71d65ef", 00:16:26.221 "is_configured": true, 00:16:26.221 "data_offset": 2048, 00:16:26.221 "data_size": 63488 00:16:26.221 }, 00:16:26.221 { 00:16:26.221 "name": "BaseBdev2", 00:16:26.221 "uuid": "f255eb10-07b5-589d-a0ec-4cc84c2b9015", 00:16:26.221 "is_configured": true, 00:16:26.221 "data_offset": 2048, 00:16:26.221 "data_size": 63488 00:16:26.221 }, 00:16:26.221 { 00:16:26.221 "name": "BaseBdev3", 00:16:26.221 "uuid": "463096f0-cdd4-5d73-9a9c-3c55df5a9d37", 00:16:26.221 "is_configured": true, 00:16:26.221 "data_offset": 2048, 00:16:26.221 "data_size": 63488 00:16:26.221 } 00:16:26.221 ] 00:16:26.221 }' 00:16:26.221 22:45:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:26.221 22:45:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:26.788 22:45:11 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:16:26.788 22:45:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:26.788 [2024-07-15 22:45:11.632193] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x139b4d0 00:16:27.758 22:45:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:16:28.016 22:45:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:28.016 22:45:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:16:28.016 22:45:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:16:28.016 22:45:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:28.016 22:45:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:28.016 22:45:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:28.016 22:45:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:28.016 22:45:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:28.016 22:45:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:28.016 22:45:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:28.016 22:45:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:28.016 22:45:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:28.016 22:45:12 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:16:28.016 22:45:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:28.016 22:45:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:28.275 22:45:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:28.275 "name": "raid_bdev1", 00:16:28.275 "uuid": "0e0e4482-91d9-4a6a-97a5-613da6da3b53", 00:16:28.275 "strip_size_kb": 64, 00:16:28.275 "state": "online", 00:16:28.275 "raid_level": "concat", 00:16:28.275 "superblock": true, 00:16:28.275 "num_base_bdevs": 3, 00:16:28.275 "num_base_bdevs_discovered": 3, 00:16:28.275 "num_base_bdevs_operational": 3, 00:16:28.275 "base_bdevs_list": [ 00:16:28.275 { 00:16:28.275 "name": "BaseBdev1", 00:16:28.275 "uuid": "740d5e82-7445-5cee-8b10-45ccf71d65ef", 00:16:28.275 "is_configured": true, 00:16:28.275 "data_offset": 2048, 00:16:28.275 "data_size": 63488 00:16:28.275 }, 00:16:28.275 { 00:16:28.275 "name": "BaseBdev2", 00:16:28.275 "uuid": "f255eb10-07b5-589d-a0ec-4cc84c2b9015", 00:16:28.275 "is_configured": true, 00:16:28.275 "data_offset": 2048, 00:16:28.275 "data_size": 63488 00:16:28.275 }, 00:16:28.275 { 00:16:28.275 "name": "BaseBdev3", 00:16:28.275 "uuid": "463096f0-cdd4-5d73-9a9c-3c55df5a9d37", 00:16:28.275 "is_configured": true, 00:16:28.275 "data_offset": 2048, 00:16:28.275 "data_size": 63488 00:16:28.275 } 00:16:28.275 ] 00:16:28.275 }' 00:16:28.275 22:45:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:28.275 22:45:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:28.842 22:45:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:29.100 [2024-07-15 
22:45:13.848953] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:29.100 [2024-07-15 22:45:13.848989] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:29.100 [2024-07-15 22:45:13.852162] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:29.100 [2024-07-15 22:45:13.852201] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:29.100 [2024-07-15 22:45:13.852235] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:29.100 [2024-07-15 22:45:13.852247] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x154d280 name raid_bdev1, state offline 00:16:29.100 0 00:16:29.100 22:45:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2743994 00:16:29.100 22:45:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2743994 ']' 00:16:29.100 22:45:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2743994 00:16:29.100 22:45:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:16:29.100 22:45:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:29.100 22:45:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2743994 00:16:29.100 22:45:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:29.100 22:45:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:29.100 22:45:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2743994' 00:16:29.100 killing process with pid 2743994 00:16:29.100 22:45:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2743994 00:16:29.100 [2024-07-15 22:45:13.921714] 
bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:29.100 22:45:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2743994 00:16:29.100 [2024-07-15 22:45:13.945991] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:29.359 22:45:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.oJaBszWp88 00:16:29.359 22:45:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:29.359 22:45:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:29.359 22:45:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:16:29.359 22:45:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:16:29.359 22:45:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:29.359 22:45:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:29.359 22:45:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:16:29.359 00:16:29.359 real 0m7.199s 00:16:29.359 user 0m11.497s 00:16:29.359 sys 0m1.223s 00:16:29.359 22:45:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:29.359 22:45:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:29.359 ************************************ 00:16:29.359 END TEST raid_read_error_test 00:16:29.359 ************************************ 00:16:29.359 22:45:14 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:29.359 22:45:14 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:16:29.359 22:45:14 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:29.359 22:45:14 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:29.359 22:45:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:29.618 
************************************ 00:16:29.618 START TEST raid_write_error_test 00:16:29.618 ************************************ 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 write 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.yVzKSvMoOH 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2744978 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2744978 /var/tmp/spdk-raid.sock 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2744978 ']' 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:16:29.618 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:29.618 22:45:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:29.618 [2024-07-15 22:45:14.361246] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:16:29.618 [2024-07-15 22:45:14.361321] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2744978 ] 00:16:29.618 [2024-07-15 22:45:14.492996] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:29.877 [2024-07-15 22:45:14.595992] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:29.877 [2024-07-15 22:45:14.658829] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:29.877 [2024-07-15 22:45:14.658860] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:30.445 22:45:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:30.445 22:45:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:16:30.445 22:45:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:30.445 22:45:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:30.704 BaseBdev1_malloc 00:16:30.704 22:45:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:30.963 
true 00:16:30.963 22:45:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:31.221 [2024-07-15 22:45:15.896933] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:31.221 [2024-07-15 22:45:15.896980] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:31.221 [2024-07-15 22:45:15.897000] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a000d0 00:16:31.221 [2024-07-15 22:45:15.897013] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:31.221 [2024-07-15 22:45:15.898703] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:31.221 [2024-07-15 22:45:15.898730] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:31.221 BaseBdev1 00:16:31.221 22:45:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:31.221 22:45:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:31.480 BaseBdev2_malloc 00:16:31.480 22:45:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:31.739 true 00:16:31.739 22:45:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:31.998 [2024-07-15 22:45:16.651554] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:31.998 [2024-07-15 22:45:16.651598] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:31.998 [2024-07-15 22:45:16.651617] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a04910 00:16:31.998 [2024-07-15 22:45:16.651629] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:31.998 [2024-07-15 22:45:16.653048] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:31.998 [2024-07-15 22:45:16.653075] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:31.998 BaseBdev2 00:16:31.998 22:45:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:31.998 22:45:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:32.256 BaseBdev3_malloc 00:16:32.256 22:45:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:32.256 true 00:16:32.529 22:45:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:32.530 [2024-07-15 22:45:17.402182] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:32.530 [2024-07-15 22:45:17.402228] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:32.530 [2024-07-15 22:45:17.402249] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a06bd0 00:16:32.530 [2024-07-15 22:45:17.402261] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:32.530 [2024-07-15 22:45:17.403752] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:16:32.530 [2024-07-15 22:45:17.403780] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:32.530 BaseBdev3 00:16:32.530 22:45:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:16:32.788 [2024-07-15 22:45:17.646852] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:32.788 [2024-07-15 22:45:17.648142] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:32.788 [2024-07-15 22:45:17.648210] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:32.788 [2024-07-15 22:45:17.648418] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a08280 00:16:32.788 [2024-07-15 22:45:17.648430] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:32.788 [2024-07-15 22:45:17.648626] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a07e20 00:16:32.788 [2024-07-15 22:45:17.648774] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a08280 00:16:32.788 [2024-07-15 22:45:17.648784] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a08280 00:16:32.788 [2024-07-15 22:45:17.648885] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:32.788 22:45:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:32.788 22:45:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:32.788 22:45:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:32.788 22:45:17 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:32.789 22:45:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:32.789 22:45:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:32.789 22:45:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:32.789 22:45:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:32.789 22:45:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:32.789 22:45:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:32.789 22:45:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:32.789 22:45:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:33.048 22:45:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:33.048 "name": "raid_bdev1", 00:16:33.048 "uuid": "f0a4cec6-1a1f-4cfa-827f-436a5ee4a671", 00:16:33.048 "strip_size_kb": 64, 00:16:33.048 "state": "online", 00:16:33.048 "raid_level": "concat", 00:16:33.048 "superblock": true, 00:16:33.048 "num_base_bdevs": 3, 00:16:33.048 "num_base_bdevs_discovered": 3, 00:16:33.048 "num_base_bdevs_operational": 3, 00:16:33.048 "base_bdevs_list": [ 00:16:33.048 { 00:16:33.048 "name": "BaseBdev1", 00:16:33.048 "uuid": "5a3e9607-6b22-5a6e-8dea-c01dd18b4705", 00:16:33.048 "is_configured": true, 00:16:33.048 "data_offset": 2048, 00:16:33.048 "data_size": 63488 00:16:33.048 }, 00:16:33.048 { 00:16:33.048 "name": "BaseBdev2", 00:16:33.048 "uuid": "ea7dd187-e4cf-5540-a571-51d79c760998", 00:16:33.048 "is_configured": true, 00:16:33.048 "data_offset": 2048, 00:16:33.048 "data_size": 63488 00:16:33.048 }, 00:16:33.048 { 00:16:33.048 
"name": "BaseBdev3", 00:16:33.048 "uuid": "3e8d1bbe-425d-50db-af6a-59ff9a584a88", 00:16:33.048 "is_configured": true, 00:16:33.048 "data_offset": 2048, 00:16:33.048 "data_size": 63488 00:16:33.048 } 00:16:33.048 ] 00:16:33.048 }' 00:16:33.048 22:45:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:33.048 22:45:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:33.615 22:45:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:16:33.615 22:45:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:33.872 [2024-07-15 22:45:18.565586] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18564d0 00:16:34.806 22:45:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:16:34.806 22:45:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:34.806 22:45:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:16:34.806 22:45:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:16:34.806 22:45:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:34.806 22:45:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:34.806 22:45:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:34.806 22:45:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:34.806 22:45:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 
00:16:34.806 22:45:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:34.806 22:45:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:34.806 22:45:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:34.806 22:45:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:34.806 22:45:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:35.064 22:45:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:35.064 22:45:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:35.064 22:45:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:35.064 "name": "raid_bdev1", 00:16:35.064 "uuid": "f0a4cec6-1a1f-4cfa-827f-436a5ee4a671", 00:16:35.064 "strip_size_kb": 64, 00:16:35.064 "state": "online", 00:16:35.064 "raid_level": "concat", 00:16:35.064 "superblock": true, 00:16:35.064 "num_base_bdevs": 3, 00:16:35.064 "num_base_bdevs_discovered": 3, 00:16:35.064 "num_base_bdevs_operational": 3, 00:16:35.064 "base_bdevs_list": [ 00:16:35.064 { 00:16:35.064 "name": "BaseBdev1", 00:16:35.064 "uuid": "5a3e9607-6b22-5a6e-8dea-c01dd18b4705", 00:16:35.064 "is_configured": true, 00:16:35.064 "data_offset": 2048, 00:16:35.064 "data_size": 63488 00:16:35.064 }, 00:16:35.064 { 00:16:35.064 "name": "BaseBdev2", 00:16:35.064 "uuid": "ea7dd187-e4cf-5540-a571-51d79c760998", 00:16:35.064 "is_configured": true, 00:16:35.064 "data_offset": 2048, 00:16:35.064 "data_size": 63488 00:16:35.064 }, 00:16:35.064 { 00:16:35.064 "name": "BaseBdev3", 00:16:35.064 "uuid": "3e8d1bbe-425d-50db-af6a-59ff9a584a88", 00:16:35.064 "is_configured": true, 00:16:35.064 "data_offset": 2048, 
00:16:35.064 "data_size": 63488 00:16:35.064 } 00:16:35.064 ] 00:16:35.064 }' 00:16:35.064 22:45:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:35.064 22:45:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:36.000 22:45:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:36.000 [2024-07-15 22:45:20.738739] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:36.000 [2024-07-15 22:45:20.738783] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:36.000 [2024-07-15 22:45:20.741961] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:36.000 [2024-07-15 22:45:20.741999] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:36.000 [2024-07-15 22:45:20.742035] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:36.000 [2024-07-15 22:45:20.742046] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a08280 name raid_bdev1, state offline 00:16:36.000 0 00:16:36.000 22:45:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2744978 00:16:36.000 22:45:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2744978 ']' 00:16:36.000 22:45:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2744978 00:16:36.000 22:45:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:16:36.000 22:45:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:36.000 22:45:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2744978 00:16:36.000 22:45:20 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:36.000 22:45:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:36.000 22:45:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2744978' 00:16:36.000 killing process with pid 2744978 00:16:36.000 22:45:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2744978 00:16:36.000 [2024-07-15 22:45:20.811813] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:36.000 22:45:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2744978 00:16:36.000 [2024-07-15 22:45:20.831776] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:36.260 22:45:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.yVzKSvMoOH 00:16:36.260 22:45:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:36.260 22:45:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:36.260 22:45:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:16:36.260 22:45:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:16:36.260 22:45:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:36.260 22:45:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:36.260 22:45:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:16:36.260 00:16:36.260 real 0m6.781s 00:16:36.260 user 0m10.655s 00:16:36.260 sys 0m1.243s 00:16:36.260 22:45:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:36.260 22:45:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:36.260 ************************************ 00:16:36.260 END TEST raid_write_error_test 
00:16:36.260 ************************************ 00:16:36.260 22:45:21 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:36.260 22:45:21 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:16:36.260 22:45:21 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:16:36.260 22:45:21 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:36.260 22:45:21 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:36.260 22:45:21 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:36.260 ************************************ 00:16:36.260 START TEST raid_state_function_test 00:16:36.260 ************************************ 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 false 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:36.260 22:45:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2745954 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2745954' 00:16:36.260 Process raid pid: 2745954 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2745954 /var/tmp/spdk-raid.sock 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2745954 ']' 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:36.260 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:36.260 22:45:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:36.518 [2024-07-15 22:45:21.217692] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
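The `(( i = 1 ))` / `(( i <= num_base_bdevs ))` / `echo BaseBdev$i` trace near the top of this run is `bdev_raid.sh` building its list of base bdev names before creating the raid. A minimal stand-alone sketch of that construction (plain bash, no SPDK needed):

```shell
#!/usr/bin/env bash
# Rebuild the base_bdevs array the same way the traced loop in
# bdev_raid.sh does: BaseBdev1 .. BaseBdev<num_base_bdevs>.
num_base_bdevs=3
base_bdevs=()
for ((i = 1; i <= num_base_bdevs; i++)); do
  base_bdevs+=("BaseBdev$i")
done
echo "${base_bdevs[*]}"   # → BaseBdev1 BaseBdev2 BaseBdev3
```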
00:16:36.518 [2024-07-15 22:45:21.217747] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:36.518 [2024-07-15 22:45:21.322803] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:36.518 [2024-07-15 22:45:21.427284] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:36.776 [2024-07-15 22:45:21.497129] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:36.776 [2024-07-15 22:45:21.497166] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:36.776 22:45:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:36.776 22:45:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:16:36.776 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:37.035 [2024-07-15 22:45:21.784030] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:37.035 [2024-07-15 22:45:21.784075] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:37.035 [2024-07-15 22:45:21.784086] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:37.035 [2024-07-15 22:45:21.784098] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:37.035 [2024-07-15 22:45:21.784111] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:37.035 [2024-07-15 22:45:21.784122] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:37.035 22:45:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:37.035 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:37.035 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:37.035 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:37.035 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:37.035 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:37.035 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:37.035 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:37.035 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:37.035 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:37.035 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:37.035 22:45:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:37.293 22:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:37.293 "name": "Existed_Raid", 00:16:37.293 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:37.293 "strip_size_kb": 0, 00:16:37.293 "state": "configuring", 00:16:37.293 "raid_level": "raid1", 00:16:37.293 "superblock": false, 00:16:37.293 "num_base_bdevs": 3, 00:16:37.293 "num_base_bdevs_discovered": 0, 00:16:37.293 "num_base_bdevs_operational": 3, 00:16:37.293 "base_bdevs_list": [ 00:16:37.293 { 00:16:37.293 
"name": "BaseBdev1", 00:16:37.293 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:37.293 "is_configured": false, 00:16:37.293 "data_offset": 0, 00:16:37.293 "data_size": 0 00:16:37.293 }, 00:16:37.293 { 00:16:37.293 "name": "BaseBdev2", 00:16:37.293 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:37.293 "is_configured": false, 00:16:37.293 "data_offset": 0, 00:16:37.293 "data_size": 0 00:16:37.293 }, 00:16:37.293 { 00:16:37.293 "name": "BaseBdev3", 00:16:37.293 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:37.293 "is_configured": false, 00:16:37.293 "data_offset": 0, 00:16:37.293 "data_size": 0 00:16:37.293 } 00:16:37.293 ] 00:16:37.293 }' 00:16:37.293 22:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:37.293 22:45:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:37.861 22:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:38.119 [2024-07-15 22:45:22.866757] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:38.119 [2024-07-15 22:45:22.866797] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15daa80 name Existed_Raid, state configuring 00:16:38.119 22:45:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:38.378 [2024-07-15 22:45:23.111416] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:38.378 [2024-07-15 22:45:23.111457] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:38.378 [2024-07-15 22:45:23.111467] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 
00:16:38.378 [2024-07-15 22:45:23.111479] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:38.378 [2024-07-15 22:45:23.111487] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:38.378 [2024-07-15 22:45:23.111499] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:38.378 22:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:38.636 [2024-07-15 22:45:23.367258] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:38.636 BaseBdev1 00:16:38.636 22:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:38.636 22:45:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:38.636 22:45:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:38.636 22:45:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:38.636 22:45:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:38.636 22:45:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:38.636 22:45:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:38.894 22:45:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:39.175 [ 00:16:39.175 { 00:16:39.175 "name": "BaseBdev1", 00:16:39.175 "aliases": [ 00:16:39.175 "616c2bf3-3e63-4fb3-bbad-290e57d008f2" 
00:16:39.175 ], 00:16:39.175 "product_name": "Malloc disk", 00:16:39.175 "block_size": 512, 00:16:39.175 "num_blocks": 65536, 00:16:39.175 "uuid": "616c2bf3-3e63-4fb3-bbad-290e57d008f2", 00:16:39.175 "assigned_rate_limits": { 00:16:39.175 "rw_ios_per_sec": 0, 00:16:39.175 "rw_mbytes_per_sec": 0, 00:16:39.175 "r_mbytes_per_sec": 0, 00:16:39.175 "w_mbytes_per_sec": 0 00:16:39.175 }, 00:16:39.175 "claimed": true, 00:16:39.175 "claim_type": "exclusive_write", 00:16:39.175 "zoned": false, 00:16:39.175 "supported_io_types": { 00:16:39.175 "read": true, 00:16:39.175 "write": true, 00:16:39.175 "unmap": true, 00:16:39.175 "flush": true, 00:16:39.175 "reset": true, 00:16:39.175 "nvme_admin": false, 00:16:39.175 "nvme_io": false, 00:16:39.175 "nvme_io_md": false, 00:16:39.175 "write_zeroes": true, 00:16:39.175 "zcopy": true, 00:16:39.175 "get_zone_info": false, 00:16:39.175 "zone_management": false, 00:16:39.175 "zone_append": false, 00:16:39.175 "compare": false, 00:16:39.175 "compare_and_write": false, 00:16:39.175 "abort": true, 00:16:39.175 "seek_hole": false, 00:16:39.175 "seek_data": false, 00:16:39.175 "copy": true, 00:16:39.175 "nvme_iov_md": false 00:16:39.175 }, 00:16:39.175 "memory_domains": [ 00:16:39.175 { 00:16:39.175 "dma_device_id": "system", 00:16:39.175 "dma_device_type": 1 00:16:39.175 }, 00:16:39.175 { 00:16:39.175 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.175 "dma_device_type": 2 00:16:39.175 } 00:16:39.175 ], 00:16:39.175 "driver_specific": {} 00:16:39.175 } 00:16:39.175 ] 00:16:39.175 22:45:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:39.175 22:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:39.175 22:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:39.175 22:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:16:39.175 22:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:39.175 22:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:39.175 22:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:39.175 22:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:39.175 22:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:39.175 22:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:39.175 22:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:39.175 22:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:39.175 22:45:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:39.434 22:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:39.434 "name": "Existed_Raid", 00:16:39.434 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:39.434 "strip_size_kb": 0, 00:16:39.434 "state": "configuring", 00:16:39.434 "raid_level": "raid1", 00:16:39.434 "superblock": false, 00:16:39.434 "num_base_bdevs": 3, 00:16:39.434 "num_base_bdevs_discovered": 1, 00:16:39.434 "num_base_bdevs_operational": 3, 00:16:39.434 "base_bdevs_list": [ 00:16:39.434 { 00:16:39.434 "name": "BaseBdev1", 00:16:39.434 "uuid": "616c2bf3-3e63-4fb3-bbad-290e57d008f2", 00:16:39.434 "is_configured": true, 00:16:39.434 "data_offset": 0, 00:16:39.434 "data_size": 65536 00:16:39.434 }, 00:16:39.434 { 00:16:39.434 "name": "BaseBdev2", 00:16:39.434 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:39.434 "is_configured": 
false, 00:16:39.434 "data_offset": 0, 00:16:39.434 "data_size": 0 00:16:39.434 }, 00:16:39.434 { 00:16:39.434 "name": "BaseBdev3", 00:16:39.434 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:39.434 "is_configured": false, 00:16:39.434 "data_offset": 0, 00:16:39.434 "data_size": 0 00:16:39.434 } 00:16:39.434 ] 00:16:39.434 }' 00:16:39.434 22:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:39.434 22:45:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:39.998 22:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:40.256 [2024-07-15 22:45:24.927297] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:40.256 [2024-07-15 22:45:24.927345] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15da310 name Existed_Raid, state configuring 00:16:40.256 22:45:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:40.513 [2024-07-15 22:45:25.175987] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:40.513 [2024-07-15 22:45:25.177434] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:40.513 [2024-07-15 22:45:25.177481] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:40.513 [2024-07-15 22:45:25.177492] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:40.513 [2024-07-15 22:45:25.177504] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:40.513 22:45:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 
-- # (( i = 1 )) 00:16:40.513 22:45:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:40.513 22:45:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:40.513 22:45:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:40.513 22:45:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:40.513 22:45:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:40.513 22:45:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:40.513 22:45:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:40.513 22:45:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:40.513 22:45:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:40.513 22:45:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:40.513 22:45:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:40.513 22:45:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.513 22:45:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:40.770 22:45:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:40.770 "name": "Existed_Raid", 00:16:40.770 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:40.770 "strip_size_kb": 0, 00:16:40.770 "state": "configuring", 00:16:40.770 "raid_level": "raid1", 00:16:40.770 "superblock": false, 00:16:40.770 "num_base_bdevs": 3, 
00:16:40.770 "num_base_bdevs_discovered": 1, 00:16:40.770 "num_base_bdevs_operational": 3, 00:16:40.770 "base_bdevs_list": [ 00:16:40.770 { 00:16:40.770 "name": "BaseBdev1", 00:16:40.770 "uuid": "616c2bf3-3e63-4fb3-bbad-290e57d008f2", 00:16:40.770 "is_configured": true, 00:16:40.770 "data_offset": 0, 00:16:40.770 "data_size": 65536 00:16:40.770 }, 00:16:40.770 { 00:16:40.770 "name": "BaseBdev2", 00:16:40.771 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:40.771 "is_configured": false, 00:16:40.771 "data_offset": 0, 00:16:40.771 "data_size": 0 00:16:40.771 }, 00:16:40.771 { 00:16:40.771 "name": "BaseBdev3", 00:16:40.771 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:40.771 "is_configured": false, 00:16:40.771 "data_offset": 0, 00:16:40.771 "data_size": 0 00:16:40.771 } 00:16:40.771 ] 00:16:40.771 }' 00:16:40.771 22:45:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:40.771 22:45:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:41.336 22:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:41.594 [2024-07-15 22:45:26.290606] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:41.594 BaseBdev2 00:16:41.594 22:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:41.594 22:45:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:41.594 22:45:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:41.594 22:45:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:41.594 22:45:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:41.594 22:45:26 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:41.594 22:45:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:41.851 22:45:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:42.108 [ 00:16:42.108 { 00:16:42.108 "name": "BaseBdev2", 00:16:42.108 "aliases": [ 00:16:42.108 "6feabcb7-47cd-4d2e-af00-5f50af2e15ed" 00:16:42.108 ], 00:16:42.108 "product_name": "Malloc disk", 00:16:42.108 "block_size": 512, 00:16:42.108 "num_blocks": 65536, 00:16:42.108 "uuid": "6feabcb7-47cd-4d2e-af00-5f50af2e15ed", 00:16:42.108 "assigned_rate_limits": { 00:16:42.108 "rw_ios_per_sec": 0, 00:16:42.108 "rw_mbytes_per_sec": 0, 00:16:42.108 "r_mbytes_per_sec": 0, 00:16:42.108 "w_mbytes_per_sec": 0 00:16:42.108 }, 00:16:42.108 "claimed": true, 00:16:42.108 "claim_type": "exclusive_write", 00:16:42.108 "zoned": false, 00:16:42.108 "supported_io_types": { 00:16:42.108 "read": true, 00:16:42.108 "write": true, 00:16:42.108 "unmap": true, 00:16:42.108 "flush": true, 00:16:42.108 "reset": true, 00:16:42.108 "nvme_admin": false, 00:16:42.108 "nvme_io": false, 00:16:42.108 "nvme_io_md": false, 00:16:42.108 "write_zeroes": true, 00:16:42.108 "zcopy": true, 00:16:42.108 "get_zone_info": false, 00:16:42.108 "zone_management": false, 00:16:42.108 "zone_append": false, 00:16:42.108 "compare": false, 00:16:42.108 "compare_and_write": false, 00:16:42.108 "abort": true, 00:16:42.108 "seek_hole": false, 00:16:42.108 "seek_data": false, 00:16:42.108 "copy": true, 00:16:42.108 "nvme_iov_md": false 00:16:42.108 }, 00:16:42.108 "memory_domains": [ 00:16:42.108 { 00:16:42.108 "dma_device_id": "system", 00:16:42.108 "dma_device_type": 1 00:16:42.108 }, 00:16:42.108 { 
00:16:42.108 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:42.108 "dma_device_type": 2 00:16:42.108 } 00:16:42.108 ], 00:16:42.108 "driver_specific": {} 00:16:42.108 } 00:16:42.108 ] 00:16:42.108 22:45:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:42.108 22:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:42.108 22:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:42.108 22:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:42.108 22:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:42.108 22:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:42.108 22:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:42.108 22:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:42.108 22:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:42.108 22:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:42.108 22:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:42.108 22:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:42.108 22:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:42.108 22:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:42.108 22:45:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:16:42.366 22:45:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:42.366 "name": "Existed_Raid", 00:16:42.366 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:42.366 "strip_size_kb": 0, 00:16:42.366 "state": "configuring", 00:16:42.366 "raid_level": "raid1", 00:16:42.366 "superblock": false, 00:16:42.366 "num_base_bdevs": 3, 00:16:42.366 "num_base_bdevs_discovered": 2, 00:16:42.366 "num_base_bdevs_operational": 3, 00:16:42.366 "base_bdevs_list": [ 00:16:42.366 { 00:16:42.366 "name": "BaseBdev1", 00:16:42.366 "uuid": "616c2bf3-3e63-4fb3-bbad-290e57d008f2", 00:16:42.366 "is_configured": true, 00:16:42.366 "data_offset": 0, 00:16:42.366 "data_size": 65536 00:16:42.366 }, 00:16:42.366 { 00:16:42.366 "name": "BaseBdev2", 00:16:42.366 "uuid": "6feabcb7-47cd-4d2e-af00-5f50af2e15ed", 00:16:42.366 "is_configured": true, 00:16:42.366 "data_offset": 0, 00:16:42.366 "data_size": 65536 00:16:42.366 }, 00:16:42.366 { 00:16:42.366 "name": "BaseBdev3", 00:16:42.366 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:42.366 "is_configured": false, 00:16:42.366 "data_offset": 0, 00:16:42.366 "data_size": 0 00:16:42.366 } 00:16:42.366 ] 00:16:42.366 }' 00:16:42.366 22:45:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:42.366 22:45:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:42.933 22:45:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:42.933 [2024-07-15 22:45:27.810059] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:42.933 [2024-07-15 22:45:27.810104] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15db400 00:16:42.933 [2024-07-15 22:45:27.810113] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 
00:16:42.933 [2024-07-15 22:45:27.810369] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15daef0 00:16:42.933 [2024-07-15 22:45:27.810495] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15db400 00:16:42.933 [2024-07-15 22:45:27.810505] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x15db400 00:16:42.933 [2024-07-15 22:45:27.810672] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:42.933 BaseBdev3 00:16:42.933 22:45:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:42.933 22:45:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:42.933 22:45:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:42.933 22:45:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:42.933 22:45:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:42.934 22:45:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:42.934 22:45:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:43.191 22:45:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:43.450 [ 00:16:43.450 { 00:16:43.450 "name": "BaseBdev3", 00:16:43.450 "aliases": [ 00:16:43.450 "120bc73d-671b-4ce6-b10e-645e8207802e" 00:16:43.450 ], 00:16:43.450 "product_name": "Malloc disk", 00:16:43.450 "block_size": 512, 00:16:43.450 "num_blocks": 65536, 00:16:43.450 "uuid": "120bc73d-671b-4ce6-b10e-645e8207802e", 00:16:43.450 "assigned_rate_limits": { 
00:16:43.450 "rw_ios_per_sec": 0, 00:16:43.450 "rw_mbytes_per_sec": 0, 00:16:43.450 "r_mbytes_per_sec": 0, 00:16:43.450 "w_mbytes_per_sec": 0 00:16:43.450 }, 00:16:43.450 "claimed": true, 00:16:43.450 "claim_type": "exclusive_write", 00:16:43.450 "zoned": false, 00:16:43.450 "supported_io_types": { 00:16:43.450 "read": true, 00:16:43.450 "write": true, 00:16:43.450 "unmap": true, 00:16:43.450 "flush": true, 00:16:43.450 "reset": true, 00:16:43.450 "nvme_admin": false, 00:16:43.450 "nvme_io": false, 00:16:43.450 "nvme_io_md": false, 00:16:43.450 "write_zeroes": true, 00:16:43.450 "zcopy": true, 00:16:43.450 "get_zone_info": false, 00:16:43.450 "zone_management": false, 00:16:43.450 "zone_append": false, 00:16:43.450 "compare": false, 00:16:43.450 "compare_and_write": false, 00:16:43.450 "abort": true, 00:16:43.450 "seek_hole": false, 00:16:43.450 "seek_data": false, 00:16:43.450 "copy": true, 00:16:43.450 "nvme_iov_md": false 00:16:43.450 }, 00:16:43.450 "memory_domains": [ 00:16:43.450 { 00:16:43.450 "dma_device_id": "system", 00:16:43.450 "dma_device_type": 1 00:16:43.450 }, 00:16:43.450 { 00:16:43.450 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:43.450 "dma_device_type": 2 00:16:43.450 } 00:16:43.450 ], 00:16:43.450 "driver_specific": {} 00:16:43.450 } 00:16:43.450 ] 00:16:43.450 22:45:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:43.450 22:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:43.450 22:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:43.450 22:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:43.450 22:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:43.450 22:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:43.450 
22:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:43.450 22:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:43.450 22:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:43.450 22:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:43.450 22:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:43.450 22:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:43.450 22:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:43.450 22:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.450 22:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:43.708 22:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:43.708 "name": "Existed_Raid", 00:16:43.708 "uuid": "15137a4d-70e7-4f91-a4ae-1349079f430b", 00:16:43.708 "strip_size_kb": 0, 00:16:43.708 "state": "online", 00:16:43.708 "raid_level": "raid1", 00:16:43.708 "superblock": false, 00:16:43.708 "num_base_bdevs": 3, 00:16:43.708 "num_base_bdevs_discovered": 3, 00:16:43.708 "num_base_bdevs_operational": 3, 00:16:43.708 "base_bdevs_list": [ 00:16:43.708 { 00:16:43.708 "name": "BaseBdev1", 00:16:43.708 "uuid": "616c2bf3-3e63-4fb3-bbad-290e57d008f2", 00:16:43.708 "is_configured": true, 00:16:43.708 "data_offset": 0, 00:16:43.708 "data_size": 65536 00:16:43.708 }, 00:16:43.708 { 00:16:43.708 "name": "BaseBdev2", 00:16:43.708 "uuid": "6feabcb7-47cd-4d2e-af00-5f50af2e15ed", 00:16:43.708 "is_configured": true, 00:16:43.708 "data_offset": 0, 
00:16:43.708 "data_size": 65536 00:16:43.708 }, 00:16:43.708 { 00:16:43.708 "name": "BaseBdev3", 00:16:43.708 "uuid": "120bc73d-671b-4ce6-b10e-645e8207802e", 00:16:43.708 "is_configured": true, 00:16:43.708 "data_offset": 0, 00:16:43.708 "data_size": 65536 00:16:43.708 } 00:16:43.708 ] 00:16:43.708 }' 00:16:43.708 22:45:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:43.708 22:45:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:44.646 22:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:44.646 22:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:44.646 22:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:44.646 22:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:44.646 22:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:44.646 22:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:44.646 22:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:44.646 22:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:44.646 [2024-07-15 22:45:29.354474] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:44.646 22:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:44.646 "name": "Existed_Raid", 00:16:44.646 "aliases": [ 00:16:44.646 "15137a4d-70e7-4f91-a4ae-1349079f430b" 00:16:44.646 ], 00:16:44.646 "product_name": "Raid Volume", 00:16:44.646 "block_size": 512, 00:16:44.646 "num_blocks": 65536, 00:16:44.646 "uuid": 
"15137a4d-70e7-4f91-a4ae-1349079f430b", 00:16:44.646 "assigned_rate_limits": { 00:16:44.646 "rw_ios_per_sec": 0, 00:16:44.646 "rw_mbytes_per_sec": 0, 00:16:44.646 "r_mbytes_per_sec": 0, 00:16:44.646 "w_mbytes_per_sec": 0 00:16:44.646 }, 00:16:44.646 "claimed": false, 00:16:44.646 "zoned": false, 00:16:44.646 "supported_io_types": { 00:16:44.646 "read": true, 00:16:44.646 "write": true, 00:16:44.646 "unmap": false, 00:16:44.646 "flush": false, 00:16:44.646 "reset": true, 00:16:44.646 "nvme_admin": false, 00:16:44.646 "nvme_io": false, 00:16:44.646 "nvme_io_md": false, 00:16:44.646 "write_zeroes": true, 00:16:44.646 "zcopy": false, 00:16:44.646 "get_zone_info": false, 00:16:44.646 "zone_management": false, 00:16:44.646 "zone_append": false, 00:16:44.646 "compare": false, 00:16:44.646 "compare_and_write": false, 00:16:44.646 "abort": false, 00:16:44.646 "seek_hole": false, 00:16:44.646 "seek_data": false, 00:16:44.646 "copy": false, 00:16:44.646 "nvme_iov_md": false 00:16:44.646 }, 00:16:44.646 "memory_domains": [ 00:16:44.646 { 00:16:44.646 "dma_device_id": "system", 00:16:44.646 "dma_device_type": 1 00:16:44.646 }, 00:16:44.646 { 00:16:44.646 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:44.646 "dma_device_type": 2 00:16:44.646 }, 00:16:44.646 { 00:16:44.646 "dma_device_id": "system", 00:16:44.646 "dma_device_type": 1 00:16:44.646 }, 00:16:44.646 { 00:16:44.646 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:44.646 "dma_device_type": 2 00:16:44.646 }, 00:16:44.646 { 00:16:44.646 "dma_device_id": "system", 00:16:44.646 "dma_device_type": 1 00:16:44.646 }, 00:16:44.646 { 00:16:44.646 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:44.646 "dma_device_type": 2 00:16:44.646 } 00:16:44.646 ], 00:16:44.646 "driver_specific": { 00:16:44.646 "raid": { 00:16:44.646 "uuid": "15137a4d-70e7-4f91-a4ae-1349079f430b", 00:16:44.646 "strip_size_kb": 0, 00:16:44.646 "state": "online", 00:16:44.646 "raid_level": "raid1", 00:16:44.646 "superblock": false, 00:16:44.646 
"num_base_bdevs": 3, 00:16:44.646 "num_base_bdevs_discovered": 3, 00:16:44.646 "num_base_bdevs_operational": 3, 00:16:44.646 "base_bdevs_list": [ 00:16:44.646 { 00:16:44.646 "name": "BaseBdev1", 00:16:44.646 "uuid": "616c2bf3-3e63-4fb3-bbad-290e57d008f2", 00:16:44.646 "is_configured": true, 00:16:44.646 "data_offset": 0, 00:16:44.646 "data_size": 65536 00:16:44.646 }, 00:16:44.646 { 00:16:44.646 "name": "BaseBdev2", 00:16:44.646 "uuid": "6feabcb7-47cd-4d2e-af00-5f50af2e15ed", 00:16:44.646 "is_configured": true, 00:16:44.646 "data_offset": 0, 00:16:44.646 "data_size": 65536 00:16:44.646 }, 00:16:44.646 { 00:16:44.646 "name": "BaseBdev3", 00:16:44.646 "uuid": "120bc73d-671b-4ce6-b10e-645e8207802e", 00:16:44.646 "is_configured": true, 00:16:44.646 "data_offset": 0, 00:16:44.646 "data_size": 65536 00:16:44.646 } 00:16:44.646 ] 00:16:44.646 } 00:16:44.646 } 00:16:44.646 }' 00:16:44.646 22:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:44.646 22:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:44.646 BaseBdev2 00:16:44.646 BaseBdev3' 00:16:44.646 22:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:44.646 22:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:44.646 22:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:44.906 22:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:44.906 "name": "BaseBdev1", 00:16:44.906 "aliases": [ 00:16:44.907 "616c2bf3-3e63-4fb3-bbad-290e57d008f2" 00:16:44.907 ], 00:16:44.907 "product_name": "Malloc disk", 00:16:44.907 "block_size": 512, 00:16:44.907 "num_blocks": 65536, 00:16:44.907 "uuid": 
"616c2bf3-3e63-4fb3-bbad-290e57d008f2", 00:16:44.907 "assigned_rate_limits": { 00:16:44.907 "rw_ios_per_sec": 0, 00:16:44.907 "rw_mbytes_per_sec": 0, 00:16:44.907 "r_mbytes_per_sec": 0, 00:16:44.907 "w_mbytes_per_sec": 0 00:16:44.907 }, 00:16:44.907 "claimed": true, 00:16:44.907 "claim_type": "exclusive_write", 00:16:44.907 "zoned": false, 00:16:44.907 "supported_io_types": { 00:16:44.907 "read": true, 00:16:44.907 "write": true, 00:16:44.907 "unmap": true, 00:16:44.907 "flush": true, 00:16:44.907 "reset": true, 00:16:44.907 "nvme_admin": false, 00:16:44.907 "nvme_io": false, 00:16:44.907 "nvme_io_md": false, 00:16:44.907 "write_zeroes": true, 00:16:44.907 "zcopy": true, 00:16:44.907 "get_zone_info": false, 00:16:44.907 "zone_management": false, 00:16:44.907 "zone_append": false, 00:16:44.907 "compare": false, 00:16:44.907 "compare_and_write": false, 00:16:44.907 "abort": true, 00:16:44.907 "seek_hole": false, 00:16:44.907 "seek_data": false, 00:16:44.907 "copy": true, 00:16:44.907 "nvme_iov_md": false 00:16:44.907 }, 00:16:44.907 "memory_domains": [ 00:16:44.907 { 00:16:44.907 "dma_device_id": "system", 00:16:44.907 "dma_device_type": 1 00:16:44.907 }, 00:16:44.907 { 00:16:44.907 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:44.907 "dma_device_type": 2 00:16:44.907 } 00:16:44.907 ], 00:16:44.907 "driver_specific": {} 00:16:44.907 }' 00:16:44.907 22:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:44.907 22:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:44.907 22:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:44.907 22:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:45.166 22:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:45.166 22:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:45.166 22:45:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:45.166 22:45:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:45.166 22:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:45.166 22:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:45.424 22:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:45.424 22:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:45.424 22:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:45.424 22:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:45.424 22:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:45.682 22:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:45.682 "name": "BaseBdev2", 00:16:45.682 "aliases": [ 00:16:45.683 "6feabcb7-47cd-4d2e-af00-5f50af2e15ed" 00:16:45.683 ], 00:16:45.683 "product_name": "Malloc disk", 00:16:45.683 "block_size": 512, 00:16:45.683 "num_blocks": 65536, 00:16:45.683 "uuid": "6feabcb7-47cd-4d2e-af00-5f50af2e15ed", 00:16:45.683 "assigned_rate_limits": { 00:16:45.683 "rw_ios_per_sec": 0, 00:16:45.683 "rw_mbytes_per_sec": 0, 00:16:45.683 "r_mbytes_per_sec": 0, 00:16:45.683 "w_mbytes_per_sec": 0 00:16:45.683 }, 00:16:45.683 "claimed": true, 00:16:45.683 "claim_type": "exclusive_write", 00:16:45.683 "zoned": false, 00:16:45.683 "supported_io_types": { 00:16:45.683 "read": true, 00:16:45.683 "write": true, 00:16:45.683 "unmap": true, 00:16:45.683 "flush": true, 00:16:45.683 "reset": true, 00:16:45.683 "nvme_admin": false, 00:16:45.683 "nvme_io": false, 00:16:45.683 "nvme_io_md": false, 
00:16:45.683 "write_zeroes": true, 00:16:45.683 "zcopy": true, 00:16:45.683 "get_zone_info": false, 00:16:45.683 "zone_management": false, 00:16:45.683 "zone_append": false, 00:16:45.683 "compare": false, 00:16:45.683 "compare_and_write": false, 00:16:45.683 "abort": true, 00:16:45.683 "seek_hole": false, 00:16:45.683 "seek_data": false, 00:16:45.683 "copy": true, 00:16:45.683 "nvme_iov_md": false 00:16:45.683 }, 00:16:45.683 "memory_domains": [ 00:16:45.683 { 00:16:45.683 "dma_device_id": "system", 00:16:45.683 "dma_device_type": 1 00:16:45.683 }, 00:16:45.683 { 00:16:45.683 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:45.683 "dma_device_type": 2 00:16:45.683 } 00:16:45.683 ], 00:16:45.683 "driver_specific": {} 00:16:45.683 }' 00:16:45.683 22:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:45.683 22:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:45.683 22:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:45.683 22:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:45.683 22:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:45.683 22:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:45.683 22:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:45.942 22:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:45.942 22:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:45.942 22:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:45.942 22:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:45.942 22:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:45.942 22:45:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:45.942 22:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:45.942 22:45:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:46.201 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:46.201 "name": "BaseBdev3", 00:16:46.201 "aliases": [ 00:16:46.201 "120bc73d-671b-4ce6-b10e-645e8207802e" 00:16:46.201 ], 00:16:46.201 "product_name": "Malloc disk", 00:16:46.201 "block_size": 512, 00:16:46.201 "num_blocks": 65536, 00:16:46.201 "uuid": "120bc73d-671b-4ce6-b10e-645e8207802e", 00:16:46.201 "assigned_rate_limits": { 00:16:46.201 "rw_ios_per_sec": 0, 00:16:46.201 "rw_mbytes_per_sec": 0, 00:16:46.201 "r_mbytes_per_sec": 0, 00:16:46.201 "w_mbytes_per_sec": 0 00:16:46.201 }, 00:16:46.201 "claimed": true, 00:16:46.201 "claim_type": "exclusive_write", 00:16:46.201 "zoned": false, 00:16:46.201 "supported_io_types": { 00:16:46.201 "read": true, 00:16:46.201 "write": true, 00:16:46.201 "unmap": true, 00:16:46.201 "flush": true, 00:16:46.201 "reset": true, 00:16:46.201 "nvme_admin": false, 00:16:46.201 "nvme_io": false, 00:16:46.201 "nvme_io_md": false, 00:16:46.201 "write_zeroes": true, 00:16:46.201 "zcopy": true, 00:16:46.201 "get_zone_info": false, 00:16:46.201 "zone_management": false, 00:16:46.201 "zone_append": false, 00:16:46.201 "compare": false, 00:16:46.201 "compare_and_write": false, 00:16:46.201 "abort": true, 00:16:46.201 "seek_hole": false, 00:16:46.201 "seek_data": false, 00:16:46.201 "copy": true, 00:16:46.201 "nvme_iov_md": false 00:16:46.201 }, 00:16:46.201 "memory_domains": [ 00:16:46.201 { 00:16:46.201 "dma_device_id": "system", 00:16:46.201 "dma_device_type": 1 00:16:46.201 }, 00:16:46.201 { 00:16:46.201 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:16:46.201 "dma_device_type": 2 00:16:46.201 } 00:16:46.201 ], 00:16:46.201 "driver_specific": {} 00:16:46.201 }' 00:16:46.201 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:46.460 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:46.460 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:46.460 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:46.460 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:46.460 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:46.460 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:46.718 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:46.718 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:46.718 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:46.718 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:46.718 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:46.718 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:46.977 [2024-07-15 22:45:31.836835] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:46.977 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:46.977 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:16:46.977 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 
in 00:16:46.977 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:16:46.977 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:16:46.977 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:16:46.977 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:46.977 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:46.977 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:46.977 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:46.977 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:46.977 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:46.977 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:46.977 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:46.977 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:46.977 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.977 22:45:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:47.545 22:45:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:47.545 "name": "Existed_Raid", 00:16:47.545 "uuid": "15137a4d-70e7-4f91-a4ae-1349079f430b", 00:16:47.545 "strip_size_kb": 0, 00:16:47.545 "state": "online", 00:16:47.545 "raid_level": "raid1", 
00:16:47.545 "superblock": false, 00:16:47.545 "num_base_bdevs": 3, 00:16:47.545 "num_base_bdevs_discovered": 2, 00:16:47.545 "num_base_bdevs_operational": 2, 00:16:47.545 "base_bdevs_list": [ 00:16:47.545 { 00:16:47.545 "name": null, 00:16:47.545 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:47.545 "is_configured": false, 00:16:47.545 "data_offset": 0, 00:16:47.545 "data_size": 65536 00:16:47.545 }, 00:16:47.545 { 00:16:47.545 "name": "BaseBdev2", 00:16:47.545 "uuid": "6feabcb7-47cd-4d2e-af00-5f50af2e15ed", 00:16:47.545 "is_configured": true, 00:16:47.545 "data_offset": 0, 00:16:47.545 "data_size": 65536 00:16:47.545 }, 00:16:47.545 { 00:16:47.545 "name": "BaseBdev3", 00:16:47.545 "uuid": "120bc73d-671b-4ce6-b10e-645e8207802e", 00:16:47.545 "is_configured": true, 00:16:47.545 "data_offset": 0, 00:16:47.545 "data_size": 65536 00:16:47.545 } 00:16:47.545 ] 00:16:47.545 }' 00:16:47.545 22:45:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:47.545 22:45:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:48.113 22:45:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:48.113 22:45:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:48.113 22:45:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.113 22:45:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:48.372 22:45:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:48.372 22:45:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:48.372 22:45:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:48.631 [2024-07-15 22:45:33.450180] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:48.631 22:45:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:48.631 22:45:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:48.631 22:45:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.631 22:45:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:48.890 22:45:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:48.890 22:45:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:48.890 22:45:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:49.148 [2024-07-15 22:45:33.967901] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:49.148 [2024-07-15 22:45:33.968003] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:49.148 [2024-07-15 22:45:33.980299] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:49.148 [2024-07-15 22:45:33.980335] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:49.148 [2024-07-15 22:45:33.980347] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15db400 name Existed_Raid, state offline 00:16:49.148 22:45:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:49.148 22:45:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:49.148 22:45:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.148 22:45:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:49.406 22:45:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:49.406 22:45:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:49.406 22:45:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:49.406 22:45:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:49.406 22:45:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:49.406 22:45:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:49.664 BaseBdev2 00:16:49.664 22:45:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:49.664 22:45:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:49.664 22:45:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:49.664 22:45:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:49.664 22:45:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:49.664 22:45:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:49.664 22:45:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:49.923 22:45:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:50.181 [ 00:16:50.181 { 00:16:50.181 "name": "BaseBdev2", 00:16:50.181 "aliases": [ 00:16:50.181 "17df851a-9122-4343-9b2a-66bd91d4a284" 00:16:50.181 ], 00:16:50.181 "product_name": "Malloc disk", 00:16:50.181 "block_size": 512, 00:16:50.181 "num_blocks": 65536, 00:16:50.181 "uuid": "17df851a-9122-4343-9b2a-66bd91d4a284", 00:16:50.181 "assigned_rate_limits": { 00:16:50.181 "rw_ios_per_sec": 0, 00:16:50.181 "rw_mbytes_per_sec": 0, 00:16:50.181 "r_mbytes_per_sec": 0, 00:16:50.181 "w_mbytes_per_sec": 0 00:16:50.181 }, 00:16:50.181 "claimed": false, 00:16:50.181 "zoned": false, 00:16:50.182 "supported_io_types": { 00:16:50.182 "read": true, 00:16:50.182 "write": true, 00:16:50.182 "unmap": true, 00:16:50.182 "flush": true, 00:16:50.182 "reset": true, 00:16:50.182 "nvme_admin": false, 00:16:50.182 "nvme_io": false, 00:16:50.182 "nvme_io_md": false, 00:16:50.182 "write_zeroes": true, 00:16:50.182 "zcopy": true, 00:16:50.182 "get_zone_info": false, 00:16:50.182 "zone_management": false, 00:16:50.182 "zone_append": false, 00:16:50.182 "compare": false, 00:16:50.182 "compare_and_write": false, 00:16:50.182 "abort": true, 00:16:50.182 "seek_hole": false, 00:16:50.182 "seek_data": false, 00:16:50.182 "copy": true, 00:16:50.182 "nvme_iov_md": false 00:16:50.182 }, 00:16:50.182 "memory_domains": [ 00:16:50.182 { 00:16:50.182 "dma_device_id": "system", 00:16:50.182 "dma_device_type": 1 00:16:50.182 }, 00:16:50.182 { 00:16:50.182 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:50.182 "dma_device_type": 2 00:16:50.182 } 00:16:50.182 ], 00:16:50.182 "driver_specific": {} 00:16:50.182 } 00:16:50.182 ] 00:16:50.182 22:45:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:50.182 
22:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:50.182 22:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:50.182 22:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:50.441 BaseBdev3 00:16:50.441 22:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:50.441 22:45:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:50.441 22:45:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:50.441 22:45:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:50.441 22:45:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:50.441 22:45:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:50.441 22:45:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:50.699 22:45:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:50.958 [ 00:16:50.958 { 00:16:50.958 "name": "BaseBdev3", 00:16:50.958 "aliases": [ 00:16:50.958 "2749260c-280a-4add-837a-d896058edf71" 00:16:50.958 ], 00:16:50.958 "product_name": "Malloc disk", 00:16:50.958 "block_size": 512, 00:16:50.958 "num_blocks": 65536, 00:16:50.958 "uuid": "2749260c-280a-4add-837a-d896058edf71", 00:16:50.958 "assigned_rate_limits": { 00:16:50.958 "rw_ios_per_sec": 0, 00:16:50.958 "rw_mbytes_per_sec": 0, 00:16:50.958 
"r_mbytes_per_sec": 0, 00:16:50.958 "w_mbytes_per_sec": 0 00:16:50.958 }, 00:16:50.958 "claimed": false, 00:16:50.958 "zoned": false, 00:16:50.958 "supported_io_types": { 00:16:50.958 "read": true, 00:16:50.958 "write": true, 00:16:50.958 "unmap": true, 00:16:50.958 "flush": true, 00:16:50.958 "reset": true, 00:16:50.958 "nvme_admin": false, 00:16:50.958 "nvme_io": false, 00:16:50.958 "nvme_io_md": false, 00:16:50.958 "write_zeroes": true, 00:16:50.958 "zcopy": true, 00:16:50.958 "get_zone_info": false, 00:16:50.958 "zone_management": false, 00:16:50.958 "zone_append": false, 00:16:50.958 "compare": false, 00:16:50.958 "compare_and_write": false, 00:16:50.958 "abort": true, 00:16:50.958 "seek_hole": false, 00:16:50.958 "seek_data": false, 00:16:50.958 "copy": true, 00:16:50.958 "nvme_iov_md": false 00:16:50.958 }, 00:16:50.958 "memory_domains": [ 00:16:50.958 { 00:16:50.958 "dma_device_id": "system", 00:16:50.958 "dma_device_type": 1 00:16:50.958 }, 00:16:50.958 { 00:16:50.958 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:50.958 "dma_device_type": 2 00:16:50.958 } 00:16:50.958 ], 00:16:50.958 "driver_specific": {} 00:16:50.958 } 00:16:50.958 ] 00:16:50.958 22:45:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:50.958 22:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:50.958 22:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:50.958 22:45:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:51.217 [2024-07-15 22:45:36.063595] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:51.217 [2024-07-15 22:45:36.063640] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 
00:16:51.217 [2024-07-15 22:45:36.063662] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:51.217 [2024-07-15 22:45:36.065016] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:51.217 22:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:51.217 22:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:51.217 22:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:51.217 22:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:51.217 22:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:51.217 22:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:51.217 22:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:51.217 22:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:51.217 22:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:51.217 22:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:51.217 22:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:51.217 22:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:51.857 22:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:51.857 "name": "Existed_Raid", 00:16:51.857 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:51.857 "strip_size_kb": 0, 00:16:51.857 "state": 
"configuring", 00:16:51.857 "raid_level": "raid1", 00:16:51.857 "superblock": false, 00:16:51.857 "num_base_bdevs": 3, 00:16:51.857 "num_base_bdevs_discovered": 2, 00:16:51.857 "num_base_bdevs_operational": 3, 00:16:51.857 "base_bdevs_list": [ 00:16:51.857 { 00:16:51.857 "name": "BaseBdev1", 00:16:51.857 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:51.857 "is_configured": false, 00:16:51.857 "data_offset": 0, 00:16:51.857 "data_size": 0 00:16:51.857 }, 00:16:51.857 { 00:16:51.857 "name": "BaseBdev2", 00:16:51.857 "uuid": "17df851a-9122-4343-9b2a-66bd91d4a284", 00:16:51.857 "is_configured": true, 00:16:51.857 "data_offset": 0, 00:16:51.857 "data_size": 65536 00:16:51.857 }, 00:16:51.857 { 00:16:51.857 "name": "BaseBdev3", 00:16:51.857 "uuid": "2749260c-280a-4add-837a-d896058edf71", 00:16:51.857 "is_configured": true, 00:16:51.857 "data_offset": 0, 00:16:51.857 "data_size": 65536 00:16:51.857 } 00:16:51.857 ] 00:16:51.857 }' 00:16:51.857 22:45:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:51.857 22:45:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:52.794 22:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:52.794 [2024-07-15 22:45:37.627755] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:52.794 22:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:52.794 22:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:52.794 22:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:52.794 22:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:52.794 22:45:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:52.794 22:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:52.794 22:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:52.794 22:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:52.794 22:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:52.794 22:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:52.794 22:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:52.794 22:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:53.052 22:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:53.052 "name": "Existed_Raid", 00:16:53.052 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:53.052 "strip_size_kb": 0, 00:16:53.052 "state": "configuring", 00:16:53.052 "raid_level": "raid1", 00:16:53.052 "superblock": false, 00:16:53.052 "num_base_bdevs": 3, 00:16:53.052 "num_base_bdevs_discovered": 1, 00:16:53.052 "num_base_bdevs_operational": 3, 00:16:53.052 "base_bdevs_list": [ 00:16:53.052 { 00:16:53.052 "name": "BaseBdev1", 00:16:53.052 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:53.052 "is_configured": false, 00:16:53.052 "data_offset": 0, 00:16:53.052 "data_size": 0 00:16:53.052 }, 00:16:53.052 { 00:16:53.052 "name": null, 00:16:53.052 "uuid": "17df851a-9122-4343-9b2a-66bd91d4a284", 00:16:53.052 "is_configured": false, 00:16:53.052 "data_offset": 0, 00:16:53.052 "data_size": 65536 00:16:53.052 }, 00:16:53.052 { 00:16:53.052 "name": "BaseBdev3", 00:16:53.052 "uuid": 
"2749260c-280a-4add-837a-d896058edf71", 00:16:53.052 "is_configured": true, 00:16:53.052 "data_offset": 0, 00:16:53.052 "data_size": 65536 00:16:53.052 } 00:16:53.052 ] 00:16:53.052 }' 00:16:53.052 22:45:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:53.052 22:45:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:53.620 22:45:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:53.620 22:45:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:53.879 22:45:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:53.879 22:45:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:54.138 [2024-07-15 22:45:38.850444] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:54.138 BaseBdev1 00:16:54.138 22:45:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:54.138 22:45:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:54.138 22:45:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:54.138 22:45:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:54.138 22:45:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:54.138 22:45:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:54.138 22:45:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:54.397 22:45:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:54.656 [ 00:16:54.656 { 00:16:54.656 "name": "BaseBdev1", 00:16:54.656 "aliases": [ 00:16:54.656 "cdd9bdab-0a77-4cf2-9459-2e7705dff513" 00:16:54.656 ], 00:16:54.656 "product_name": "Malloc disk", 00:16:54.656 "block_size": 512, 00:16:54.656 "num_blocks": 65536, 00:16:54.656 "uuid": "cdd9bdab-0a77-4cf2-9459-2e7705dff513", 00:16:54.656 "assigned_rate_limits": { 00:16:54.656 "rw_ios_per_sec": 0, 00:16:54.656 "rw_mbytes_per_sec": 0, 00:16:54.656 "r_mbytes_per_sec": 0, 00:16:54.656 "w_mbytes_per_sec": 0 00:16:54.656 }, 00:16:54.656 "claimed": true, 00:16:54.656 "claim_type": "exclusive_write", 00:16:54.656 "zoned": false, 00:16:54.656 "supported_io_types": { 00:16:54.656 "read": true, 00:16:54.656 "write": true, 00:16:54.656 "unmap": true, 00:16:54.656 "flush": true, 00:16:54.656 "reset": true, 00:16:54.656 "nvme_admin": false, 00:16:54.656 "nvme_io": false, 00:16:54.656 "nvme_io_md": false, 00:16:54.657 "write_zeroes": true, 00:16:54.657 "zcopy": true, 00:16:54.657 "get_zone_info": false, 00:16:54.657 "zone_management": false, 00:16:54.657 "zone_append": false, 00:16:54.657 "compare": false, 00:16:54.657 "compare_and_write": false, 00:16:54.657 "abort": true, 00:16:54.657 "seek_hole": false, 00:16:54.657 "seek_data": false, 00:16:54.657 "copy": true, 00:16:54.657 "nvme_iov_md": false 00:16:54.657 }, 00:16:54.657 "memory_domains": [ 00:16:54.657 { 00:16:54.657 "dma_device_id": "system", 00:16:54.657 "dma_device_type": 1 00:16:54.657 }, 00:16:54.657 { 00:16:54.657 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:54.657 "dma_device_type": 2 00:16:54.657 } 00:16:54.657 ], 00:16:54.657 "driver_specific": {} 00:16:54.657 } 00:16:54.657 ] 
00:16:54.657 22:45:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:54.657 22:45:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:54.657 22:45:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:54.657 22:45:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:54.657 22:45:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:54.657 22:45:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:54.657 22:45:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:54.657 22:45:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:54.657 22:45:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:54.657 22:45:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:54.657 22:45:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:54.657 22:45:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:54.657 22:45:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:54.916 22:45:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:54.916 "name": "Existed_Raid", 00:16:54.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:54.916 "strip_size_kb": 0, 00:16:54.916 "state": "configuring", 00:16:54.916 "raid_level": "raid1", 00:16:54.916 "superblock": false, 00:16:54.916 "num_base_bdevs": 3, 00:16:54.916 
"num_base_bdevs_discovered": 2, 00:16:54.916 "num_base_bdevs_operational": 3, 00:16:54.916 "base_bdevs_list": [ 00:16:54.916 { 00:16:54.916 "name": "BaseBdev1", 00:16:54.916 "uuid": "cdd9bdab-0a77-4cf2-9459-2e7705dff513", 00:16:54.916 "is_configured": true, 00:16:54.916 "data_offset": 0, 00:16:54.916 "data_size": 65536 00:16:54.916 }, 00:16:54.916 { 00:16:54.916 "name": null, 00:16:54.916 "uuid": "17df851a-9122-4343-9b2a-66bd91d4a284", 00:16:54.916 "is_configured": false, 00:16:54.916 "data_offset": 0, 00:16:54.916 "data_size": 65536 00:16:54.916 }, 00:16:54.916 { 00:16:54.916 "name": "BaseBdev3", 00:16:54.916 "uuid": "2749260c-280a-4add-837a-d896058edf71", 00:16:54.916 "is_configured": true, 00:16:54.916 "data_offset": 0, 00:16:54.916 "data_size": 65536 00:16:54.916 } 00:16:54.916 ] 00:16:54.916 }' 00:16:54.916 22:45:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:54.916 22:45:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:55.851 22:45:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:55.851 22:45:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:56.110 22:45:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:56.110 22:45:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:56.369 [2024-07-15 22:45:41.052395] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:56.369 22:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:56.369 22:45:41 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:56.369 22:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:56.369 22:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:56.369 22:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:56.369 22:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:56.369 22:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:56.369 22:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:56.369 22:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:56.369 22:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:56.369 22:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:56.369 22:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:56.627 22:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:56.627 "name": "Existed_Raid", 00:16:56.627 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:56.627 "strip_size_kb": 0, 00:16:56.627 "state": "configuring", 00:16:56.627 "raid_level": "raid1", 00:16:56.627 "superblock": false, 00:16:56.627 "num_base_bdevs": 3, 00:16:56.627 "num_base_bdevs_discovered": 1, 00:16:56.627 "num_base_bdevs_operational": 3, 00:16:56.627 "base_bdevs_list": [ 00:16:56.627 { 00:16:56.627 "name": "BaseBdev1", 00:16:56.627 "uuid": "cdd9bdab-0a77-4cf2-9459-2e7705dff513", 00:16:56.627 "is_configured": true, 00:16:56.627 "data_offset": 0, 00:16:56.627 "data_size": 65536 
00:16:56.627 }, 00:16:56.627 { 00:16:56.627 "name": null, 00:16:56.627 "uuid": "17df851a-9122-4343-9b2a-66bd91d4a284", 00:16:56.627 "is_configured": false, 00:16:56.627 "data_offset": 0, 00:16:56.627 "data_size": 65536 00:16:56.627 }, 00:16:56.627 { 00:16:56.627 "name": null, 00:16:56.627 "uuid": "2749260c-280a-4add-837a-d896058edf71", 00:16:56.628 "is_configured": false, 00:16:56.628 "data_offset": 0, 00:16:56.628 "data_size": 65536 00:16:56.628 } 00:16:56.628 ] 00:16:56.628 }' 00:16:56.628 22:45:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:56.628 22:45:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:57.562 22:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:57.562 22:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:57.820 22:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:57.820 22:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:57.820 [2024-07-15 22:45:42.628743] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:57.820 22:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:57.820 22:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:57.820 22:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:57.820 22:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:57.820 22:45:42 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:57.820 22:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:57.820 22:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:57.820 22:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:57.820 22:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:57.820 22:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:57.820 22:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:57.820 22:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:58.078 22:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:58.078 "name": "Existed_Raid", 00:16:58.078 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:58.078 "strip_size_kb": 0, 00:16:58.078 "state": "configuring", 00:16:58.078 "raid_level": "raid1", 00:16:58.078 "superblock": false, 00:16:58.078 "num_base_bdevs": 3, 00:16:58.078 "num_base_bdevs_discovered": 2, 00:16:58.078 "num_base_bdevs_operational": 3, 00:16:58.078 "base_bdevs_list": [ 00:16:58.078 { 00:16:58.078 "name": "BaseBdev1", 00:16:58.078 "uuid": "cdd9bdab-0a77-4cf2-9459-2e7705dff513", 00:16:58.078 "is_configured": true, 00:16:58.078 "data_offset": 0, 00:16:58.078 "data_size": 65536 00:16:58.078 }, 00:16:58.078 { 00:16:58.078 "name": null, 00:16:58.078 "uuid": "17df851a-9122-4343-9b2a-66bd91d4a284", 00:16:58.078 "is_configured": false, 00:16:58.078 "data_offset": 0, 00:16:58.078 "data_size": 65536 00:16:58.078 }, 00:16:58.078 { 00:16:58.078 "name": "BaseBdev3", 00:16:58.078 "uuid": 
"2749260c-280a-4add-837a-d896058edf71", 00:16:58.078 "is_configured": true, 00:16:58.078 "data_offset": 0, 00:16:58.078 "data_size": 65536 00:16:58.078 } 00:16:58.078 ] 00:16:58.078 }' 00:16:58.078 22:45:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:58.078 22:45:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:58.644 22:45:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.644 22:45:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:58.903 22:45:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:58.903 22:45:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:59.472 [2024-07-15 22:45:44.253056] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:59.472 22:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:59.472 22:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:59.472 22:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:59.472 22:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:59.472 22:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:59.472 22:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:59.472 22:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:59.472 22:45:44 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:59.472 22:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:59.472 22:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:59.472 22:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:59.472 22:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:00.039 22:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:00.039 "name": "Existed_Raid", 00:17:00.039 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:00.039 "strip_size_kb": 0, 00:17:00.039 "state": "configuring", 00:17:00.039 "raid_level": "raid1", 00:17:00.040 "superblock": false, 00:17:00.040 "num_base_bdevs": 3, 00:17:00.040 "num_base_bdevs_discovered": 1, 00:17:00.040 "num_base_bdevs_operational": 3, 00:17:00.040 "base_bdevs_list": [ 00:17:00.040 { 00:17:00.040 "name": null, 00:17:00.040 "uuid": "cdd9bdab-0a77-4cf2-9459-2e7705dff513", 00:17:00.040 "is_configured": false, 00:17:00.040 "data_offset": 0, 00:17:00.040 "data_size": 65536 00:17:00.040 }, 00:17:00.040 { 00:17:00.040 "name": null, 00:17:00.040 "uuid": "17df851a-9122-4343-9b2a-66bd91d4a284", 00:17:00.040 "is_configured": false, 00:17:00.040 "data_offset": 0, 00:17:00.040 "data_size": 65536 00:17:00.040 }, 00:17:00.040 { 00:17:00.040 "name": "BaseBdev3", 00:17:00.040 "uuid": "2749260c-280a-4add-837a-d896058edf71", 00:17:00.040 "is_configured": true, 00:17:00.040 "data_offset": 0, 00:17:00.040 "data_size": 65536 00:17:00.040 } 00:17:00.040 ] 00:17:00.040 }' 00:17:00.040 22:45:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:00.040 22:45:44 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:17:00.972 22:45:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:00.972 22:45:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:00.972 22:45:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:00.972 22:45:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:01.537 [2024-07-15 22:45:46.349123] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:01.537 22:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:01.537 22:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:01.538 22:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:01.538 22:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:01.538 22:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:01.538 22:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:01.538 22:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:01.538 22:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:01.538 22:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:01.538 22:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:17:01.538 22:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:01.538 22:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:02.105 22:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:02.105 "name": "Existed_Raid", 00:17:02.105 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:02.105 "strip_size_kb": 0, 00:17:02.105 "state": "configuring", 00:17:02.105 "raid_level": "raid1", 00:17:02.105 "superblock": false, 00:17:02.105 "num_base_bdevs": 3, 00:17:02.105 "num_base_bdevs_discovered": 2, 00:17:02.105 "num_base_bdevs_operational": 3, 00:17:02.105 "base_bdevs_list": [ 00:17:02.105 { 00:17:02.105 "name": null, 00:17:02.105 "uuid": "cdd9bdab-0a77-4cf2-9459-2e7705dff513", 00:17:02.105 "is_configured": false, 00:17:02.105 "data_offset": 0, 00:17:02.105 "data_size": 65536 00:17:02.105 }, 00:17:02.105 { 00:17:02.105 "name": "BaseBdev2", 00:17:02.105 "uuid": "17df851a-9122-4343-9b2a-66bd91d4a284", 00:17:02.105 "is_configured": true, 00:17:02.105 "data_offset": 0, 00:17:02.105 "data_size": 65536 00:17:02.105 }, 00:17:02.105 { 00:17:02.105 "name": "BaseBdev3", 00:17:02.105 "uuid": "2749260c-280a-4add-837a-d896058edf71", 00:17:02.105 "is_configured": true, 00:17:02.105 "data_offset": 0, 00:17:02.105 "data_size": 65536 00:17:02.105 } 00:17:02.105 ] 00:17:02.105 }' 00:17:02.105 22:45:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:02.105 22:45:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:02.672 22:45:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:02.672 22:45:47 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:02.931 22:45:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:02.931 22:45:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:02.931 22:45:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:03.190 22:45:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u cdd9bdab-0a77-4cf2-9459-2e7705dff513 00:17:03.449 [2024-07-15 22:45:48.249560] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:03.449 [2024-07-15 22:45:48.249603] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15dee40 00:17:03.449 [2024-07-15 22:45:48.249613] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:17:03.449 [2024-07-15 22:45:48.249805] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15dbe60 00:17:03.449 [2024-07-15 22:45:48.249939] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15dee40 00:17:03.449 [2024-07-15 22:45:48.249950] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x15dee40 00:17:03.449 [2024-07-15 22:45:48.250127] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:03.449 NewBaseBdev 00:17:03.449 22:45:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:03.449 22:45:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:03.449 22:45:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- 
# local bdev_timeout= 00:17:03.449 22:45:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:03.449 22:45:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:03.449 22:45:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:03.449 22:45:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:03.708 22:45:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:03.966 [ 00:17:03.966 { 00:17:03.966 "name": "NewBaseBdev", 00:17:03.966 "aliases": [ 00:17:03.966 "cdd9bdab-0a77-4cf2-9459-2e7705dff513" 00:17:03.966 ], 00:17:03.966 "product_name": "Malloc disk", 00:17:03.966 "block_size": 512, 00:17:03.966 "num_blocks": 65536, 00:17:03.966 "uuid": "cdd9bdab-0a77-4cf2-9459-2e7705dff513", 00:17:03.966 "assigned_rate_limits": { 00:17:03.966 "rw_ios_per_sec": 0, 00:17:03.966 "rw_mbytes_per_sec": 0, 00:17:03.966 "r_mbytes_per_sec": 0, 00:17:03.966 "w_mbytes_per_sec": 0 00:17:03.966 }, 00:17:03.966 "claimed": true, 00:17:03.966 "claim_type": "exclusive_write", 00:17:03.966 "zoned": false, 00:17:03.966 "supported_io_types": { 00:17:03.966 "read": true, 00:17:03.966 "write": true, 00:17:03.966 "unmap": true, 00:17:03.966 "flush": true, 00:17:03.966 "reset": true, 00:17:03.966 "nvme_admin": false, 00:17:03.966 "nvme_io": false, 00:17:03.966 "nvme_io_md": false, 00:17:03.966 "write_zeroes": true, 00:17:03.966 "zcopy": true, 00:17:03.966 "get_zone_info": false, 00:17:03.966 "zone_management": false, 00:17:03.966 "zone_append": false, 00:17:03.966 "compare": false, 00:17:03.966 "compare_and_write": false, 00:17:03.966 "abort": true, 00:17:03.966 "seek_hole": false, 
00:17:03.966 "seek_data": false, 00:17:03.966 "copy": true, 00:17:03.966 "nvme_iov_md": false 00:17:03.966 }, 00:17:03.966 "memory_domains": [ 00:17:03.966 { 00:17:03.966 "dma_device_id": "system", 00:17:03.966 "dma_device_type": 1 00:17:03.966 }, 00:17:03.966 { 00:17:03.966 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.966 "dma_device_type": 2 00:17:03.966 } 00:17:03.966 ], 00:17:03.966 "driver_specific": {} 00:17:03.966 } 00:17:03.966 ] 00:17:03.966 22:45:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:03.966 22:45:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:03.966 22:45:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:03.966 22:45:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:03.966 22:45:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:03.966 22:45:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:03.966 22:45:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:03.966 22:45:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:03.966 22:45:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:03.966 22:45:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:03.966 22:45:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:03.966 22:45:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.966 22:45:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:17:04.224 22:45:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:04.224 "name": "Existed_Raid", 00:17:04.224 "uuid": "92589eaa-1ddd-48c0-b7cb-51ca1855c8ae", 00:17:04.224 "strip_size_kb": 0, 00:17:04.224 "state": "online", 00:17:04.224 "raid_level": "raid1", 00:17:04.224 "superblock": false, 00:17:04.224 "num_base_bdevs": 3, 00:17:04.224 "num_base_bdevs_discovered": 3, 00:17:04.224 "num_base_bdevs_operational": 3, 00:17:04.224 "base_bdevs_list": [ 00:17:04.224 { 00:17:04.224 "name": "NewBaseBdev", 00:17:04.224 "uuid": "cdd9bdab-0a77-4cf2-9459-2e7705dff513", 00:17:04.224 "is_configured": true, 00:17:04.224 "data_offset": 0, 00:17:04.224 "data_size": 65536 00:17:04.224 }, 00:17:04.224 { 00:17:04.224 "name": "BaseBdev2", 00:17:04.224 "uuid": "17df851a-9122-4343-9b2a-66bd91d4a284", 00:17:04.224 "is_configured": true, 00:17:04.224 "data_offset": 0, 00:17:04.224 "data_size": 65536 00:17:04.224 }, 00:17:04.224 { 00:17:04.224 "name": "BaseBdev3", 00:17:04.224 "uuid": "2749260c-280a-4add-837a-d896058edf71", 00:17:04.224 "is_configured": true, 00:17:04.224 "data_offset": 0, 00:17:04.224 "data_size": 65536 00:17:04.224 } 00:17:04.224 ] 00:17:04.224 }' 00:17:04.224 22:45:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:04.224 22:45:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:04.840 22:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:04.840 22:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:04.840 22:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:04.840 22:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:04.840 22:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:17:04.840 22:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:04.840 22:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:04.840 22:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:05.100 [2024-07-15 22:45:49.818072] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:05.100 22:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:05.100 "name": "Existed_Raid", 00:17:05.100 "aliases": [ 00:17:05.100 "92589eaa-1ddd-48c0-b7cb-51ca1855c8ae" 00:17:05.100 ], 00:17:05.100 "product_name": "Raid Volume", 00:17:05.100 "block_size": 512, 00:17:05.100 "num_blocks": 65536, 00:17:05.100 "uuid": "92589eaa-1ddd-48c0-b7cb-51ca1855c8ae", 00:17:05.100 "assigned_rate_limits": { 00:17:05.100 "rw_ios_per_sec": 0, 00:17:05.100 "rw_mbytes_per_sec": 0, 00:17:05.100 "r_mbytes_per_sec": 0, 00:17:05.100 "w_mbytes_per_sec": 0 00:17:05.100 }, 00:17:05.100 "claimed": false, 00:17:05.100 "zoned": false, 00:17:05.100 "supported_io_types": { 00:17:05.100 "read": true, 00:17:05.100 "write": true, 00:17:05.100 "unmap": false, 00:17:05.100 "flush": false, 00:17:05.100 "reset": true, 00:17:05.100 "nvme_admin": false, 00:17:05.100 "nvme_io": false, 00:17:05.100 "nvme_io_md": false, 00:17:05.100 "write_zeroes": true, 00:17:05.100 "zcopy": false, 00:17:05.100 "get_zone_info": false, 00:17:05.100 "zone_management": false, 00:17:05.100 "zone_append": false, 00:17:05.100 "compare": false, 00:17:05.100 "compare_and_write": false, 00:17:05.100 "abort": false, 00:17:05.100 "seek_hole": false, 00:17:05.100 "seek_data": false, 00:17:05.100 "copy": false, 00:17:05.100 "nvme_iov_md": false 00:17:05.100 }, 00:17:05.100 "memory_domains": [ 00:17:05.100 { 00:17:05.100 "dma_device_id": "system", 
00:17:05.100 "dma_device_type": 1 00:17:05.100 }, 00:17:05.100 { 00:17:05.100 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.100 "dma_device_type": 2 00:17:05.100 }, 00:17:05.100 { 00:17:05.100 "dma_device_id": "system", 00:17:05.100 "dma_device_type": 1 00:17:05.100 }, 00:17:05.100 { 00:17:05.100 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.100 "dma_device_type": 2 00:17:05.100 }, 00:17:05.100 { 00:17:05.100 "dma_device_id": "system", 00:17:05.100 "dma_device_type": 1 00:17:05.100 }, 00:17:05.100 { 00:17:05.100 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.100 "dma_device_type": 2 00:17:05.100 } 00:17:05.100 ], 00:17:05.100 "driver_specific": { 00:17:05.100 "raid": { 00:17:05.100 "uuid": "92589eaa-1ddd-48c0-b7cb-51ca1855c8ae", 00:17:05.100 "strip_size_kb": 0, 00:17:05.100 "state": "online", 00:17:05.100 "raid_level": "raid1", 00:17:05.100 "superblock": false, 00:17:05.100 "num_base_bdevs": 3, 00:17:05.100 "num_base_bdevs_discovered": 3, 00:17:05.100 "num_base_bdevs_operational": 3, 00:17:05.100 "base_bdevs_list": [ 00:17:05.100 { 00:17:05.100 "name": "NewBaseBdev", 00:17:05.100 "uuid": "cdd9bdab-0a77-4cf2-9459-2e7705dff513", 00:17:05.100 "is_configured": true, 00:17:05.100 "data_offset": 0, 00:17:05.100 "data_size": 65536 00:17:05.100 }, 00:17:05.100 { 00:17:05.100 "name": "BaseBdev2", 00:17:05.100 "uuid": "17df851a-9122-4343-9b2a-66bd91d4a284", 00:17:05.100 "is_configured": true, 00:17:05.100 "data_offset": 0, 00:17:05.100 "data_size": 65536 00:17:05.100 }, 00:17:05.100 { 00:17:05.100 "name": "BaseBdev3", 00:17:05.100 "uuid": "2749260c-280a-4add-837a-d896058edf71", 00:17:05.100 "is_configured": true, 00:17:05.100 "data_offset": 0, 00:17:05.100 "data_size": 65536 00:17:05.100 } 00:17:05.100 ] 00:17:05.100 } 00:17:05.100 } 00:17:05.100 }' 00:17:05.100 22:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:05.100 22:45:49 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:05.100 BaseBdev2 00:17:05.100 BaseBdev3' 00:17:05.100 22:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:05.100 22:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:05.100 22:45:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:05.358 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:05.358 "name": "NewBaseBdev", 00:17:05.358 "aliases": [ 00:17:05.358 "cdd9bdab-0a77-4cf2-9459-2e7705dff513" 00:17:05.358 ], 00:17:05.358 "product_name": "Malloc disk", 00:17:05.358 "block_size": 512, 00:17:05.358 "num_blocks": 65536, 00:17:05.358 "uuid": "cdd9bdab-0a77-4cf2-9459-2e7705dff513", 00:17:05.358 "assigned_rate_limits": { 00:17:05.358 "rw_ios_per_sec": 0, 00:17:05.358 "rw_mbytes_per_sec": 0, 00:17:05.358 "r_mbytes_per_sec": 0, 00:17:05.358 "w_mbytes_per_sec": 0 00:17:05.358 }, 00:17:05.358 "claimed": true, 00:17:05.358 "claim_type": "exclusive_write", 00:17:05.358 "zoned": false, 00:17:05.358 "supported_io_types": { 00:17:05.358 "read": true, 00:17:05.358 "write": true, 00:17:05.358 "unmap": true, 00:17:05.358 "flush": true, 00:17:05.358 "reset": true, 00:17:05.358 "nvme_admin": false, 00:17:05.358 "nvme_io": false, 00:17:05.358 "nvme_io_md": false, 00:17:05.358 "write_zeroes": true, 00:17:05.358 "zcopy": true, 00:17:05.358 "get_zone_info": false, 00:17:05.358 "zone_management": false, 00:17:05.358 "zone_append": false, 00:17:05.358 "compare": false, 00:17:05.358 "compare_and_write": false, 00:17:05.358 "abort": true, 00:17:05.358 "seek_hole": false, 00:17:05.358 "seek_data": false, 00:17:05.358 "copy": true, 00:17:05.358 "nvme_iov_md": false 00:17:05.358 }, 00:17:05.358 "memory_domains": [ 
00:17:05.358 { 00:17:05.358 "dma_device_id": "system", 00:17:05.358 "dma_device_type": 1 00:17:05.358 }, 00:17:05.358 { 00:17:05.358 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.358 "dma_device_type": 2 00:17:05.358 } 00:17:05.358 ], 00:17:05.358 "driver_specific": {} 00:17:05.358 }' 00:17:05.358 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:05.358 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:05.358 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:05.358 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:05.615 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:05.615 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:05.615 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:05.615 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:05.615 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:05.615 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:05.615 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:05.615 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:05.615 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:05.615 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:05.615 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:05.872 22:45:50 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:05.872 "name": "BaseBdev2", 00:17:05.872 "aliases": [ 00:17:05.872 "17df851a-9122-4343-9b2a-66bd91d4a284" 00:17:05.872 ], 00:17:05.872 "product_name": "Malloc disk", 00:17:05.872 "block_size": 512, 00:17:05.872 "num_blocks": 65536, 00:17:05.872 "uuid": "17df851a-9122-4343-9b2a-66bd91d4a284", 00:17:05.872 "assigned_rate_limits": { 00:17:05.872 "rw_ios_per_sec": 0, 00:17:05.872 "rw_mbytes_per_sec": 0, 00:17:05.872 "r_mbytes_per_sec": 0, 00:17:05.872 "w_mbytes_per_sec": 0 00:17:05.872 }, 00:17:05.872 "claimed": true, 00:17:05.872 "claim_type": "exclusive_write", 00:17:05.872 "zoned": false, 00:17:05.872 "supported_io_types": { 00:17:05.872 "read": true, 00:17:05.872 "write": true, 00:17:05.872 "unmap": true, 00:17:05.872 "flush": true, 00:17:05.872 "reset": true, 00:17:05.872 "nvme_admin": false, 00:17:05.872 "nvme_io": false, 00:17:05.872 "nvme_io_md": false, 00:17:05.872 "write_zeroes": true, 00:17:05.872 "zcopy": true, 00:17:05.872 "get_zone_info": false, 00:17:05.872 "zone_management": false, 00:17:05.872 "zone_append": false, 00:17:05.872 "compare": false, 00:17:05.872 "compare_and_write": false, 00:17:05.872 "abort": true, 00:17:05.872 "seek_hole": false, 00:17:05.872 "seek_data": false, 00:17:05.872 "copy": true, 00:17:05.872 "nvme_iov_md": false 00:17:05.872 }, 00:17:05.872 "memory_domains": [ 00:17:05.872 { 00:17:05.872 "dma_device_id": "system", 00:17:05.872 "dma_device_type": 1 00:17:05.872 }, 00:17:05.872 { 00:17:05.872 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.872 "dma_device_type": 2 00:17:05.872 } 00:17:05.872 ], 00:17:05.872 "driver_specific": {} 00:17:05.872 }' 00:17:05.872 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:06.129 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:06.129 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:06.129 22:45:50 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:06.129 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:06.129 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:06.129 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:06.129 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:06.129 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:06.129 22:45:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:06.129 22:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:06.386 22:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:06.386 22:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:06.386 22:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:06.386 22:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:06.643 22:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:06.643 "name": "BaseBdev3", 00:17:06.643 "aliases": [ 00:17:06.643 "2749260c-280a-4add-837a-d896058edf71" 00:17:06.643 ], 00:17:06.643 "product_name": "Malloc disk", 00:17:06.643 "block_size": 512, 00:17:06.643 "num_blocks": 65536, 00:17:06.643 "uuid": "2749260c-280a-4add-837a-d896058edf71", 00:17:06.643 "assigned_rate_limits": { 00:17:06.643 "rw_ios_per_sec": 0, 00:17:06.643 "rw_mbytes_per_sec": 0, 00:17:06.643 "r_mbytes_per_sec": 0, 00:17:06.643 "w_mbytes_per_sec": 0 00:17:06.643 }, 00:17:06.643 "claimed": true, 00:17:06.643 "claim_type": "exclusive_write", 
00:17:06.643 "zoned": false, 00:17:06.643 "supported_io_types": { 00:17:06.643 "read": true, 00:17:06.643 "write": true, 00:17:06.643 "unmap": true, 00:17:06.643 "flush": true, 00:17:06.643 "reset": true, 00:17:06.643 "nvme_admin": false, 00:17:06.643 "nvme_io": false, 00:17:06.643 "nvme_io_md": false, 00:17:06.643 "write_zeroes": true, 00:17:06.643 "zcopy": true, 00:17:06.643 "get_zone_info": false, 00:17:06.643 "zone_management": false, 00:17:06.643 "zone_append": false, 00:17:06.643 "compare": false, 00:17:06.643 "compare_and_write": false, 00:17:06.643 "abort": true, 00:17:06.643 "seek_hole": false, 00:17:06.643 "seek_data": false, 00:17:06.643 "copy": true, 00:17:06.643 "nvme_iov_md": false 00:17:06.643 }, 00:17:06.643 "memory_domains": [ 00:17:06.643 { 00:17:06.643 "dma_device_id": "system", 00:17:06.643 "dma_device_type": 1 00:17:06.643 }, 00:17:06.643 { 00:17:06.643 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.643 "dma_device_type": 2 00:17:06.643 } 00:17:06.643 ], 00:17:06.643 "driver_specific": {} 00:17:06.643 }' 00:17:06.643 22:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:06.643 22:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:06.643 22:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:06.643 22:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:06.643 22:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:06.643 22:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:06.643 22:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:06.643 22:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:06.901 22:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:06.901 22:45:51 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:06.901 22:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:06.901 22:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:06.901 22:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:07.159 [2024-07-15 22:45:51.895299] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:07.159 [2024-07-15 22:45:51.895331] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:07.159 [2024-07-15 22:45:51.895393] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:07.159 [2024-07-15 22:45:51.895673] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:07.159 [2024-07-15 22:45:51.895685] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15dee40 name Existed_Raid, state offline 00:17:07.159 22:45:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2745954 00:17:07.159 22:45:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2745954 ']' 00:17:07.159 22:45:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2745954 00:17:07.159 22:45:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:17:07.159 22:45:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:07.159 22:45:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2745954 00:17:07.159 22:45:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:07.159 22:45:51 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:07.159 22:45:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2745954' 00:17:07.159 killing process with pid 2745954 00:17:07.159 22:45:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2745954 00:17:07.159 [2024-07-15 22:45:51.964280] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:07.159 22:45:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2745954 00:17:07.159 [2024-07-15 22:45:51.994403] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:17:07.418 00:17:07.418 real 0m31.069s 00:17:07.418 user 0m57.424s 00:17:07.418 sys 0m5.425s 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:07.418 ************************************ 00:17:07.418 END TEST raid_state_function_test 00:17:07.418 ************************************ 00:17:07.418 22:45:52 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:07.418 22:45:52 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:17:07.418 22:45:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:07.418 22:45:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:07.418 22:45:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:07.418 ************************************ 00:17:07.418 START TEST raid_state_function_test_sb 00:17:07.418 ************************************ 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test 
raid1 3 true 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local 
raid_bdev_name=Existed_Raid 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2750579 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2750579' 00:17:07.418 Process raid pid: 2750579 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2750579 /var/tmp/spdk-raid.sock 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2750579 ']' 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:17:07.418 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:07.418 22:45:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:07.676 [2024-07-15 22:45:52.377404] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:17:07.676 [2024-07-15 22:45:52.377463] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:07.676 [2024-07-15 22:45:52.492353] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:07.934 [2024-07-15 22:45:52.599667] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:07.934 [2024-07-15 22:45:52.659548] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:07.934 [2024-07-15 22:45:52.659579] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:08.501 22:45:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:08.501 22:45:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:17:08.501 22:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:08.760 [2024-07-15 22:45:53.474193] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:08.760 [2024-07-15 22:45:53.474241] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:08.760 [2024-07-15 22:45:53.474252] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: BaseBdev2 00:17:08.760 [2024-07-15 22:45:53.474264] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:08.760 [2024-07-15 22:45:53.474273] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:08.760 [2024-07-15 22:45:53.474284] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:08.760 22:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:08.760 22:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:08.760 22:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:08.760 22:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:08.760 22:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:08.760 22:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:08.760 22:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:08.760 22:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:08.760 22:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:08.760 22:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:08.760 22:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:08.760 22:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.018 22:45:53 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:09.018 "name": "Existed_Raid", 00:17:09.018 "uuid": "26a44786-c4d6-485b-b0a8-d17e74a5b8ea", 00:17:09.018 "strip_size_kb": 0, 00:17:09.018 "state": "configuring", 00:17:09.018 "raid_level": "raid1", 00:17:09.018 "superblock": true, 00:17:09.018 "num_base_bdevs": 3, 00:17:09.018 "num_base_bdevs_discovered": 0, 00:17:09.018 "num_base_bdevs_operational": 3, 00:17:09.018 "base_bdevs_list": [ 00:17:09.018 { 00:17:09.018 "name": "BaseBdev1", 00:17:09.018 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:09.018 "is_configured": false, 00:17:09.018 "data_offset": 0, 00:17:09.018 "data_size": 0 00:17:09.018 }, 00:17:09.018 { 00:17:09.018 "name": "BaseBdev2", 00:17:09.018 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:09.018 "is_configured": false, 00:17:09.018 "data_offset": 0, 00:17:09.018 "data_size": 0 00:17:09.018 }, 00:17:09.018 { 00:17:09.018 "name": "BaseBdev3", 00:17:09.018 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:09.018 "is_configured": false, 00:17:09.018 "data_offset": 0, 00:17:09.018 "data_size": 0 00:17:09.018 } 00:17:09.018 ] 00:17:09.018 }' 00:17:09.018 22:45:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:09.018 22:45:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:09.605 22:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:09.863 [2024-07-15 22:45:54.564918] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:09.863 [2024-07-15 22:45:54.564955] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23eaa80 name Existed_Raid, state configuring 00:17:09.863 22:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:10.122 [2024-07-15 22:45:54.813601] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:10.122 [2024-07-15 22:45:54.813640] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:10.122 [2024-07-15 22:45:54.813649] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:10.122 [2024-07-15 22:45:54.813661] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:10.122 [2024-07-15 22:45:54.813670] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:10.122 [2024-07-15 22:45:54.813681] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:10.122 22:45:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:10.380 [2024-07-15 22:45:55.068077] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:10.380 BaseBdev1 00:17:10.380 22:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:10.380 22:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:10.380 22:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:10.380 22:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:10.380 22:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:10.380 22:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:17:10.380 22:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:10.638 22:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:10.895 [ 00:17:10.895 { 00:17:10.895 "name": "BaseBdev1", 00:17:10.895 "aliases": [ 00:17:10.895 "ba0cbafa-964f-4fa8-af02-ac2d7548b76f" 00:17:10.895 ], 00:17:10.895 "product_name": "Malloc disk", 00:17:10.895 "block_size": 512, 00:17:10.895 "num_blocks": 65536, 00:17:10.895 "uuid": "ba0cbafa-964f-4fa8-af02-ac2d7548b76f", 00:17:10.895 "assigned_rate_limits": { 00:17:10.895 "rw_ios_per_sec": 0, 00:17:10.895 "rw_mbytes_per_sec": 0, 00:17:10.895 "r_mbytes_per_sec": 0, 00:17:10.895 "w_mbytes_per_sec": 0 00:17:10.895 }, 00:17:10.895 "claimed": true, 00:17:10.895 "claim_type": "exclusive_write", 00:17:10.895 "zoned": false, 00:17:10.895 "supported_io_types": { 00:17:10.895 "read": true, 00:17:10.895 "write": true, 00:17:10.895 "unmap": true, 00:17:10.895 "flush": true, 00:17:10.895 "reset": true, 00:17:10.895 "nvme_admin": false, 00:17:10.895 "nvme_io": false, 00:17:10.895 "nvme_io_md": false, 00:17:10.895 "write_zeroes": true, 00:17:10.895 "zcopy": true, 00:17:10.895 "get_zone_info": false, 00:17:10.895 "zone_management": false, 00:17:10.895 "zone_append": false, 00:17:10.895 "compare": false, 00:17:10.895 "compare_and_write": false, 00:17:10.895 "abort": true, 00:17:10.895 "seek_hole": false, 00:17:10.895 "seek_data": false, 00:17:10.895 "copy": true, 00:17:10.895 "nvme_iov_md": false 00:17:10.895 }, 00:17:10.895 "memory_domains": [ 00:17:10.895 { 00:17:10.896 "dma_device_id": "system", 00:17:10.896 "dma_device_type": 1 00:17:10.896 }, 00:17:10.896 { 00:17:10.896 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:10.896 
"dma_device_type": 2 00:17:10.896 } 00:17:10.896 ], 00:17:10.896 "driver_specific": {} 00:17:10.896 } 00:17:10.896 ] 00:17:10.896 22:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:10.896 22:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:10.896 22:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:10.896 22:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:10.896 22:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:10.896 22:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:10.896 22:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:10.896 22:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:10.896 22:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:10.896 22:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:10.896 22:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:10.896 22:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:10.896 22:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:11.154 22:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:11.154 "name": "Existed_Raid", 00:17:11.154 "uuid": "0818637d-a13b-476b-9cc9-d10f0c62c97b", 00:17:11.154 "strip_size_kb": 0, 
00:17:11.154 "state": "configuring", 00:17:11.154 "raid_level": "raid1", 00:17:11.154 "superblock": true, 00:17:11.154 "num_base_bdevs": 3, 00:17:11.154 "num_base_bdevs_discovered": 1, 00:17:11.154 "num_base_bdevs_operational": 3, 00:17:11.154 "base_bdevs_list": [ 00:17:11.154 { 00:17:11.154 "name": "BaseBdev1", 00:17:11.154 "uuid": "ba0cbafa-964f-4fa8-af02-ac2d7548b76f", 00:17:11.154 "is_configured": true, 00:17:11.154 "data_offset": 2048, 00:17:11.154 "data_size": 63488 00:17:11.154 }, 00:17:11.154 { 00:17:11.154 "name": "BaseBdev2", 00:17:11.154 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:11.154 "is_configured": false, 00:17:11.154 "data_offset": 0, 00:17:11.154 "data_size": 0 00:17:11.154 }, 00:17:11.154 { 00:17:11.154 "name": "BaseBdev3", 00:17:11.154 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:11.154 "is_configured": false, 00:17:11.154 "data_offset": 0, 00:17:11.154 "data_size": 0 00:17:11.154 } 00:17:11.154 ] 00:17:11.154 }' 00:17:11.154 22:45:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:11.154 22:45:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:11.721 22:45:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:11.721 [2024-07-15 22:45:56.580098] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:11.721 [2024-07-15 22:45:56.580141] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23ea310 name Existed_Raid, state configuring 00:17:11.721 22:45:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:11.980 [2024-07-15 22:45:56.756607] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:11.980 [2024-07-15 22:45:56.758055] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:11.980 [2024-07-15 22:45:56.758086] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:11.980 [2024-07-15 22:45:56.758096] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:11.980 [2024-07-15 22:45:56.758108] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:11.980 22:45:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:11.980 22:45:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:11.980 22:45:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:11.980 22:45:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:11.980 22:45:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:11.980 22:45:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:11.980 22:45:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:11.980 22:45:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:11.980 22:45:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:11.980 22:45:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:11.980 22:45:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:11.980 22:45:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:17:11.980 22:45:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:11.980 22:45:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:12.239 22:45:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:12.239 "name": "Existed_Raid", 00:17:12.239 "uuid": "a4fd4e9d-8d61-4222-a4af-cf13c5c4413e", 00:17:12.239 "strip_size_kb": 0, 00:17:12.239 "state": "configuring", 00:17:12.239 "raid_level": "raid1", 00:17:12.239 "superblock": true, 00:17:12.239 "num_base_bdevs": 3, 00:17:12.239 "num_base_bdevs_discovered": 1, 00:17:12.239 "num_base_bdevs_operational": 3, 00:17:12.239 "base_bdevs_list": [ 00:17:12.239 { 00:17:12.239 "name": "BaseBdev1", 00:17:12.239 "uuid": "ba0cbafa-964f-4fa8-af02-ac2d7548b76f", 00:17:12.239 "is_configured": true, 00:17:12.239 "data_offset": 2048, 00:17:12.239 "data_size": 63488 00:17:12.239 }, 00:17:12.239 { 00:17:12.239 "name": "BaseBdev2", 00:17:12.239 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:12.239 "is_configured": false, 00:17:12.239 "data_offset": 0, 00:17:12.239 "data_size": 0 00:17:12.239 }, 00:17:12.239 { 00:17:12.239 "name": "BaseBdev3", 00:17:12.239 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:12.239 "is_configured": false, 00:17:12.239 "data_offset": 0, 00:17:12.239 "data_size": 0 00:17:12.239 } 00:17:12.239 ] 00:17:12.239 }' 00:17:12.239 22:45:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:12.239 22:45:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:12.807 22:45:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:13.065 
[2024-07-15 22:45:57.802783] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:13.065 BaseBdev2 00:17:13.065 22:45:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:13.065 22:45:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:13.065 22:45:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:13.065 22:45:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:13.065 22:45:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:13.065 22:45:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:13.065 22:45:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:13.324 22:45:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:13.583 [ 00:17:13.583 { 00:17:13.583 "name": "BaseBdev2", 00:17:13.583 "aliases": [ 00:17:13.583 "e555151e-5d64-45f1-a522-c54c689dba55" 00:17:13.583 ], 00:17:13.583 "product_name": "Malloc disk", 00:17:13.583 "block_size": 512, 00:17:13.583 "num_blocks": 65536, 00:17:13.583 "uuid": "e555151e-5d64-45f1-a522-c54c689dba55", 00:17:13.583 "assigned_rate_limits": { 00:17:13.583 "rw_ios_per_sec": 0, 00:17:13.583 "rw_mbytes_per_sec": 0, 00:17:13.583 "r_mbytes_per_sec": 0, 00:17:13.583 "w_mbytes_per_sec": 0 00:17:13.583 }, 00:17:13.583 "claimed": true, 00:17:13.583 "claim_type": "exclusive_write", 00:17:13.583 "zoned": false, 00:17:13.583 "supported_io_types": { 00:17:13.583 "read": true, 00:17:13.583 "write": true, 00:17:13.583 "unmap": 
true, 00:17:13.583 "flush": true, 00:17:13.583 "reset": true, 00:17:13.583 "nvme_admin": false, 00:17:13.583 "nvme_io": false, 00:17:13.583 "nvme_io_md": false, 00:17:13.583 "write_zeroes": true, 00:17:13.583 "zcopy": true, 00:17:13.583 "get_zone_info": false, 00:17:13.583 "zone_management": false, 00:17:13.583 "zone_append": false, 00:17:13.583 "compare": false, 00:17:13.583 "compare_and_write": false, 00:17:13.583 "abort": true, 00:17:13.583 "seek_hole": false, 00:17:13.583 "seek_data": false, 00:17:13.583 "copy": true, 00:17:13.583 "nvme_iov_md": false 00:17:13.583 }, 00:17:13.583 "memory_domains": [ 00:17:13.583 { 00:17:13.583 "dma_device_id": "system", 00:17:13.583 "dma_device_type": 1 00:17:13.583 }, 00:17:13.583 { 00:17:13.583 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:13.583 "dma_device_type": 2 00:17:13.583 } 00:17:13.583 ], 00:17:13.583 "driver_specific": {} 00:17:13.583 } 00:17:13.583 ] 00:17:13.583 22:45:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:13.583 22:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:13.583 22:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:13.583 22:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:13.583 22:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:13.583 22:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:13.583 22:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:13.583 22:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:13.583 22:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:13.583 
22:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:13.583 22:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:13.583 22:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:13.583 22:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:13.583 22:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:13.583 22:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:13.842 22:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:13.842 "name": "Existed_Raid", 00:17:13.842 "uuid": "a4fd4e9d-8d61-4222-a4af-cf13c5c4413e", 00:17:13.842 "strip_size_kb": 0, 00:17:13.842 "state": "configuring", 00:17:13.842 "raid_level": "raid1", 00:17:13.842 "superblock": true, 00:17:13.842 "num_base_bdevs": 3, 00:17:13.842 "num_base_bdevs_discovered": 2, 00:17:13.842 "num_base_bdevs_operational": 3, 00:17:13.842 "base_bdevs_list": [ 00:17:13.842 { 00:17:13.842 "name": "BaseBdev1", 00:17:13.842 "uuid": "ba0cbafa-964f-4fa8-af02-ac2d7548b76f", 00:17:13.842 "is_configured": true, 00:17:13.842 "data_offset": 2048, 00:17:13.842 "data_size": 63488 00:17:13.842 }, 00:17:13.842 { 00:17:13.842 "name": "BaseBdev2", 00:17:13.842 "uuid": "e555151e-5d64-45f1-a522-c54c689dba55", 00:17:13.842 "is_configured": true, 00:17:13.842 "data_offset": 2048, 00:17:13.842 "data_size": 63488 00:17:13.842 }, 00:17:13.842 { 00:17:13.842 "name": "BaseBdev3", 00:17:13.842 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:13.842 "is_configured": false, 00:17:13.842 "data_offset": 0, 00:17:13.842 "data_size": 0 00:17:13.842 } 00:17:13.842 ] 00:17:13.842 }' 00:17:13.842 
22:45:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:13.842 22:45:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:14.410 22:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:14.669 [2024-07-15 22:45:59.358431] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:14.669 [2024-07-15 22:45:59.358600] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x23eb400 00:17:14.669 [2024-07-15 22:45:59.358615] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:14.669 [2024-07-15 22:45:59.358789] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23eaef0 00:17:14.669 [2024-07-15 22:45:59.358909] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23eb400 00:17:14.669 [2024-07-15 22:45:59.358919] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x23eb400 00:17:14.669 [2024-07-15 22:45:59.359027] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:14.669 BaseBdev3 00:17:14.669 22:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:14.669 22:45:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:14.669 22:45:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:14.669 22:45:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:14.669 22:45:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:14.669 22:45:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:17:14.669 22:45:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:14.928 22:45:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:15.187 [ 00:17:15.187 { 00:17:15.187 "name": "BaseBdev3", 00:17:15.187 "aliases": [ 00:17:15.187 "11bec769-5b9a-4602-8262-8f66138957fa" 00:17:15.187 ], 00:17:15.187 "product_name": "Malloc disk", 00:17:15.187 "block_size": 512, 00:17:15.187 "num_blocks": 65536, 00:17:15.187 "uuid": "11bec769-5b9a-4602-8262-8f66138957fa", 00:17:15.187 "assigned_rate_limits": { 00:17:15.187 "rw_ios_per_sec": 0, 00:17:15.187 "rw_mbytes_per_sec": 0, 00:17:15.187 "r_mbytes_per_sec": 0, 00:17:15.187 "w_mbytes_per_sec": 0 00:17:15.187 }, 00:17:15.187 "claimed": true, 00:17:15.187 "claim_type": "exclusive_write", 00:17:15.187 "zoned": false, 00:17:15.187 "supported_io_types": { 00:17:15.187 "read": true, 00:17:15.187 "write": true, 00:17:15.187 "unmap": true, 00:17:15.187 "flush": true, 00:17:15.187 "reset": true, 00:17:15.187 "nvme_admin": false, 00:17:15.187 "nvme_io": false, 00:17:15.187 "nvme_io_md": false, 00:17:15.187 "write_zeroes": true, 00:17:15.187 "zcopy": true, 00:17:15.187 "get_zone_info": false, 00:17:15.187 "zone_management": false, 00:17:15.187 "zone_append": false, 00:17:15.187 "compare": false, 00:17:15.187 "compare_and_write": false, 00:17:15.187 "abort": true, 00:17:15.187 "seek_hole": false, 00:17:15.187 "seek_data": false, 00:17:15.187 "copy": true, 00:17:15.187 "nvme_iov_md": false 00:17:15.187 }, 00:17:15.187 "memory_domains": [ 00:17:15.187 { 00:17:15.187 "dma_device_id": "system", 00:17:15.187 "dma_device_type": 1 00:17:15.187 }, 00:17:15.187 { 00:17:15.187 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:15.187 
"dma_device_type": 2 00:17:15.187 } 00:17:15.187 ], 00:17:15.187 "driver_specific": {} 00:17:15.187 } 00:17:15.187 ] 00:17:15.187 22:45:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:15.187 22:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:15.187 22:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:15.187 22:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:15.187 22:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:15.187 22:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:15.187 22:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:15.187 22:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:15.187 22:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:15.187 22:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:15.187 22:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:15.187 22:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:15.187 22:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:15.187 22:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.187 22:45:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:15.447 22:46:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:15.447 "name": "Existed_Raid", 00:17:15.447 "uuid": "a4fd4e9d-8d61-4222-a4af-cf13c5c4413e", 00:17:15.447 "strip_size_kb": 0, 00:17:15.447 "state": "online", 00:17:15.447 "raid_level": "raid1", 00:17:15.447 "superblock": true, 00:17:15.447 "num_base_bdevs": 3, 00:17:15.447 "num_base_bdevs_discovered": 3, 00:17:15.447 "num_base_bdevs_operational": 3, 00:17:15.447 "base_bdevs_list": [ 00:17:15.447 { 00:17:15.447 "name": "BaseBdev1", 00:17:15.447 "uuid": "ba0cbafa-964f-4fa8-af02-ac2d7548b76f", 00:17:15.447 "is_configured": true, 00:17:15.447 "data_offset": 2048, 00:17:15.447 "data_size": 63488 00:17:15.447 }, 00:17:15.447 { 00:17:15.447 "name": "BaseBdev2", 00:17:15.447 "uuid": "e555151e-5d64-45f1-a522-c54c689dba55", 00:17:15.447 "is_configured": true, 00:17:15.447 "data_offset": 2048, 00:17:15.447 "data_size": 63488 00:17:15.447 }, 00:17:15.447 { 00:17:15.447 "name": "BaseBdev3", 00:17:15.447 "uuid": "11bec769-5b9a-4602-8262-8f66138957fa", 00:17:15.447 "is_configured": true, 00:17:15.447 "data_offset": 2048, 00:17:15.447 "data_size": 63488 00:17:15.447 } 00:17:15.447 ] 00:17:15.447 }' 00:17:15.447 22:46:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:15.447 22:46:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:16.014 22:46:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:16.014 22:46:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:16.014 22:46:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:16.014 22:46:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:16.014 22:46:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:17:16.014 22:46:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:16.014 22:46:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:16.014 22:46:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:16.273 [2024-07-15 22:46:00.958991] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:16.273 22:46:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:16.273 "name": "Existed_Raid", 00:17:16.273 "aliases": [ 00:17:16.273 "a4fd4e9d-8d61-4222-a4af-cf13c5c4413e" 00:17:16.273 ], 00:17:16.273 "product_name": "Raid Volume", 00:17:16.273 "block_size": 512, 00:17:16.273 "num_blocks": 63488, 00:17:16.273 "uuid": "a4fd4e9d-8d61-4222-a4af-cf13c5c4413e", 00:17:16.273 "assigned_rate_limits": { 00:17:16.273 "rw_ios_per_sec": 0, 00:17:16.273 "rw_mbytes_per_sec": 0, 00:17:16.273 "r_mbytes_per_sec": 0, 00:17:16.273 "w_mbytes_per_sec": 0 00:17:16.273 }, 00:17:16.273 "claimed": false, 00:17:16.273 "zoned": false, 00:17:16.273 "supported_io_types": { 00:17:16.273 "read": true, 00:17:16.273 "write": true, 00:17:16.273 "unmap": false, 00:17:16.273 "flush": false, 00:17:16.273 "reset": true, 00:17:16.273 "nvme_admin": false, 00:17:16.273 "nvme_io": false, 00:17:16.273 "nvme_io_md": false, 00:17:16.273 "write_zeroes": true, 00:17:16.273 "zcopy": false, 00:17:16.273 "get_zone_info": false, 00:17:16.273 "zone_management": false, 00:17:16.273 "zone_append": false, 00:17:16.273 "compare": false, 00:17:16.273 "compare_and_write": false, 00:17:16.273 "abort": false, 00:17:16.273 "seek_hole": false, 00:17:16.273 "seek_data": false, 00:17:16.273 "copy": false, 00:17:16.273 "nvme_iov_md": false 00:17:16.273 }, 00:17:16.273 "memory_domains": [ 00:17:16.273 { 00:17:16.273 "dma_device_id": "system", 00:17:16.273 
"dma_device_type": 1 00:17:16.273 }, 00:17:16.273 { 00:17:16.273 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:16.273 "dma_device_type": 2 00:17:16.273 }, 00:17:16.273 { 00:17:16.273 "dma_device_id": "system", 00:17:16.273 "dma_device_type": 1 00:17:16.273 }, 00:17:16.273 { 00:17:16.273 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:16.273 "dma_device_type": 2 00:17:16.273 }, 00:17:16.273 { 00:17:16.273 "dma_device_id": "system", 00:17:16.273 "dma_device_type": 1 00:17:16.273 }, 00:17:16.273 { 00:17:16.273 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:16.273 "dma_device_type": 2 00:17:16.273 } 00:17:16.273 ], 00:17:16.273 "driver_specific": { 00:17:16.273 "raid": { 00:17:16.273 "uuid": "a4fd4e9d-8d61-4222-a4af-cf13c5c4413e", 00:17:16.273 "strip_size_kb": 0, 00:17:16.273 "state": "online", 00:17:16.273 "raid_level": "raid1", 00:17:16.273 "superblock": true, 00:17:16.273 "num_base_bdevs": 3, 00:17:16.273 "num_base_bdevs_discovered": 3, 00:17:16.273 "num_base_bdevs_operational": 3, 00:17:16.273 "base_bdevs_list": [ 00:17:16.273 { 00:17:16.273 "name": "BaseBdev1", 00:17:16.273 "uuid": "ba0cbafa-964f-4fa8-af02-ac2d7548b76f", 00:17:16.273 "is_configured": true, 00:17:16.273 "data_offset": 2048, 00:17:16.273 "data_size": 63488 00:17:16.273 }, 00:17:16.273 { 00:17:16.273 "name": "BaseBdev2", 00:17:16.273 "uuid": "e555151e-5d64-45f1-a522-c54c689dba55", 00:17:16.273 "is_configured": true, 00:17:16.273 "data_offset": 2048, 00:17:16.273 "data_size": 63488 00:17:16.273 }, 00:17:16.273 { 00:17:16.273 "name": "BaseBdev3", 00:17:16.273 "uuid": "11bec769-5b9a-4602-8262-8f66138957fa", 00:17:16.273 "is_configured": true, 00:17:16.273 "data_offset": 2048, 00:17:16.273 "data_size": 63488 00:17:16.273 } 00:17:16.273 ] 00:17:16.273 } 00:17:16.273 } 00:17:16.273 }' 00:17:16.273 22:46:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:16.273 22:46:01 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:16.273 BaseBdev2 00:17:16.273 BaseBdev3' 00:17:16.273 22:46:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:16.273 22:46:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:16.273 22:46:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:16.532 22:46:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:16.532 "name": "BaseBdev1", 00:17:16.532 "aliases": [ 00:17:16.532 "ba0cbafa-964f-4fa8-af02-ac2d7548b76f" 00:17:16.532 ], 00:17:16.532 "product_name": "Malloc disk", 00:17:16.532 "block_size": 512, 00:17:16.532 "num_blocks": 65536, 00:17:16.532 "uuid": "ba0cbafa-964f-4fa8-af02-ac2d7548b76f", 00:17:16.532 "assigned_rate_limits": { 00:17:16.532 "rw_ios_per_sec": 0, 00:17:16.532 "rw_mbytes_per_sec": 0, 00:17:16.532 "r_mbytes_per_sec": 0, 00:17:16.532 "w_mbytes_per_sec": 0 00:17:16.532 }, 00:17:16.532 "claimed": true, 00:17:16.532 "claim_type": "exclusive_write", 00:17:16.532 "zoned": false, 00:17:16.532 "supported_io_types": { 00:17:16.532 "read": true, 00:17:16.532 "write": true, 00:17:16.532 "unmap": true, 00:17:16.532 "flush": true, 00:17:16.532 "reset": true, 00:17:16.532 "nvme_admin": false, 00:17:16.532 "nvme_io": false, 00:17:16.532 "nvme_io_md": false, 00:17:16.532 "write_zeroes": true, 00:17:16.532 "zcopy": true, 00:17:16.532 "get_zone_info": false, 00:17:16.532 "zone_management": false, 00:17:16.532 "zone_append": false, 00:17:16.532 "compare": false, 00:17:16.532 "compare_and_write": false, 00:17:16.532 "abort": true, 00:17:16.532 "seek_hole": false, 00:17:16.532 "seek_data": false, 00:17:16.532 "copy": true, 00:17:16.532 "nvme_iov_md": false 00:17:16.532 }, 00:17:16.532 "memory_domains": 
[ 00:17:16.532 { 00:17:16.532 "dma_device_id": "system", 00:17:16.532 "dma_device_type": 1 00:17:16.532 }, 00:17:16.532 { 00:17:16.532 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:16.532 "dma_device_type": 2 00:17:16.532 } 00:17:16.532 ], 00:17:16.532 "driver_specific": {} 00:17:16.532 }' 00:17:16.532 22:46:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:16.532 22:46:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:16.532 22:46:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:16.532 22:46:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:16.532 22:46:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:16.791 22:46:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:16.791 22:46:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:16.791 22:46:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:16.791 22:46:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:16.791 22:46:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:16.791 22:46:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:16.791 22:46:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:16.791 22:46:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:16.791 22:46:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:16.791 22:46:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:17:17.050 22:46:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:17.050 "name": "BaseBdev2", 00:17:17.050 "aliases": [ 00:17:17.050 "e555151e-5d64-45f1-a522-c54c689dba55" 00:17:17.050 ], 00:17:17.050 "product_name": "Malloc disk", 00:17:17.050 "block_size": 512, 00:17:17.050 "num_blocks": 65536, 00:17:17.050 "uuid": "e555151e-5d64-45f1-a522-c54c689dba55", 00:17:17.050 "assigned_rate_limits": { 00:17:17.050 "rw_ios_per_sec": 0, 00:17:17.050 "rw_mbytes_per_sec": 0, 00:17:17.050 "r_mbytes_per_sec": 0, 00:17:17.050 "w_mbytes_per_sec": 0 00:17:17.050 }, 00:17:17.050 "claimed": true, 00:17:17.050 "claim_type": "exclusive_write", 00:17:17.050 "zoned": false, 00:17:17.050 "supported_io_types": { 00:17:17.050 "read": true, 00:17:17.050 "write": true, 00:17:17.050 "unmap": true, 00:17:17.050 "flush": true, 00:17:17.050 "reset": true, 00:17:17.050 "nvme_admin": false, 00:17:17.050 "nvme_io": false, 00:17:17.050 "nvme_io_md": false, 00:17:17.050 "write_zeroes": true, 00:17:17.050 "zcopy": true, 00:17:17.050 "get_zone_info": false, 00:17:17.050 "zone_management": false, 00:17:17.050 "zone_append": false, 00:17:17.050 "compare": false, 00:17:17.050 "compare_and_write": false, 00:17:17.050 "abort": true, 00:17:17.050 "seek_hole": false, 00:17:17.050 "seek_data": false, 00:17:17.050 "copy": true, 00:17:17.050 "nvme_iov_md": false 00:17:17.050 }, 00:17:17.051 "memory_domains": [ 00:17:17.051 { 00:17:17.051 "dma_device_id": "system", 00:17:17.051 "dma_device_type": 1 00:17:17.051 }, 00:17:17.051 { 00:17:17.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:17.051 "dma_device_type": 2 00:17:17.051 } 00:17:17.051 ], 00:17:17.051 "driver_specific": {} 00:17:17.051 }' 00:17:17.051 22:46:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:17.051 22:46:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:17.309 22:46:01 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:17.309 22:46:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:17.309 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:17.309 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:17.309 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:17.309 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:17.309 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:17.309 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:17.309 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:17.567 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:17.567 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:17.567 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:17.567 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:17.826 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:17.826 "name": "BaseBdev3", 00:17:17.826 "aliases": [ 00:17:17.826 "11bec769-5b9a-4602-8262-8f66138957fa" 00:17:17.826 ], 00:17:17.826 "product_name": "Malloc disk", 00:17:17.826 "block_size": 512, 00:17:17.826 "num_blocks": 65536, 00:17:17.826 "uuid": "11bec769-5b9a-4602-8262-8f66138957fa", 00:17:17.826 "assigned_rate_limits": { 00:17:17.826 "rw_ios_per_sec": 0, 00:17:17.826 "rw_mbytes_per_sec": 0, 00:17:17.826 "r_mbytes_per_sec": 0, 00:17:17.826 
"w_mbytes_per_sec": 0 00:17:17.826 }, 00:17:17.826 "claimed": true, 00:17:17.826 "claim_type": "exclusive_write", 00:17:17.826 "zoned": false, 00:17:17.826 "supported_io_types": { 00:17:17.826 "read": true, 00:17:17.826 "write": true, 00:17:17.826 "unmap": true, 00:17:17.826 "flush": true, 00:17:17.826 "reset": true, 00:17:17.826 "nvme_admin": false, 00:17:17.826 "nvme_io": false, 00:17:17.826 "nvme_io_md": false, 00:17:17.826 "write_zeroes": true, 00:17:17.826 "zcopy": true, 00:17:17.826 "get_zone_info": false, 00:17:17.826 "zone_management": false, 00:17:17.826 "zone_append": false, 00:17:17.826 "compare": false, 00:17:17.826 "compare_and_write": false, 00:17:17.826 "abort": true, 00:17:17.826 "seek_hole": false, 00:17:17.826 "seek_data": false, 00:17:17.827 "copy": true, 00:17:17.827 "nvme_iov_md": false 00:17:17.827 }, 00:17:17.827 "memory_domains": [ 00:17:17.827 { 00:17:17.827 "dma_device_id": "system", 00:17:17.827 "dma_device_type": 1 00:17:17.827 }, 00:17:17.827 { 00:17:17.827 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:17.827 "dma_device_type": 2 00:17:17.827 } 00:17:17.827 ], 00:17:17.827 "driver_specific": {} 00:17:17.827 }' 00:17:17.827 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:17.827 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:17.827 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:17.827 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:17.827 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:17.827 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:17.827 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:18.085 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:17:18.085 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:18.085 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:18.085 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:18.085 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:18.085 22:46:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:18.344 [2024-07-15 22:46:03.088408] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:18.344 22:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:18.344 22:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:17:18.344 22:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:18.344 22:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:17:18.344 22:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:17:18.344 22:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:17:18.344 22:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:18.344 22:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:18.344 22:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:18.344 22:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:18.344 22:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:17:18.344 22:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:18.344 22:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:18.344 22:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:18.344 22:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:18.344 22:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:18.344 22:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:18.602 22:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:18.602 "name": "Existed_Raid", 00:17:18.602 "uuid": "a4fd4e9d-8d61-4222-a4af-cf13c5c4413e", 00:17:18.602 "strip_size_kb": 0, 00:17:18.602 "state": "online", 00:17:18.602 "raid_level": "raid1", 00:17:18.602 "superblock": true, 00:17:18.602 "num_base_bdevs": 3, 00:17:18.602 "num_base_bdevs_discovered": 2, 00:17:18.602 "num_base_bdevs_operational": 2, 00:17:18.602 "base_bdevs_list": [ 00:17:18.602 { 00:17:18.602 "name": null, 00:17:18.602 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:18.602 "is_configured": false, 00:17:18.602 "data_offset": 2048, 00:17:18.602 "data_size": 63488 00:17:18.602 }, 00:17:18.602 { 00:17:18.602 "name": "BaseBdev2", 00:17:18.602 "uuid": "e555151e-5d64-45f1-a522-c54c689dba55", 00:17:18.602 "is_configured": true, 00:17:18.602 "data_offset": 2048, 00:17:18.602 "data_size": 63488 00:17:18.602 }, 00:17:18.602 { 00:17:18.602 "name": "BaseBdev3", 00:17:18.602 "uuid": "11bec769-5b9a-4602-8262-8f66138957fa", 00:17:18.602 "is_configured": true, 00:17:18.602 "data_offset": 2048, 00:17:18.602 "data_size": 63488 00:17:18.602 } 
00:17:18.602 ] 00:17:18.602 }' 00:17:18.602 22:46:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:18.602 22:46:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:19.168 22:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:19.168 22:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:19.168 22:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:19.168 22:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:19.426 22:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:19.426 22:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:19.427 22:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:19.685 [2024-07-15 22:46:04.486112] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:19.685 22:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:19.685 22:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:19.685 22:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:19.685 22:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:19.944 22:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:19.944 22:46:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:19.944 22:46:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:20.202 [2024-07-15 22:46:04.983597] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:20.202 [2024-07-15 22:46:04.983700] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:20.202 [2024-07-15 22:46:04.996257] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:20.202 [2024-07-15 22:46:04.996295] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:20.202 [2024-07-15 22:46:04.996308] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23eb400 name Existed_Raid, state offline 00:17:20.202 22:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:20.202 22:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:20.202 22:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:20.203 22:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:20.461 22:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:20.461 22:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:20.461 22:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:17:20.461 22:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:20.461 22:46:05 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:20.461 22:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:20.720 BaseBdev2 00:17:20.720 22:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:20.720 22:46:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:20.720 22:46:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:20.720 22:46:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:20.720 22:46:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:20.720 22:46:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:20.720 22:46:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:20.978 22:46:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:21.238 [ 00:17:21.238 { 00:17:21.238 "name": "BaseBdev2", 00:17:21.238 "aliases": [ 00:17:21.238 "d86c239d-8d24-49d8-83b8-7a1fc75b21d3" 00:17:21.238 ], 00:17:21.238 "product_name": "Malloc disk", 00:17:21.238 "block_size": 512, 00:17:21.238 "num_blocks": 65536, 00:17:21.238 "uuid": "d86c239d-8d24-49d8-83b8-7a1fc75b21d3", 00:17:21.238 "assigned_rate_limits": { 00:17:21.238 "rw_ios_per_sec": 0, 00:17:21.238 "rw_mbytes_per_sec": 0, 00:17:21.238 "r_mbytes_per_sec": 0, 00:17:21.238 "w_mbytes_per_sec": 0 00:17:21.238 }, 00:17:21.238 "claimed": false, 00:17:21.238 "zoned": false, 
00:17:21.238 "supported_io_types": { 00:17:21.238 "read": true, 00:17:21.238 "write": true, 00:17:21.238 "unmap": true, 00:17:21.238 "flush": true, 00:17:21.238 "reset": true, 00:17:21.238 "nvme_admin": false, 00:17:21.238 "nvme_io": false, 00:17:21.238 "nvme_io_md": false, 00:17:21.238 "write_zeroes": true, 00:17:21.238 "zcopy": true, 00:17:21.238 "get_zone_info": false, 00:17:21.238 "zone_management": false, 00:17:21.238 "zone_append": false, 00:17:21.238 "compare": false, 00:17:21.238 "compare_and_write": false, 00:17:21.238 "abort": true, 00:17:21.238 "seek_hole": false, 00:17:21.238 "seek_data": false, 00:17:21.238 "copy": true, 00:17:21.238 "nvme_iov_md": false 00:17:21.238 }, 00:17:21.238 "memory_domains": [ 00:17:21.238 { 00:17:21.238 "dma_device_id": "system", 00:17:21.238 "dma_device_type": 1 00:17:21.238 }, 00:17:21.238 { 00:17:21.238 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:21.238 "dma_device_type": 2 00:17:21.238 } 00:17:21.238 ], 00:17:21.238 "driver_specific": {} 00:17:21.238 } 00:17:21.238 ] 00:17:21.238 22:46:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:21.238 22:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:21.238 22:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:21.238 22:46:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:21.497 BaseBdev3 00:17:21.497 22:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:21.497 22:46:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:21.497 22:46:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:21.497 22:46:06 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:21.497 22:46:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:21.497 22:46:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:21.497 22:46:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:21.756 22:46:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:22.015 [ 00:17:22.015 { 00:17:22.015 "name": "BaseBdev3", 00:17:22.015 "aliases": [ 00:17:22.015 "9c079844-56a9-49f5-8ba4-17feadb9d9e8" 00:17:22.015 ], 00:17:22.015 "product_name": "Malloc disk", 00:17:22.015 "block_size": 512, 00:17:22.015 "num_blocks": 65536, 00:17:22.015 "uuid": "9c079844-56a9-49f5-8ba4-17feadb9d9e8", 00:17:22.015 "assigned_rate_limits": { 00:17:22.015 "rw_ios_per_sec": 0, 00:17:22.015 "rw_mbytes_per_sec": 0, 00:17:22.015 "r_mbytes_per_sec": 0, 00:17:22.015 "w_mbytes_per_sec": 0 00:17:22.015 }, 00:17:22.015 "claimed": false, 00:17:22.015 "zoned": false, 00:17:22.015 "supported_io_types": { 00:17:22.015 "read": true, 00:17:22.015 "write": true, 00:17:22.015 "unmap": true, 00:17:22.015 "flush": true, 00:17:22.015 "reset": true, 00:17:22.015 "nvme_admin": false, 00:17:22.015 "nvme_io": false, 00:17:22.015 "nvme_io_md": false, 00:17:22.015 "write_zeroes": true, 00:17:22.015 "zcopy": true, 00:17:22.015 "get_zone_info": false, 00:17:22.015 "zone_management": false, 00:17:22.015 "zone_append": false, 00:17:22.015 "compare": false, 00:17:22.015 "compare_and_write": false, 00:17:22.015 "abort": true, 00:17:22.015 "seek_hole": false, 00:17:22.015 "seek_data": false, 00:17:22.015 "copy": true, 00:17:22.015 "nvme_iov_md": 
false 00:17:22.015 }, 00:17:22.016 "memory_domains": [ 00:17:22.016 { 00:17:22.016 "dma_device_id": "system", 00:17:22.016 "dma_device_type": 1 00:17:22.016 }, 00:17:22.016 { 00:17:22.016 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:22.016 "dma_device_type": 2 00:17:22.016 } 00:17:22.016 ], 00:17:22.016 "driver_specific": {} 00:17:22.016 } 00:17:22.016 ] 00:17:22.016 22:46:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:22.016 22:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:22.016 22:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:22.016 22:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:22.275 [2024-07-15 22:46:06.951243] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:22.275 [2024-07-15 22:46:06.951286] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:22.275 [2024-07-15 22:46:06.951305] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:22.275 [2024-07-15 22:46:06.952682] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:22.275 22:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:22.275 22:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:22.275 22:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:22.275 22:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:22.275 22:46:06 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:22.276 22:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:22.276 22:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:22.276 22:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:22.276 22:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:22.276 22:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:22.276 22:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:22.276 22:46:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:22.276 22:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:22.276 "name": "Existed_Raid", 00:17:22.276 "uuid": "b279d061-518a-4c94-a025-7bda557b20ba", 00:17:22.276 "strip_size_kb": 0, 00:17:22.276 "state": "configuring", 00:17:22.276 "raid_level": "raid1", 00:17:22.276 "superblock": true, 00:17:22.276 "num_base_bdevs": 3, 00:17:22.276 "num_base_bdevs_discovered": 2, 00:17:22.276 "num_base_bdevs_operational": 3, 00:17:22.276 "base_bdevs_list": [ 00:17:22.276 { 00:17:22.276 "name": "BaseBdev1", 00:17:22.276 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:22.276 "is_configured": false, 00:17:22.276 "data_offset": 0, 00:17:22.276 "data_size": 0 00:17:22.276 }, 00:17:22.276 { 00:17:22.276 "name": "BaseBdev2", 00:17:22.276 "uuid": "d86c239d-8d24-49d8-83b8-7a1fc75b21d3", 00:17:22.276 "is_configured": true, 00:17:22.276 "data_offset": 2048, 00:17:22.276 "data_size": 63488 00:17:22.276 }, 00:17:22.276 { 00:17:22.276 "name": "BaseBdev3", 
00:17:22.276 "uuid": "9c079844-56a9-49f5-8ba4-17feadb9d9e8", 00:17:22.276 "is_configured": true, 00:17:22.276 "data_offset": 2048, 00:17:22.276 "data_size": 63488 00:17:22.276 } 00:17:22.276 ] 00:17:22.276 }' 00:17:22.276 22:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:22.276 22:46:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:22.884 22:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:23.168 [2024-07-15 22:46:07.913788] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:23.168 22:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:23.168 22:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:23.168 22:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:23.168 22:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:23.168 22:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:23.168 22:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:23.168 22:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:23.168 22:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:23.168 22:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:23.168 22:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:23.168 22:46:07 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:23.168 22:46:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:23.428 22:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:23.428 "name": "Existed_Raid", 00:17:23.428 "uuid": "b279d061-518a-4c94-a025-7bda557b20ba", 00:17:23.428 "strip_size_kb": 0, 00:17:23.428 "state": "configuring", 00:17:23.428 "raid_level": "raid1", 00:17:23.428 "superblock": true, 00:17:23.428 "num_base_bdevs": 3, 00:17:23.428 "num_base_bdevs_discovered": 1, 00:17:23.428 "num_base_bdevs_operational": 3, 00:17:23.428 "base_bdevs_list": [ 00:17:23.428 { 00:17:23.428 "name": "BaseBdev1", 00:17:23.428 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:23.428 "is_configured": false, 00:17:23.428 "data_offset": 0, 00:17:23.428 "data_size": 0 00:17:23.428 }, 00:17:23.428 { 00:17:23.428 "name": null, 00:17:23.428 "uuid": "d86c239d-8d24-49d8-83b8-7a1fc75b21d3", 00:17:23.428 "is_configured": false, 00:17:23.428 "data_offset": 2048, 00:17:23.428 "data_size": 63488 00:17:23.428 }, 00:17:23.428 { 00:17:23.428 "name": "BaseBdev3", 00:17:23.428 "uuid": "9c079844-56a9-49f5-8ba4-17feadb9d9e8", 00:17:23.428 "is_configured": true, 00:17:23.428 "data_offset": 2048, 00:17:23.428 "data_size": 63488 00:17:23.428 } 00:17:23.428 ] 00:17:23.428 }' 00:17:23.428 22:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:23.428 22:46:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:23.996 22:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:23.996 22:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:17:24.255 22:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:24.255 22:46:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:24.515 [2024-07-15 22:46:09.213918] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:24.515 BaseBdev1 00:17:24.515 22:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:24.515 22:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:24.515 22:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:24.515 22:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:24.515 22:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:24.515 22:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:24.515 22:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:24.774 22:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:24.774 [ 00:17:24.774 { 00:17:24.774 "name": "BaseBdev1", 00:17:24.774 "aliases": [ 00:17:24.774 "a5bc66e1-b366-4407-8206-189880a9c456" 00:17:24.774 ], 00:17:24.774 "product_name": "Malloc disk", 00:17:24.774 "block_size": 512, 00:17:24.774 "num_blocks": 65536, 00:17:24.774 "uuid": "a5bc66e1-b366-4407-8206-189880a9c456", 00:17:24.774 
"assigned_rate_limits": { 00:17:24.774 "rw_ios_per_sec": 0, 00:17:24.774 "rw_mbytes_per_sec": 0, 00:17:24.774 "r_mbytes_per_sec": 0, 00:17:24.774 "w_mbytes_per_sec": 0 00:17:24.774 }, 00:17:24.774 "claimed": true, 00:17:24.774 "claim_type": "exclusive_write", 00:17:24.774 "zoned": false, 00:17:24.774 "supported_io_types": { 00:17:24.774 "read": true, 00:17:24.774 "write": true, 00:17:24.774 "unmap": true, 00:17:24.774 "flush": true, 00:17:24.774 "reset": true, 00:17:24.774 "nvme_admin": false, 00:17:24.774 "nvme_io": false, 00:17:24.774 "nvme_io_md": false, 00:17:24.774 "write_zeroes": true, 00:17:24.774 "zcopy": true, 00:17:24.774 "get_zone_info": false, 00:17:24.774 "zone_management": false, 00:17:24.774 "zone_append": false, 00:17:24.774 "compare": false, 00:17:24.774 "compare_and_write": false, 00:17:24.774 "abort": true, 00:17:24.774 "seek_hole": false, 00:17:24.774 "seek_data": false, 00:17:24.774 "copy": true, 00:17:24.774 "nvme_iov_md": false 00:17:24.774 }, 00:17:24.774 "memory_domains": [ 00:17:24.774 { 00:17:24.774 "dma_device_id": "system", 00:17:24.774 "dma_device_type": 1 00:17:24.774 }, 00:17:24.774 { 00:17:24.774 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:24.774 "dma_device_type": 2 00:17:24.774 } 00:17:24.774 ], 00:17:24.774 "driver_specific": {} 00:17:24.774 } 00:17:24.774 ] 00:17:24.774 22:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:24.774 22:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:24.774 22:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:24.774 22:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:24.774 22:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:24.774 22:46:09 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:24.774 22:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:24.774 22:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:24.774 22:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:24.774 22:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:24.774 22:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:24.774 22:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:24.774 22:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:25.033 22:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:25.033 "name": "Existed_Raid", 00:17:25.033 "uuid": "b279d061-518a-4c94-a025-7bda557b20ba", 00:17:25.033 "strip_size_kb": 0, 00:17:25.033 "state": "configuring", 00:17:25.033 "raid_level": "raid1", 00:17:25.033 "superblock": true, 00:17:25.033 "num_base_bdevs": 3, 00:17:25.033 "num_base_bdevs_discovered": 2, 00:17:25.033 "num_base_bdevs_operational": 3, 00:17:25.033 "base_bdevs_list": [ 00:17:25.033 { 00:17:25.033 "name": "BaseBdev1", 00:17:25.033 "uuid": "a5bc66e1-b366-4407-8206-189880a9c456", 00:17:25.033 "is_configured": true, 00:17:25.033 "data_offset": 2048, 00:17:25.033 "data_size": 63488 00:17:25.033 }, 00:17:25.033 { 00:17:25.033 "name": null, 00:17:25.033 "uuid": "d86c239d-8d24-49d8-83b8-7a1fc75b21d3", 00:17:25.033 "is_configured": false, 00:17:25.033 "data_offset": 2048, 00:17:25.033 "data_size": 63488 00:17:25.033 }, 00:17:25.033 { 00:17:25.033 "name": "BaseBdev3", 00:17:25.033 "uuid": 
"9c079844-56a9-49f5-8ba4-17feadb9d9e8", 00:17:25.033 "is_configured": true, 00:17:25.033 "data_offset": 2048, 00:17:25.033 "data_size": 63488 00:17:25.033 } 00:17:25.033 ] 00:17:25.033 }' 00:17:25.033 22:46:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:25.033 22:46:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:25.600 22:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:25.601 22:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:25.858 22:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:25.858 22:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:26.115 [2024-07-15 22:46:10.974634] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:26.115 22:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:26.115 22:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:26.115 22:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:26.115 22:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:26.115 22:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:26.115 22:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:26.115 22:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:17:26.115 22:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:26.115 22:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:26.115 22:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:26.115 22:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:26.115 22:46:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:26.373 22:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:26.373 "name": "Existed_Raid", 00:17:26.373 "uuid": "b279d061-518a-4c94-a025-7bda557b20ba", 00:17:26.373 "strip_size_kb": 0, 00:17:26.373 "state": "configuring", 00:17:26.373 "raid_level": "raid1", 00:17:26.373 "superblock": true, 00:17:26.373 "num_base_bdevs": 3, 00:17:26.373 "num_base_bdevs_discovered": 1, 00:17:26.373 "num_base_bdevs_operational": 3, 00:17:26.373 "base_bdevs_list": [ 00:17:26.373 { 00:17:26.373 "name": "BaseBdev1", 00:17:26.373 "uuid": "a5bc66e1-b366-4407-8206-189880a9c456", 00:17:26.373 "is_configured": true, 00:17:26.373 "data_offset": 2048, 00:17:26.373 "data_size": 63488 00:17:26.373 }, 00:17:26.373 { 00:17:26.373 "name": null, 00:17:26.373 "uuid": "d86c239d-8d24-49d8-83b8-7a1fc75b21d3", 00:17:26.373 "is_configured": false, 00:17:26.373 "data_offset": 2048, 00:17:26.373 "data_size": 63488 00:17:26.373 }, 00:17:26.373 { 00:17:26.373 "name": null, 00:17:26.373 "uuid": "9c079844-56a9-49f5-8ba4-17feadb9d9e8", 00:17:26.373 "is_configured": false, 00:17:26.373 "data_offset": 2048, 00:17:26.373 "data_size": 63488 00:17:26.373 } 00:17:26.373 ] 00:17:26.373 }' 00:17:26.373 22:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:17:26.373 22:46:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:26.938 22:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:26.938 22:46:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:27.194 22:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:27.194 22:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:27.451 [2024-07-15 22:46:12.250034] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:27.451 22:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:27.451 22:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:27.451 22:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:27.452 22:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:27.452 22:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:27.452 22:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:27.452 22:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:27.452 22:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:27.452 22:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:17:27.452 22:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:27.452 22:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:27.452 22:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:27.720 22:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:27.720 "name": "Existed_Raid", 00:17:27.720 "uuid": "b279d061-518a-4c94-a025-7bda557b20ba", 00:17:27.720 "strip_size_kb": 0, 00:17:27.720 "state": "configuring", 00:17:27.720 "raid_level": "raid1", 00:17:27.720 "superblock": true, 00:17:27.720 "num_base_bdevs": 3, 00:17:27.720 "num_base_bdevs_discovered": 2, 00:17:27.720 "num_base_bdevs_operational": 3, 00:17:27.720 "base_bdevs_list": [ 00:17:27.720 { 00:17:27.720 "name": "BaseBdev1", 00:17:27.720 "uuid": "a5bc66e1-b366-4407-8206-189880a9c456", 00:17:27.720 "is_configured": true, 00:17:27.720 "data_offset": 2048, 00:17:27.720 "data_size": 63488 00:17:27.720 }, 00:17:27.720 { 00:17:27.720 "name": null, 00:17:27.720 "uuid": "d86c239d-8d24-49d8-83b8-7a1fc75b21d3", 00:17:27.720 "is_configured": false, 00:17:27.720 "data_offset": 2048, 00:17:27.720 "data_size": 63488 00:17:27.720 }, 00:17:27.720 { 00:17:27.720 "name": "BaseBdev3", 00:17:27.720 "uuid": "9c079844-56a9-49f5-8ba4-17feadb9d9e8", 00:17:27.720 "is_configured": true, 00:17:27.720 "data_offset": 2048, 00:17:27.720 "data_size": 63488 00:17:27.720 } 00:17:27.720 ] 00:17:27.720 }' 00:17:27.720 22:46:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:27.720 22:46:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:28.284 22:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.284 22:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:28.541 22:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:28.541 22:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:28.804 [2024-07-15 22:46:13.565552] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:28.804 22:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:28.804 22:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:28.804 22:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:28.804 22:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:28.804 22:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:28.804 22:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:28.804 22:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:28.804 22:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:28.804 22:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:28.804 22:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:28.804 22:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.804 22:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:29.068 22:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:29.068 "name": "Existed_Raid", 00:17:29.068 "uuid": "b279d061-518a-4c94-a025-7bda557b20ba", 00:17:29.068 "strip_size_kb": 0, 00:17:29.068 "state": "configuring", 00:17:29.068 "raid_level": "raid1", 00:17:29.068 "superblock": true, 00:17:29.068 "num_base_bdevs": 3, 00:17:29.068 "num_base_bdevs_discovered": 1, 00:17:29.068 "num_base_bdevs_operational": 3, 00:17:29.068 "base_bdevs_list": [ 00:17:29.068 { 00:17:29.068 "name": null, 00:17:29.068 "uuid": "a5bc66e1-b366-4407-8206-189880a9c456", 00:17:29.068 "is_configured": false, 00:17:29.068 "data_offset": 2048, 00:17:29.068 "data_size": 63488 00:17:29.068 }, 00:17:29.068 { 00:17:29.068 "name": null, 00:17:29.068 "uuid": "d86c239d-8d24-49d8-83b8-7a1fc75b21d3", 00:17:29.068 "is_configured": false, 00:17:29.068 "data_offset": 2048, 00:17:29.068 "data_size": 63488 00:17:29.068 }, 00:17:29.068 { 00:17:29.068 "name": "BaseBdev3", 00:17:29.068 "uuid": "9c079844-56a9-49f5-8ba4-17feadb9d9e8", 00:17:29.068 "is_configured": true, 00:17:29.068 "data_offset": 2048, 00:17:29.068 "data_size": 63488 00:17:29.068 } 00:17:29.068 ] 00:17:29.068 }' 00:17:29.068 22:46:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:29.068 22:46:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:29.634 22:46:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:29.634 22:46:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq 
'.[0].base_bdevs_list[0].is_configured' 00:17:29.891 22:46:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:29.891 22:46:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:30.149 [2024-07-15 22:46:14.929117] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:30.149 22:46:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:30.149 22:46:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:30.149 22:46:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:30.149 22:46:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:30.149 22:46:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:30.149 22:46:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:30.150 22:46:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:30.150 22:46:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:30.150 22:46:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:30.150 22:46:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:30.150 22:46:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:30.150 22:46:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] 
| select(.name == "Existed_Raid")' 00:17:30.408 22:46:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:30.408 "name": "Existed_Raid", 00:17:30.408 "uuid": "b279d061-518a-4c94-a025-7bda557b20ba", 00:17:30.408 "strip_size_kb": 0, 00:17:30.408 "state": "configuring", 00:17:30.408 "raid_level": "raid1", 00:17:30.408 "superblock": true, 00:17:30.408 "num_base_bdevs": 3, 00:17:30.408 "num_base_bdevs_discovered": 2, 00:17:30.408 "num_base_bdevs_operational": 3, 00:17:30.408 "base_bdevs_list": [ 00:17:30.408 { 00:17:30.408 "name": null, 00:17:30.408 "uuid": "a5bc66e1-b366-4407-8206-189880a9c456", 00:17:30.408 "is_configured": false, 00:17:30.408 "data_offset": 2048, 00:17:30.408 "data_size": 63488 00:17:30.408 }, 00:17:30.408 { 00:17:30.408 "name": "BaseBdev2", 00:17:30.408 "uuid": "d86c239d-8d24-49d8-83b8-7a1fc75b21d3", 00:17:30.408 "is_configured": true, 00:17:30.408 "data_offset": 2048, 00:17:30.408 "data_size": 63488 00:17:30.408 }, 00:17:30.408 { 00:17:30.408 "name": "BaseBdev3", 00:17:30.408 "uuid": "9c079844-56a9-49f5-8ba4-17feadb9d9e8", 00:17:30.408 "is_configured": true, 00:17:30.408 "data_offset": 2048, 00:17:30.408 "data_size": 63488 00:17:30.408 } 00:17:30.408 ] 00:17:30.408 }' 00:17:30.408 22:46:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:30.408 22:46:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:30.975 22:46:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:30.975 22:46:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:31.233 22:46:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:31.233 22:46:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:31.233 22:46:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:31.801 22:46:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u a5bc66e1-b366-4407-8206-189880a9c456 00:17:32.061 [2024-07-15 22:46:16.730567] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:32.061 [2024-07-15 22:46:16.730736] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x23e11b0 00:17:32.061 [2024-07-15 22:46:16.730751] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:32.061 [2024-07-15 22:46:16.730941] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x259d4f0 00:17:32.061 [2024-07-15 22:46:16.731066] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23e11b0 00:17:32.061 [2024-07-15 22:46:16.731076] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x23e11b0 00:17:32.061 [2024-07-15 22:46:16.731175] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:32.061 NewBaseBdev 00:17:32.061 22:46:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:32.061 22:46:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:32.061 22:46:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:32.061 22:46:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:32.061 22:46:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:32.061 
22:46:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:32.061 22:46:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:32.320 22:46:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:32.320 [ 00:17:32.320 { 00:17:32.320 "name": "NewBaseBdev", 00:17:32.320 "aliases": [ 00:17:32.320 "a5bc66e1-b366-4407-8206-189880a9c456" 00:17:32.320 ], 00:17:32.320 "product_name": "Malloc disk", 00:17:32.320 "block_size": 512, 00:17:32.320 "num_blocks": 65536, 00:17:32.320 "uuid": "a5bc66e1-b366-4407-8206-189880a9c456", 00:17:32.320 "assigned_rate_limits": { 00:17:32.320 "rw_ios_per_sec": 0, 00:17:32.320 "rw_mbytes_per_sec": 0, 00:17:32.320 "r_mbytes_per_sec": 0, 00:17:32.320 "w_mbytes_per_sec": 0 00:17:32.320 }, 00:17:32.320 "claimed": true, 00:17:32.320 "claim_type": "exclusive_write", 00:17:32.320 "zoned": false, 00:17:32.320 "supported_io_types": { 00:17:32.320 "read": true, 00:17:32.320 "write": true, 00:17:32.320 "unmap": true, 00:17:32.320 "flush": true, 00:17:32.320 "reset": true, 00:17:32.320 "nvme_admin": false, 00:17:32.320 "nvme_io": false, 00:17:32.320 "nvme_io_md": false, 00:17:32.320 "write_zeroes": true, 00:17:32.320 "zcopy": true, 00:17:32.320 "get_zone_info": false, 00:17:32.320 "zone_management": false, 00:17:32.320 "zone_append": false, 00:17:32.320 "compare": false, 00:17:32.320 "compare_and_write": false, 00:17:32.320 "abort": true, 00:17:32.320 "seek_hole": false, 00:17:32.320 "seek_data": false, 00:17:32.320 "copy": true, 00:17:32.320 "nvme_iov_md": false 00:17:32.320 }, 00:17:32.320 "memory_domains": [ 00:17:32.320 { 00:17:32.320 "dma_device_id": "system", 00:17:32.320 "dma_device_type": 1 00:17:32.320 
}, 00:17:32.320 { 00:17:32.320 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.320 "dma_device_type": 2 00:17:32.320 } 00:17:32.320 ], 00:17:32.320 "driver_specific": {} 00:17:32.320 } 00:17:32.320 ] 00:17:32.579 22:46:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:32.579 22:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:32.579 22:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:32.579 22:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:32.579 22:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:32.579 22:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:32.579 22:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:32.579 22:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:32.579 22:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:32.579 22:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:32.579 22:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:32.579 22:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:32.579 22:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:32.579 22:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:32.579 "name": "Existed_Raid", 00:17:32.579 "uuid": 
"b279d061-518a-4c94-a025-7bda557b20ba", 00:17:32.579 "strip_size_kb": 0, 00:17:32.579 "state": "online", 00:17:32.579 "raid_level": "raid1", 00:17:32.579 "superblock": true, 00:17:32.579 "num_base_bdevs": 3, 00:17:32.579 "num_base_bdevs_discovered": 3, 00:17:32.579 "num_base_bdevs_operational": 3, 00:17:32.579 "base_bdevs_list": [ 00:17:32.579 { 00:17:32.579 "name": "NewBaseBdev", 00:17:32.579 "uuid": "a5bc66e1-b366-4407-8206-189880a9c456", 00:17:32.579 "is_configured": true, 00:17:32.579 "data_offset": 2048, 00:17:32.579 "data_size": 63488 00:17:32.579 }, 00:17:32.579 { 00:17:32.579 "name": "BaseBdev2", 00:17:32.580 "uuid": "d86c239d-8d24-49d8-83b8-7a1fc75b21d3", 00:17:32.580 "is_configured": true, 00:17:32.580 "data_offset": 2048, 00:17:32.580 "data_size": 63488 00:17:32.580 }, 00:17:32.580 { 00:17:32.580 "name": "BaseBdev3", 00:17:32.580 "uuid": "9c079844-56a9-49f5-8ba4-17feadb9d9e8", 00:17:32.580 "is_configured": true, 00:17:32.580 "data_offset": 2048, 00:17:32.580 "data_size": 63488 00:17:32.580 } 00:17:32.580 ] 00:17:32.580 }' 00:17:32.580 22:46:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:32.580 22:46:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:33.515 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:33.515 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:33.515 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:33.515 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:33.515 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:33.515 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:33.515 22:46:18 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:33.515 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:33.515 [2024-07-15 22:46:18.218828] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:33.515 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:33.515 "name": "Existed_Raid", 00:17:33.515 "aliases": [ 00:17:33.515 "b279d061-518a-4c94-a025-7bda557b20ba" 00:17:33.515 ], 00:17:33.515 "product_name": "Raid Volume", 00:17:33.515 "block_size": 512, 00:17:33.515 "num_blocks": 63488, 00:17:33.515 "uuid": "b279d061-518a-4c94-a025-7bda557b20ba", 00:17:33.515 "assigned_rate_limits": { 00:17:33.515 "rw_ios_per_sec": 0, 00:17:33.515 "rw_mbytes_per_sec": 0, 00:17:33.515 "r_mbytes_per_sec": 0, 00:17:33.515 "w_mbytes_per_sec": 0 00:17:33.515 }, 00:17:33.515 "claimed": false, 00:17:33.515 "zoned": false, 00:17:33.515 "supported_io_types": { 00:17:33.515 "read": true, 00:17:33.515 "write": true, 00:17:33.515 "unmap": false, 00:17:33.515 "flush": false, 00:17:33.515 "reset": true, 00:17:33.515 "nvme_admin": false, 00:17:33.515 "nvme_io": false, 00:17:33.515 "nvme_io_md": false, 00:17:33.515 "write_zeroes": true, 00:17:33.515 "zcopy": false, 00:17:33.515 "get_zone_info": false, 00:17:33.515 "zone_management": false, 00:17:33.515 "zone_append": false, 00:17:33.515 "compare": false, 00:17:33.515 "compare_and_write": false, 00:17:33.515 "abort": false, 00:17:33.515 "seek_hole": false, 00:17:33.515 "seek_data": false, 00:17:33.515 "copy": false, 00:17:33.515 "nvme_iov_md": false 00:17:33.515 }, 00:17:33.515 "memory_domains": [ 00:17:33.515 { 00:17:33.515 "dma_device_id": "system", 00:17:33.515 "dma_device_type": 1 00:17:33.515 }, 00:17:33.515 { 00:17:33.515 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:33.515 
"dma_device_type": 2 00:17:33.515 }, 00:17:33.515 { 00:17:33.515 "dma_device_id": "system", 00:17:33.515 "dma_device_type": 1 00:17:33.515 }, 00:17:33.515 { 00:17:33.515 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:33.515 "dma_device_type": 2 00:17:33.515 }, 00:17:33.515 { 00:17:33.515 "dma_device_id": "system", 00:17:33.515 "dma_device_type": 1 00:17:33.515 }, 00:17:33.515 { 00:17:33.515 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:33.515 "dma_device_type": 2 00:17:33.515 } 00:17:33.515 ], 00:17:33.515 "driver_specific": { 00:17:33.515 "raid": { 00:17:33.516 "uuid": "b279d061-518a-4c94-a025-7bda557b20ba", 00:17:33.516 "strip_size_kb": 0, 00:17:33.516 "state": "online", 00:17:33.516 "raid_level": "raid1", 00:17:33.516 "superblock": true, 00:17:33.516 "num_base_bdevs": 3, 00:17:33.516 "num_base_bdevs_discovered": 3, 00:17:33.516 "num_base_bdevs_operational": 3, 00:17:33.516 "base_bdevs_list": [ 00:17:33.516 { 00:17:33.516 "name": "NewBaseBdev", 00:17:33.516 "uuid": "a5bc66e1-b366-4407-8206-189880a9c456", 00:17:33.516 "is_configured": true, 00:17:33.516 "data_offset": 2048, 00:17:33.516 "data_size": 63488 00:17:33.516 }, 00:17:33.516 { 00:17:33.516 "name": "BaseBdev2", 00:17:33.516 "uuid": "d86c239d-8d24-49d8-83b8-7a1fc75b21d3", 00:17:33.516 "is_configured": true, 00:17:33.516 "data_offset": 2048, 00:17:33.516 "data_size": 63488 00:17:33.516 }, 00:17:33.516 { 00:17:33.516 "name": "BaseBdev3", 00:17:33.516 "uuid": "9c079844-56a9-49f5-8ba4-17feadb9d9e8", 00:17:33.516 "is_configured": true, 00:17:33.516 "data_offset": 2048, 00:17:33.516 "data_size": 63488 00:17:33.516 } 00:17:33.516 ] 00:17:33.516 } 00:17:33.516 } 00:17:33.516 }' 00:17:33.516 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:33.516 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:33.516 BaseBdev2 00:17:33.516 
BaseBdev3' 00:17:33.516 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:33.516 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:33.516 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:33.775 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:33.775 "name": "NewBaseBdev", 00:17:33.775 "aliases": [ 00:17:33.775 "a5bc66e1-b366-4407-8206-189880a9c456" 00:17:33.775 ], 00:17:33.775 "product_name": "Malloc disk", 00:17:33.775 "block_size": 512, 00:17:33.775 "num_blocks": 65536, 00:17:33.775 "uuid": "a5bc66e1-b366-4407-8206-189880a9c456", 00:17:33.775 "assigned_rate_limits": { 00:17:33.775 "rw_ios_per_sec": 0, 00:17:33.775 "rw_mbytes_per_sec": 0, 00:17:33.775 "r_mbytes_per_sec": 0, 00:17:33.775 "w_mbytes_per_sec": 0 00:17:33.775 }, 00:17:33.775 "claimed": true, 00:17:33.775 "claim_type": "exclusive_write", 00:17:33.775 "zoned": false, 00:17:33.775 "supported_io_types": { 00:17:33.775 "read": true, 00:17:33.775 "write": true, 00:17:33.775 "unmap": true, 00:17:33.775 "flush": true, 00:17:33.775 "reset": true, 00:17:33.775 "nvme_admin": false, 00:17:33.775 "nvme_io": false, 00:17:33.775 "nvme_io_md": false, 00:17:33.775 "write_zeroes": true, 00:17:33.775 "zcopy": true, 00:17:33.775 "get_zone_info": false, 00:17:33.775 "zone_management": false, 00:17:33.775 "zone_append": false, 00:17:33.775 "compare": false, 00:17:33.775 "compare_and_write": false, 00:17:33.775 "abort": true, 00:17:33.775 "seek_hole": false, 00:17:33.775 "seek_data": false, 00:17:33.775 "copy": true, 00:17:33.775 "nvme_iov_md": false 00:17:33.775 }, 00:17:33.775 "memory_domains": [ 00:17:33.775 { 00:17:33.775 "dma_device_id": "system", 00:17:33.775 "dma_device_type": 1 00:17:33.775 }, 00:17:33.775 { 
00:17:33.775 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:33.775 "dma_device_type": 2 00:17:33.775 } 00:17:33.775 ], 00:17:33.775 "driver_specific": {} 00:17:33.775 }' 00:17:33.775 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:33.775 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:33.775 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:33.775 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:33.775 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:34.034 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:34.034 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:34.034 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:34.034 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:34.034 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:34.034 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:34.034 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:34.034 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:34.034 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:34.034 22:46:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:34.293 22:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:34.293 "name": 
"BaseBdev2", 00:17:34.293 "aliases": [ 00:17:34.293 "d86c239d-8d24-49d8-83b8-7a1fc75b21d3" 00:17:34.293 ], 00:17:34.293 "product_name": "Malloc disk", 00:17:34.293 "block_size": 512, 00:17:34.293 "num_blocks": 65536, 00:17:34.293 "uuid": "d86c239d-8d24-49d8-83b8-7a1fc75b21d3", 00:17:34.293 "assigned_rate_limits": { 00:17:34.293 "rw_ios_per_sec": 0, 00:17:34.293 "rw_mbytes_per_sec": 0, 00:17:34.293 "r_mbytes_per_sec": 0, 00:17:34.293 "w_mbytes_per_sec": 0 00:17:34.293 }, 00:17:34.293 "claimed": true, 00:17:34.293 "claim_type": "exclusive_write", 00:17:34.293 "zoned": false, 00:17:34.293 "supported_io_types": { 00:17:34.293 "read": true, 00:17:34.293 "write": true, 00:17:34.293 "unmap": true, 00:17:34.293 "flush": true, 00:17:34.293 "reset": true, 00:17:34.293 "nvme_admin": false, 00:17:34.293 "nvme_io": false, 00:17:34.293 "nvme_io_md": false, 00:17:34.293 "write_zeroes": true, 00:17:34.293 "zcopy": true, 00:17:34.293 "get_zone_info": false, 00:17:34.293 "zone_management": false, 00:17:34.293 "zone_append": false, 00:17:34.293 "compare": false, 00:17:34.293 "compare_and_write": false, 00:17:34.293 "abort": true, 00:17:34.293 "seek_hole": false, 00:17:34.293 "seek_data": false, 00:17:34.293 "copy": true, 00:17:34.293 "nvme_iov_md": false 00:17:34.293 }, 00:17:34.293 "memory_domains": [ 00:17:34.293 { 00:17:34.293 "dma_device_id": "system", 00:17:34.293 "dma_device_type": 1 00:17:34.293 }, 00:17:34.293 { 00:17:34.293 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:34.293 "dma_device_type": 2 00:17:34.293 } 00:17:34.293 ], 00:17:34.293 "driver_specific": {} 00:17:34.293 }' 00:17:34.293 22:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:34.293 22:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:34.293 22:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:34.293 22:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:17:34.552 22:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:34.552 22:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:34.552 22:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:34.552 22:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:34.552 22:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:34.552 22:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:34.552 22:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:34.811 22:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:34.811 22:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:34.811 22:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:34.811 22:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:35.069 22:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:35.069 "name": "BaseBdev3", 00:17:35.069 "aliases": [ 00:17:35.069 "9c079844-56a9-49f5-8ba4-17feadb9d9e8" 00:17:35.069 ], 00:17:35.069 "product_name": "Malloc disk", 00:17:35.069 "block_size": 512, 00:17:35.069 "num_blocks": 65536, 00:17:35.069 "uuid": "9c079844-56a9-49f5-8ba4-17feadb9d9e8", 00:17:35.069 "assigned_rate_limits": { 00:17:35.069 "rw_ios_per_sec": 0, 00:17:35.069 "rw_mbytes_per_sec": 0, 00:17:35.069 "r_mbytes_per_sec": 0, 00:17:35.069 "w_mbytes_per_sec": 0 00:17:35.069 }, 00:17:35.069 "claimed": true, 00:17:35.069 "claim_type": "exclusive_write", 00:17:35.069 "zoned": 
false, 00:17:35.069 "supported_io_types": { 00:17:35.069 "read": true, 00:17:35.069 "write": true, 00:17:35.069 "unmap": true, 00:17:35.069 "flush": true, 00:17:35.069 "reset": true, 00:17:35.069 "nvme_admin": false, 00:17:35.069 "nvme_io": false, 00:17:35.069 "nvme_io_md": false, 00:17:35.069 "write_zeroes": true, 00:17:35.069 "zcopy": true, 00:17:35.069 "get_zone_info": false, 00:17:35.069 "zone_management": false, 00:17:35.069 "zone_append": false, 00:17:35.069 "compare": false, 00:17:35.069 "compare_and_write": false, 00:17:35.069 "abort": true, 00:17:35.069 "seek_hole": false, 00:17:35.069 "seek_data": false, 00:17:35.069 "copy": true, 00:17:35.069 "nvme_iov_md": false 00:17:35.069 }, 00:17:35.069 "memory_domains": [ 00:17:35.069 { 00:17:35.069 "dma_device_id": "system", 00:17:35.069 "dma_device_type": 1 00:17:35.069 }, 00:17:35.069 { 00:17:35.069 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.069 "dma_device_type": 2 00:17:35.069 } 00:17:35.069 ], 00:17:35.069 "driver_specific": {} 00:17:35.069 }' 00:17:35.069 22:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:35.069 22:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:35.069 22:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:35.069 22:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:35.069 22:46:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:35.329 22:46:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:35.329 22:46:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:35.329 22:46:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:35.329 22:46:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:35.329 22:46:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:35.329 22:46:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:35.329 22:46:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:35.329 22:46:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:35.589 [2024-07-15 22:46:20.408370] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:35.589 [2024-07-15 22:46:20.408398] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:35.589 [2024-07-15 22:46:20.408458] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:35.589 [2024-07-15 22:46:20.408737] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:35.589 [2024-07-15 22:46:20.408750] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23e11b0 name Existed_Raid, state offline 00:17:35.589 22:46:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2750579 00:17:35.589 22:46:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2750579 ']' 00:17:35.589 22:46:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2750579 00:17:35.589 22:46:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:17:35.589 22:46:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:35.589 22:46:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2750579 00:17:35.589 22:46:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 
00:17:35.589 22:46:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:35.589 22:46:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2750579' 00:17:35.589 killing process with pid 2750579 00:17:35.589 22:46:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2750579 00:17:35.589 [2024-07-15 22:46:20.483197] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:35.589 22:46:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2750579 00:17:35.848 [2024-07-15 22:46:20.510571] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:35.849 22:46:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:17:35.849 00:17:35.849 real 0m28.422s 00:17:35.849 user 0m52.322s 00:17:35.849 sys 0m4.986s 00:17:35.849 22:46:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:35.849 22:46:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:35.849 ************************************ 00:17:35.849 END TEST raid_state_function_test_sb 00:17:35.849 ************************************ 00:17:36.108 22:46:20 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:36.108 22:46:20 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:17:36.108 22:46:20 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:17:36.108 22:46:20 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:36.108 22:46:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:36.108 ************************************ 00:17:36.108 START TEST raid_superblock_test 00:17:36.108 ************************************ 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # 
raid_superblock_test raid1 3 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2754851 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2754851 /var/tmp/spdk-raid.sock 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -L bdev_raid 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2754851 ']' 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:36.108 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:36.108 22:46:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:36.108 [2024-07-15 22:46:20.883019] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:17:36.108 [2024-07-15 22:46:20.883075] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2754851 ] 00:17:36.108 [2024-07-15 22:46:20.997574] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:36.367 [2024-07-15 22:46:21.100875] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:36.367 [2024-07-15 22:46:21.163356] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:36.367 [2024-07-15 22:46:21.163396] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:36.935 22:46:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:36.935 22:46:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:17:36.935 22:46:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:17:36.935 22:46:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:36.935 22:46:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:17:36.935 22:46:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:17:36.935 22:46:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:17:36.935 22:46:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:36.936 22:46:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:36.936 22:46:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:36.936 22:46:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:17:37.195 malloc1 00:17:37.195 22:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:37.453 [2024-07-15 22:46:22.172700] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:37.453 [2024-07-15 22:46:22.172750] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:37.453 [2024-07-15 22:46:22.172770] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcdb570 00:17:37.453 [2024-07-15 22:46:22.172783] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:37.453 [2024-07-15 22:46:22.174415] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:37.453 [2024-07-15 22:46:22.174447] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:37.453 pt1 00:17:37.453 22:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:37.453 22:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:37.453 22:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:17:37.453 22:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:17:37.453 22:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:17:37.453 22:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:37.453 22:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:37.453 22:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:37.453 22:46:22 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:17:37.453 malloc2 00:17:37.712 22:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:37.712 [2024-07-15 22:46:22.526355] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:37.712 [2024-07-15 22:46:22.526402] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:37.712 [2024-07-15 22:46:22.526425] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcdc970 00:17:37.712 [2024-07-15 22:46:22.526437] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:37.712 [2024-07-15 22:46:22.527887] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:37.712 [2024-07-15 22:46:22.527915] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:37.712 pt2 00:17:37.712 22:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:37.712 22:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:37.712 22:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:17:37.712 22:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:17:37.712 22:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:17:37.712 22:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:37.712 22:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:37.712 22:46:22 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:37.712 22:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:17:37.971 malloc3 00:17:37.971 22:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:38.230 [2024-07-15 22:46:22.964109] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:38.230 [2024-07-15 22:46:22.964158] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:38.230 [2024-07-15 22:46:22.964176] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe73340 00:17:38.230 [2024-07-15 22:46:22.964189] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:38.230 [2024-07-15 22:46:22.965654] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:38.230 [2024-07-15 22:46:22.965685] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:38.230 pt3 00:17:38.230 22:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:38.230 22:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:38.230 22:46:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:17:38.490 [2024-07-15 22:46:23.148611] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:38.490 [2024-07-15 22:46:23.149786] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:17:38.490 [2024-07-15 22:46:23.149841] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:38.490 [2024-07-15 22:46:23.150001] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcd3ea0 00:17:38.490 [2024-07-15 22:46:23.150013] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:38.490 [2024-07-15 22:46:23.150201] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcdb240 00:17:38.490 [2024-07-15 22:46:23.150346] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcd3ea0 00:17:38.490 [2024-07-15 22:46:23.150356] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xcd3ea0 00:17:38.490 [2024-07-15 22:46:23.150448] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:38.490 22:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:38.490 22:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:38.490 22:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:38.490 22:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:38.490 22:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:38.490 22:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:38.490 22:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:38.490 22:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:38.490 22:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:38.490 22:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:38.490 22:46:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:38.490 22:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:38.490 22:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:38.490 "name": "raid_bdev1", 00:17:38.490 "uuid": "ccfc3617-e3bb-489d-8a88-d6fd452106a9", 00:17:38.490 "strip_size_kb": 0, 00:17:38.490 "state": "online", 00:17:38.490 "raid_level": "raid1", 00:17:38.490 "superblock": true, 00:17:38.490 "num_base_bdevs": 3, 00:17:38.490 "num_base_bdevs_discovered": 3, 00:17:38.490 "num_base_bdevs_operational": 3, 00:17:38.490 "base_bdevs_list": [ 00:17:38.490 { 00:17:38.490 "name": "pt1", 00:17:38.490 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:38.490 "is_configured": true, 00:17:38.490 "data_offset": 2048, 00:17:38.490 "data_size": 63488 00:17:38.490 }, 00:17:38.490 { 00:17:38.490 "name": "pt2", 00:17:38.490 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:38.490 "is_configured": true, 00:17:38.490 "data_offset": 2048, 00:17:38.490 "data_size": 63488 00:17:38.490 }, 00:17:38.490 { 00:17:38.490 "name": "pt3", 00:17:38.490 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:38.490 "is_configured": true, 00:17:38.490 "data_offset": 2048, 00:17:38.490 "data_size": 63488 00:17:38.490 } 00:17:38.490 ] 00:17:38.490 }' 00:17:38.490 22:46:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:38.490 22:46:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:39.500 22:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:17:39.500 22:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:39.500 22:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 
-- # local raid_bdev_info 00:17:39.500 22:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:39.500 22:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:39.500 22:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:39.500 22:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:39.500 22:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:39.774 [2024-07-15 22:46:24.452348] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:39.774 22:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:39.774 "name": "raid_bdev1", 00:17:39.774 "aliases": [ 00:17:39.774 "ccfc3617-e3bb-489d-8a88-d6fd452106a9" 00:17:39.774 ], 00:17:39.774 "product_name": "Raid Volume", 00:17:39.774 "block_size": 512, 00:17:39.774 "num_blocks": 63488, 00:17:39.774 "uuid": "ccfc3617-e3bb-489d-8a88-d6fd452106a9", 00:17:39.774 "assigned_rate_limits": { 00:17:39.774 "rw_ios_per_sec": 0, 00:17:39.774 "rw_mbytes_per_sec": 0, 00:17:39.774 "r_mbytes_per_sec": 0, 00:17:39.774 "w_mbytes_per_sec": 0 00:17:39.774 }, 00:17:39.774 "claimed": false, 00:17:39.774 "zoned": false, 00:17:39.774 "supported_io_types": { 00:17:39.774 "read": true, 00:17:39.774 "write": true, 00:17:39.774 "unmap": false, 00:17:39.774 "flush": false, 00:17:39.774 "reset": true, 00:17:39.774 "nvme_admin": false, 00:17:39.774 "nvme_io": false, 00:17:39.774 "nvme_io_md": false, 00:17:39.774 "write_zeroes": true, 00:17:39.774 "zcopy": false, 00:17:39.774 "get_zone_info": false, 00:17:39.774 "zone_management": false, 00:17:39.774 "zone_append": false, 00:17:39.774 "compare": false, 00:17:39.774 "compare_and_write": false, 00:17:39.774 "abort": false, 00:17:39.774 "seek_hole": false, 
00:17:39.774 "seek_data": false, 00:17:39.774 "copy": false, 00:17:39.774 "nvme_iov_md": false 00:17:39.774 }, 00:17:39.774 "memory_domains": [ 00:17:39.774 { 00:17:39.774 "dma_device_id": "system", 00:17:39.774 "dma_device_type": 1 00:17:39.774 }, 00:17:39.774 { 00:17:39.774 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.774 "dma_device_type": 2 00:17:39.774 }, 00:17:39.774 { 00:17:39.774 "dma_device_id": "system", 00:17:39.774 "dma_device_type": 1 00:17:39.774 }, 00:17:39.774 { 00:17:39.774 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.774 "dma_device_type": 2 00:17:39.774 }, 00:17:39.774 { 00:17:39.774 "dma_device_id": "system", 00:17:39.774 "dma_device_type": 1 00:17:39.774 }, 00:17:39.774 { 00:17:39.774 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.774 "dma_device_type": 2 00:17:39.774 } 00:17:39.774 ], 00:17:39.774 "driver_specific": { 00:17:39.774 "raid": { 00:17:39.774 "uuid": "ccfc3617-e3bb-489d-8a88-d6fd452106a9", 00:17:39.774 "strip_size_kb": 0, 00:17:39.774 "state": "online", 00:17:39.774 "raid_level": "raid1", 00:17:39.774 "superblock": true, 00:17:39.774 "num_base_bdevs": 3, 00:17:39.774 "num_base_bdevs_discovered": 3, 00:17:39.774 "num_base_bdevs_operational": 3, 00:17:39.774 "base_bdevs_list": [ 00:17:39.774 { 00:17:39.775 "name": "pt1", 00:17:39.775 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:39.775 "is_configured": true, 00:17:39.775 "data_offset": 2048, 00:17:39.775 "data_size": 63488 00:17:39.775 }, 00:17:39.775 { 00:17:39.775 "name": "pt2", 00:17:39.775 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:39.775 "is_configured": true, 00:17:39.775 "data_offset": 2048, 00:17:39.775 "data_size": 63488 00:17:39.775 }, 00:17:39.775 { 00:17:39.775 "name": "pt3", 00:17:39.775 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:39.775 "is_configured": true, 00:17:39.775 "data_offset": 2048, 00:17:39.775 "data_size": 63488 00:17:39.775 } 00:17:39.775 ] 00:17:39.775 } 00:17:39.775 } 00:17:39.775 }' 00:17:39.775 22:46:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:39.775 22:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:39.775 pt2 00:17:39.775 pt3' 00:17:39.775 22:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:39.775 22:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:39.775 22:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:40.034 22:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:40.034 "name": "pt1", 00:17:40.034 "aliases": [ 00:17:40.034 "00000000-0000-0000-0000-000000000001" 00:17:40.034 ], 00:17:40.034 "product_name": "passthru", 00:17:40.034 "block_size": 512, 00:17:40.034 "num_blocks": 65536, 00:17:40.034 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:40.034 "assigned_rate_limits": { 00:17:40.034 "rw_ios_per_sec": 0, 00:17:40.034 "rw_mbytes_per_sec": 0, 00:17:40.034 "r_mbytes_per_sec": 0, 00:17:40.034 "w_mbytes_per_sec": 0 00:17:40.034 }, 00:17:40.034 "claimed": true, 00:17:40.034 "claim_type": "exclusive_write", 00:17:40.034 "zoned": false, 00:17:40.034 "supported_io_types": { 00:17:40.034 "read": true, 00:17:40.034 "write": true, 00:17:40.034 "unmap": true, 00:17:40.034 "flush": true, 00:17:40.034 "reset": true, 00:17:40.034 "nvme_admin": false, 00:17:40.034 "nvme_io": false, 00:17:40.034 "nvme_io_md": false, 00:17:40.034 "write_zeroes": true, 00:17:40.034 "zcopy": true, 00:17:40.034 "get_zone_info": false, 00:17:40.034 "zone_management": false, 00:17:40.034 "zone_append": false, 00:17:40.034 "compare": false, 00:17:40.034 "compare_and_write": false, 00:17:40.034 "abort": true, 00:17:40.034 "seek_hole": false, 00:17:40.034 "seek_data": false, 
00:17:40.034 "copy": true, 00:17:40.034 "nvme_iov_md": false 00:17:40.034 }, 00:17:40.034 "memory_domains": [ 00:17:40.034 { 00:17:40.034 "dma_device_id": "system", 00:17:40.034 "dma_device_type": 1 00:17:40.034 }, 00:17:40.034 { 00:17:40.034 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:40.034 "dma_device_type": 2 00:17:40.034 } 00:17:40.034 ], 00:17:40.034 "driver_specific": { 00:17:40.034 "passthru": { 00:17:40.034 "name": "pt1", 00:17:40.034 "base_bdev_name": "malloc1" 00:17:40.034 } 00:17:40.034 } 00:17:40.034 }' 00:17:40.034 22:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:40.034 22:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:40.034 22:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:40.034 22:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:40.034 22:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:40.034 22:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:40.034 22:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:40.293 22:46:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:40.293 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:40.293 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:40.293 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:40.293 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:40.293 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:40.293 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b pt2 00:17:40.293 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:40.552 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:40.552 "name": "pt2", 00:17:40.552 "aliases": [ 00:17:40.552 "00000000-0000-0000-0000-000000000002" 00:17:40.552 ], 00:17:40.552 "product_name": "passthru", 00:17:40.552 "block_size": 512, 00:17:40.552 "num_blocks": 65536, 00:17:40.552 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:40.552 "assigned_rate_limits": { 00:17:40.552 "rw_ios_per_sec": 0, 00:17:40.552 "rw_mbytes_per_sec": 0, 00:17:40.552 "r_mbytes_per_sec": 0, 00:17:40.552 "w_mbytes_per_sec": 0 00:17:40.552 }, 00:17:40.552 "claimed": true, 00:17:40.552 "claim_type": "exclusive_write", 00:17:40.552 "zoned": false, 00:17:40.552 "supported_io_types": { 00:17:40.552 "read": true, 00:17:40.552 "write": true, 00:17:40.552 "unmap": true, 00:17:40.552 "flush": true, 00:17:40.552 "reset": true, 00:17:40.552 "nvme_admin": false, 00:17:40.552 "nvme_io": false, 00:17:40.552 "nvme_io_md": false, 00:17:40.552 "write_zeroes": true, 00:17:40.552 "zcopy": true, 00:17:40.552 "get_zone_info": false, 00:17:40.552 "zone_management": false, 00:17:40.552 "zone_append": false, 00:17:40.552 "compare": false, 00:17:40.552 "compare_and_write": false, 00:17:40.552 "abort": true, 00:17:40.552 "seek_hole": false, 00:17:40.552 "seek_data": false, 00:17:40.552 "copy": true, 00:17:40.552 "nvme_iov_md": false 00:17:40.552 }, 00:17:40.552 "memory_domains": [ 00:17:40.552 { 00:17:40.552 "dma_device_id": "system", 00:17:40.552 "dma_device_type": 1 00:17:40.552 }, 00:17:40.552 { 00:17:40.552 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:40.552 "dma_device_type": 2 00:17:40.552 } 00:17:40.552 ], 00:17:40.552 "driver_specific": { 00:17:40.552 "passthru": { 00:17:40.552 "name": "pt2", 00:17:40.552 "base_bdev_name": "malloc2" 00:17:40.552 } 00:17:40.552 } 00:17:40.552 }' 00:17:40.552 22:46:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:40.552 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:40.552 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:40.552 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:40.811 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:40.811 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:40.811 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:40.811 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:40.811 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:40.811 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:40.811 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:40.811 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:40.811 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:40.811 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:40.811 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:41.070 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:41.070 "name": "pt3", 00:17:41.070 "aliases": [ 00:17:41.070 "00000000-0000-0000-0000-000000000003" 00:17:41.070 ], 00:17:41.070 "product_name": "passthru", 00:17:41.070 "block_size": 512, 00:17:41.070 "num_blocks": 65536, 00:17:41.070 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:41.070 "assigned_rate_limits": { 
00:17:41.070 "rw_ios_per_sec": 0, 00:17:41.070 "rw_mbytes_per_sec": 0, 00:17:41.070 "r_mbytes_per_sec": 0, 00:17:41.070 "w_mbytes_per_sec": 0 00:17:41.070 }, 00:17:41.070 "claimed": true, 00:17:41.070 "claim_type": "exclusive_write", 00:17:41.070 "zoned": false, 00:17:41.070 "supported_io_types": { 00:17:41.070 "read": true, 00:17:41.070 "write": true, 00:17:41.070 "unmap": true, 00:17:41.070 "flush": true, 00:17:41.070 "reset": true, 00:17:41.070 "nvme_admin": false, 00:17:41.070 "nvme_io": false, 00:17:41.070 "nvme_io_md": false, 00:17:41.070 "write_zeroes": true, 00:17:41.070 "zcopy": true, 00:17:41.070 "get_zone_info": false, 00:17:41.070 "zone_management": false, 00:17:41.070 "zone_append": false, 00:17:41.070 "compare": false, 00:17:41.070 "compare_and_write": false, 00:17:41.070 "abort": true, 00:17:41.070 "seek_hole": false, 00:17:41.070 "seek_data": false, 00:17:41.070 "copy": true, 00:17:41.070 "nvme_iov_md": false 00:17:41.070 }, 00:17:41.070 "memory_domains": [ 00:17:41.070 { 00:17:41.070 "dma_device_id": "system", 00:17:41.070 "dma_device_type": 1 00:17:41.070 }, 00:17:41.070 { 00:17:41.070 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:41.070 "dma_device_type": 2 00:17:41.070 } 00:17:41.070 ], 00:17:41.070 "driver_specific": { 00:17:41.070 "passthru": { 00:17:41.070 "name": "pt3", 00:17:41.070 "base_bdev_name": "malloc3" 00:17:41.070 } 00:17:41.070 } 00:17:41.070 }' 00:17:41.070 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:41.329 22:46:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:41.329 22:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:41.329 22:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:41.329 22:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:41.329 22:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:17:41.329 22:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:41.329 22:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:41.329 22:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:41.329 22:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:41.588 22:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:41.588 22:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:41.588 22:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:41.588 22:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:17:41.847 [2024-07-15 22:46:26.525837] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:41.847 22:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=ccfc3617-e3bb-489d-8a88-d6fd452106a9 00:17:41.847 22:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z ccfc3617-e3bb-489d-8a88-d6fd452106a9 ']' 00:17:41.847 22:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:42.106 [2024-07-15 22:46:26.770209] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:42.106 [2024-07-15 22:46:26.770231] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:42.106 [2024-07-15 22:46:26.770287] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:42.107 [2024-07-15 22:46:26.770359] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:17:42.107 [2024-07-15 22:46:26.770372] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcd3ea0 name raid_bdev1, state offline 00:17:42.107 22:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:42.107 22:46:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:17:42.366 22:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:17:42.366 22:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:17:42.366 22:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:42.366 22:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:42.625 22:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:42.625 22:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:42.625 22:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:42.625 22:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:42.884 22:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:17:42.884 22:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:17:43.143 22:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:17:43.143 22:46:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:43.143 22:46:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:17:43.143 22:46:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:43.143 22:46:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:43.143 22:46:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:43.143 22:46:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:43.143 22:46:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:43.143 22:46:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:43.143 22:46:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:43.143 22:46:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:43.143 22:46:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:17:43.143 22:46:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r 
raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:43.402 [2024-07-15 22:46:28.145872] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:17:43.402 [2024-07-15 22:46:28.147224] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:17:43.402 [2024-07-15 22:46:28.147269] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:17:43.402 [2024-07-15 22:46:28.147316] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:17:43.402 [2024-07-15 22:46:28.147357] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:17:43.402 [2024-07-15 22:46:28.147380] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:17:43.402 [2024-07-15 22:46:28.147398] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:43.402 [2024-07-15 22:46:28.147408] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe7eff0 name raid_bdev1, state configuring 00:17:43.402 request: 00:17:43.402 { 00:17:43.402 "name": "raid_bdev1", 00:17:43.402 "raid_level": "raid1", 00:17:43.402 "base_bdevs": [ 00:17:43.402 "malloc1", 00:17:43.402 "malloc2", 00:17:43.402 "malloc3" 00:17:43.402 ], 00:17:43.402 "superblock": false, 00:17:43.402 "method": "bdev_raid_create", 00:17:43.402 "req_id": 1 00:17:43.402 } 00:17:43.402 Got JSON-RPC error response 00:17:43.402 response: 00:17:43.402 { 00:17:43.402 "code": -17, 00:17:43.402 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:17:43.402 } 00:17:43.402 22:46:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:17:43.402 22:46:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:17:43.402 22:46:28 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@670 -- # [[ -n '' ]] 00:17:43.402 22:46:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:17:43.402 22:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:43.402 22:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:17:43.662 22:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:17:43.662 22:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:17:43.662 22:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:43.921 [2024-07-15 22:46:28.643134] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:43.921 [2024-07-15 22:46:28.643172] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:43.921 [2024-07-15 22:46:28.643192] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcdb7a0 00:17:43.921 [2024-07-15 22:46:28.643205] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:43.921 [2024-07-15 22:46:28.644744] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:43.921 [2024-07-15 22:46:28.644774] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:43.921 [2024-07-15 22:46:28.644837] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:43.921 [2024-07-15 22:46:28.644863] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:43.921 pt1 00:17:43.921 22:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring 
raid1 0 3 00:17:43.921 22:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:43.921 22:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:43.921 22:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:43.921 22:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:43.921 22:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:43.921 22:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:43.921 22:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:43.921 22:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:43.921 22:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:43.921 22:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:43.921 22:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:44.180 22:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:44.180 "name": "raid_bdev1", 00:17:44.180 "uuid": "ccfc3617-e3bb-489d-8a88-d6fd452106a9", 00:17:44.180 "strip_size_kb": 0, 00:17:44.180 "state": "configuring", 00:17:44.180 "raid_level": "raid1", 00:17:44.180 "superblock": true, 00:17:44.180 "num_base_bdevs": 3, 00:17:44.180 "num_base_bdevs_discovered": 1, 00:17:44.180 "num_base_bdevs_operational": 3, 00:17:44.180 "base_bdevs_list": [ 00:17:44.180 { 00:17:44.180 "name": "pt1", 00:17:44.180 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:44.180 "is_configured": true, 00:17:44.180 "data_offset": 2048, 00:17:44.180 
"data_size": 63488 00:17:44.180 }, 00:17:44.180 { 00:17:44.180 "name": null, 00:17:44.180 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:44.180 "is_configured": false, 00:17:44.180 "data_offset": 2048, 00:17:44.180 "data_size": 63488 00:17:44.180 }, 00:17:44.180 { 00:17:44.180 "name": null, 00:17:44.180 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:44.180 "is_configured": false, 00:17:44.180 "data_offset": 2048, 00:17:44.180 "data_size": 63488 00:17:44.180 } 00:17:44.180 ] 00:17:44.180 }' 00:17:44.180 22:46:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:44.180 22:46:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:44.747 22:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:17:44.748 22:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:45.006 [2024-07-15 22:46:29.685918] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:45.006 [2024-07-15 22:46:29.685973] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:45.006 [2024-07-15 22:46:29.685992] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcd2a10 00:17:45.006 [2024-07-15 22:46:29.686005] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:45.006 [2024-07-15 22:46:29.686360] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:45.006 [2024-07-15 22:46:29.686379] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:45.006 [2024-07-15 22:46:29.686443] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:45.006 [2024-07-15 22:46:29.686463] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev pt2 is claimed 00:17:45.006 pt2 00:17:45.006 22:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:45.006 [2024-07-15 22:46:29.870413] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:17:45.006 22:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:45.006 22:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:45.006 22:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:45.006 22:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:45.006 22:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:45.006 22:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:45.006 22:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:45.006 22:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:45.006 22:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:45.006 22:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:45.006 22:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:45.006 22:46:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:45.265 22:46:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:45.265 "name": "raid_bdev1", 00:17:45.265 "uuid": "ccfc3617-e3bb-489d-8a88-d6fd452106a9", 00:17:45.265 "strip_size_kb": 
0, 00:17:45.265 "state": "configuring", 00:17:45.265 "raid_level": "raid1", 00:17:45.265 "superblock": true, 00:17:45.265 "num_base_bdevs": 3, 00:17:45.265 "num_base_bdevs_discovered": 1, 00:17:45.265 "num_base_bdevs_operational": 3, 00:17:45.265 "base_bdevs_list": [ 00:17:45.265 { 00:17:45.265 "name": "pt1", 00:17:45.265 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:45.265 "is_configured": true, 00:17:45.265 "data_offset": 2048, 00:17:45.265 "data_size": 63488 00:17:45.265 }, 00:17:45.265 { 00:17:45.265 "name": null, 00:17:45.265 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:45.265 "is_configured": false, 00:17:45.265 "data_offset": 2048, 00:17:45.265 "data_size": 63488 00:17:45.265 }, 00:17:45.265 { 00:17:45.265 "name": null, 00:17:45.265 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:45.265 "is_configured": false, 00:17:45.265 "data_offset": 2048, 00:17:45.265 "data_size": 63488 00:17:45.265 } 00:17:45.265 ] 00:17:45.265 }' 00:17:45.265 22:46:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:45.265 22:46:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:46.202 22:46:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:17:46.202 22:46:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:46.202 22:46:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:46.460 [2024-07-15 22:46:31.189949] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:46.460 [2024-07-15 22:46:31.190016] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:46.460 [2024-07-15 22:46:31.190041] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcdba10 00:17:46.460 
[2024-07-15 22:46:31.190054] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:46.460 [2024-07-15 22:46:31.190429] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:46.460 [2024-07-15 22:46:31.190450] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:46.460 [2024-07-15 22:46:31.190525] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:46.460 [2024-07-15 22:46:31.190546] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:46.460 pt2 00:17:46.460 22:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:46.460 22:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:46.460 22:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:46.719 [2024-07-15 22:46:31.430582] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:46.719 [2024-07-15 22:46:31.430616] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:46.719 [2024-07-15 22:46:31.430632] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcd26c0 00:17:46.719 [2024-07-15 22:46:31.430644] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:46.719 [2024-07-15 22:46:31.430968] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:46.719 [2024-07-15 22:46:31.430988] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:46.719 [2024-07-15 22:46:31.431043] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:46.719 [2024-07-15 22:46:31.431061] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev pt3 is claimed 00:17:46.719 [2024-07-15 22:46:31.431171] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe75c00 00:17:46.719 [2024-07-15 22:46:31.431182] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:46.719 [2024-07-15 22:46:31.431346] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcd5610 00:17:46.719 [2024-07-15 22:46:31.431476] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe75c00 00:17:46.719 [2024-07-15 22:46:31.431486] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe75c00 00:17:46.719 [2024-07-15 22:46:31.431585] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:46.719 pt3 00:17:46.719 22:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:46.719 22:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:46.719 22:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:46.719 22:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:46.719 22:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:46.719 22:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:46.719 22:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:46.719 22:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:46.719 22:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:46.719 22:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:46.719 22:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:17:46.719 22:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:46.719 22:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:46.719 22:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:46.977 22:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:46.977 "name": "raid_bdev1", 00:17:46.977 "uuid": "ccfc3617-e3bb-489d-8a88-d6fd452106a9", 00:17:46.977 "strip_size_kb": 0, 00:17:46.977 "state": "online", 00:17:46.977 "raid_level": "raid1", 00:17:46.977 "superblock": true, 00:17:46.977 "num_base_bdevs": 3, 00:17:46.977 "num_base_bdevs_discovered": 3, 00:17:46.977 "num_base_bdevs_operational": 3, 00:17:46.977 "base_bdevs_list": [ 00:17:46.977 { 00:17:46.977 "name": "pt1", 00:17:46.977 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:46.977 "is_configured": true, 00:17:46.977 "data_offset": 2048, 00:17:46.977 "data_size": 63488 00:17:46.977 }, 00:17:46.977 { 00:17:46.977 "name": "pt2", 00:17:46.977 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:46.977 "is_configured": true, 00:17:46.977 "data_offset": 2048, 00:17:46.977 "data_size": 63488 00:17:46.977 }, 00:17:46.977 { 00:17:46.977 "name": "pt3", 00:17:46.977 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:46.977 "is_configured": true, 00:17:46.977 "data_offset": 2048, 00:17:46.977 "data_size": 63488 00:17:46.977 } 00:17:46.977 ] 00:17:46.977 }' 00:17:46.977 22:46:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:46.977 22:46:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:47.543 22:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:17:47.543 22:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 
-- # local raid_bdev_name=raid_bdev1 00:17:47.543 22:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:47.543 22:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:47.543 22:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:47.543 22:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:47.543 22:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:47.543 22:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:47.801 [2024-07-15 22:46:32.573899] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:47.801 22:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:47.801 "name": "raid_bdev1", 00:17:47.801 "aliases": [ 00:17:47.801 "ccfc3617-e3bb-489d-8a88-d6fd452106a9" 00:17:47.801 ], 00:17:47.801 "product_name": "Raid Volume", 00:17:47.801 "block_size": 512, 00:17:47.801 "num_blocks": 63488, 00:17:47.801 "uuid": "ccfc3617-e3bb-489d-8a88-d6fd452106a9", 00:17:47.801 "assigned_rate_limits": { 00:17:47.801 "rw_ios_per_sec": 0, 00:17:47.801 "rw_mbytes_per_sec": 0, 00:17:47.801 "r_mbytes_per_sec": 0, 00:17:47.801 "w_mbytes_per_sec": 0 00:17:47.801 }, 00:17:47.801 "claimed": false, 00:17:47.801 "zoned": false, 00:17:47.801 "supported_io_types": { 00:17:47.801 "read": true, 00:17:47.801 "write": true, 00:17:47.801 "unmap": false, 00:17:47.801 "flush": false, 00:17:47.801 "reset": true, 00:17:47.801 "nvme_admin": false, 00:17:47.801 "nvme_io": false, 00:17:47.801 "nvme_io_md": false, 00:17:47.801 "write_zeroes": true, 00:17:47.801 "zcopy": false, 00:17:47.801 "get_zone_info": false, 00:17:47.801 "zone_management": false, 00:17:47.801 "zone_append": false, 00:17:47.801 "compare": false, 
00:17:47.801 "compare_and_write": false, 00:17:47.801 "abort": false, 00:17:47.801 "seek_hole": false, 00:17:47.801 "seek_data": false, 00:17:47.801 "copy": false, 00:17:47.801 "nvme_iov_md": false 00:17:47.801 }, 00:17:47.801 "memory_domains": [ 00:17:47.801 { 00:17:47.801 "dma_device_id": "system", 00:17:47.801 "dma_device_type": 1 00:17:47.801 }, 00:17:47.801 { 00:17:47.801 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.801 "dma_device_type": 2 00:17:47.801 }, 00:17:47.801 { 00:17:47.801 "dma_device_id": "system", 00:17:47.801 "dma_device_type": 1 00:17:47.801 }, 00:17:47.801 { 00:17:47.801 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.801 "dma_device_type": 2 00:17:47.801 }, 00:17:47.801 { 00:17:47.801 "dma_device_id": "system", 00:17:47.801 "dma_device_type": 1 00:17:47.801 }, 00:17:47.801 { 00:17:47.801 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.801 "dma_device_type": 2 00:17:47.801 } 00:17:47.801 ], 00:17:47.801 "driver_specific": { 00:17:47.801 "raid": { 00:17:47.801 "uuid": "ccfc3617-e3bb-489d-8a88-d6fd452106a9", 00:17:47.801 "strip_size_kb": 0, 00:17:47.801 "state": "online", 00:17:47.801 "raid_level": "raid1", 00:17:47.801 "superblock": true, 00:17:47.801 "num_base_bdevs": 3, 00:17:47.801 "num_base_bdevs_discovered": 3, 00:17:47.801 "num_base_bdevs_operational": 3, 00:17:47.801 "base_bdevs_list": [ 00:17:47.801 { 00:17:47.801 "name": "pt1", 00:17:47.801 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:47.801 "is_configured": true, 00:17:47.801 "data_offset": 2048, 00:17:47.801 "data_size": 63488 00:17:47.801 }, 00:17:47.801 { 00:17:47.801 "name": "pt2", 00:17:47.801 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:47.802 "is_configured": true, 00:17:47.802 "data_offset": 2048, 00:17:47.802 "data_size": 63488 00:17:47.802 }, 00:17:47.802 { 00:17:47.802 "name": "pt3", 00:17:47.802 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:47.802 "is_configured": true, 00:17:47.802 "data_offset": 2048, 00:17:47.802 "data_size": 63488 
00:17:47.802 } 00:17:47.802 ] 00:17:47.802 } 00:17:47.802 } 00:17:47.802 }' 00:17:47.802 22:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:47.802 22:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:47.802 pt2 00:17:47.802 pt3' 00:17:47.802 22:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:47.802 22:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:47.802 22:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:48.059 22:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:48.059 "name": "pt1", 00:17:48.059 "aliases": [ 00:17:48.059 "00000000-0000-0000-0000-000000000001" 00:17:48.059 ], 00:17:48.059 "product_name": "passthru", 00:17:48.059 "block_size": 512, 00:17:48.059 "num_blocks": 65536, 00:17:48.059 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:48.059 "assigned_rate_limits": { 00:17:48.059 "rw_ios_per_sec": 0, 00:17:48.059 "rw_mbytes_per_sec": 0, 00:17:48.059 "r_mbytes_per_sec": 0, 00:17:48.059 "w_mbytes_per_sec": 0 00:17:48.059 }, 00:17:48.059 "claimed": true, 00:17:48.059 "claim_type": "exclusive_write", 00:17:48.059 "zoned": false, 00:17:48.059 "supported_io_types": { 00:17:48.059 "read": true, 00:17:48.059 "write": true, 00:17:48.059 "unmap": true, 00:17:48.059 "flush": true, 00:17:48.059 "reset": true, 00:17:48.059 "nvme_admin": false, 00:17:48.059 "nvme_io": false, 00:17:48.059 "nvme_io_md": false, 00:17:48.059 "write_zeroes": true, 00:17:48.059 "zcopy": true, 00:17:48.059 "get_zone_info": false, 00:17:48.059 "zone_management": false, 00:17:48.059 "zone_append": false, 00:17:48.059 "compare": false, 00:17:48.059 "compare_and_write": false, 
00:17:48.059 "abort": true, 00:17:48.059 "seek_hole": false, 00:17:48.059 "seek_data": false, 00:17:48.059 "copy": true, 00:17:48.059 "nvme_iov_md": false 00:17:48.059 }, 00:17:48.059 "memory_domains": [ 00:17:48.059 { 00:17:48.059 "dma_device_id": "system", 00:17:48.060 "dma_device_type": 1 00:17:48.060 }, 00:17:48.060 { 00:17:48.060 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:48.060 "dma_device_type": 2 00:17:48.060 } 00:17:48.060 ], 00:17:48.060 "driver_specific": { 00:17:48.060 "passthru": { 00:17:48.060 "name": "pt1", 00:17:48.060 "base_bdev_name": "malloc1" 00:17:48.060 } 00:17:48.060 } 00:17:48.060 }' 00:17:48.060 22:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:48.317 22:46:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:48.317 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:48.317 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:48.317 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:48.317 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:48.317 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:48.317 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:48.317 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:48.317 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:48.575 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:48.575 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:48.575 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:48.575 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:48.575 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:48.833 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:48.833 "name": "pt2", 00:17:48.833 "aliases": [ 00:17:48.833 "00000000-0000-0000-0000-000000000002" 00:17:48.833 ], 00:17:48.833 "product_name": "passthru", 00:17:48.833 "block_size": 512, 00:17:48.833 "num_blocks": 65536, 00:17:48.833 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:48.833 "assigned_rate_limits": { 00:17:48.833 "rw_ios_per_sec": 0, 00:17:48.833 "rw_mbytes_per_sec": 0, 00:17:48.833 "r_mbytes_per_sec": 0, 00:17:48.833 "w_mbytes_per_sec": 0 00:17:48.833 }, 00:17:48.833 "claimed": true, 00:17:48.833 "claim_type": "exclusive_write", 00:17:48.833 "zoned": false, 00:17:48.833 "supported_io_types": { 00:17:48.833 "read": true, 00:17:48.833 "write": true, 00:17:48.833 "unmap": true, 00:17:48.833 "flush": true, 00:17:48.833 "reset": true, 00:17:48.833 "nvme_admin": false, 00:17:48.833 "nvme_io": false, 00:17:48.833 "nvme_io_md": false, 00:17:48.833 "write_zeroes": true, 00:17:48.833 "zcopy": true, 00:17:48.833 "get_zone_info": false, 00:17:48.833 "zone_management": false, 00:17:48.833 "zone_append": false, 00:17:48.833 "compare": false, 00:17:48.833 "compare_and_write": false, 00:17:48.833 "abort": true, 00:17:48.833 "seek_hole": false, 00:17:48.833 "seek_data": false, 00:17:48.833 "copy": true, 00:17:48.833 "nvme_iov_md": false 00:17:48.833 }, 00:17:48.833 "memory_domains": [ 00:17:48.833 { 00:17:48.833 "dma_device_id": "system", 00:17:48.833 "dma_device_type": 1 00:17:48.833 }, 00:17:48.833 { 00:17:48.833 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:48.833 "dma_device_type": 2 00:17:48.833 } 00:17:48.833 ], 00:17:48.833 "driver_specific": { 00:17:48.833 "passthru": { 00:17:48.833 "name": "pt2", 00:17:48.833 "base_bdev_name": "malloc2" 
00:17:48.833 } 00:17:48.833 } 00:17:48.833 }' 00:17:48.833 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:48.833 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:48.833 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:48.833 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:48.833 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:48.833 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:48.833 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:49.113 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:49.113 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:49.113 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:49.113 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:49.113 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:49.113 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:49.113 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:49.113 22:46:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:49.371 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:49.371 "name": "pt3", 00:17:49.371 "aliases": [ 00:17:49.371 "00000000-0000-0000-0000-000000000003" 00:17:49.371 ], 00:17:49.371 "product_name": "passthru", 00:17:49.371 "block_size": 512, 00:17:49.371 "num_blocks": 65536, 00:17:49.371 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:17:49.371 "assigned_rate_limits": { 00:17:49.371 "rw_ios_per_sec": 0, 00:17:49.371 "rw_mbytes_per_sec": 0, 00:17:49.371 "r_mbytes_per_sec": 0, 00:17:49.371 "w_mbytes_per_sec": 0 00:17:49.371 }, 00:17:49.371 "claimed": true, 00:17:49.371 "claim_type": "exclusive_write", 00:17:49.371 "zoned": false, 00:17:49.371 "supported_io_types": { 00:17:49.371 "read": true, 00:17:49.371 "write": true, 00:17:49.371 "unmap": true, 00:17:49.371 "flush": true, 00:17:49.371 "reset": true, 00:17:49.371 "nvme_admin": false, 00:17:49.371 "nvme_io": false, 00:17:49.371 "nvme_io_md": false, 00:17:49.371 "write_zeroes": true, 00:17:49.371 "zcopy": true, 00:17:49.371 "get_zone_info": false, 00:17:49.371 "zone_management": false, 00:17:49.371 "zone_append": false, 00:17:49.371 "compare": false, 00:17:49.371 "compare_and_write": false, 00:17:49.371 "abort": true, 00:17:49.371 "seek_hole": false, 00:17:49.371 "seek_data": false, 00:17:49.371 "copy": true, 00:17:49.371 "nvme_iov_md": false 00:17:49.371 }, 00:17:49.371 "memory_domains": [ 00:17:49.371 { 00:17:49.371 "dma_device_id": "system", 00:17:49.371 "dma_device_type": 1 00:17:49.371 }, 00:17:49.371 { 00:17:49.371 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:49.371 "dma_device_type": 2 00:17:49.371 } 00:17:49.371 ], 00:17:49.371 "driver_specific": { 00:17:49.371 "passthru": { 00:17:49.371 "name": "pt3", 00:17:49.371 "base_bdev_name": "malloc3" 00:17:49.371 } 00:17:49.371 } 00:17:49.371 }' 00:17:49.371 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:49.371 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:49.371 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:49.371 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:49.371 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:49.629 22:46:34 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:49.629 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:49.629 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:49.629 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:49.629 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:49.629 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:49.629 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:49.629 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:17:49.629 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:49.887 [2024-07-15 22:46:34.691504] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:49.887 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' ccfc3617-e3bb-489d-8a88-d6fd452106a9 '!=' ccfc3617-e3bb-489d-8a88-d6fd452106a9 ']' 00:17:49.887 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:17:49.887 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:49.887 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:49.887 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:50.145 [2024-07-15 22:46:34.939917] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:17:50.145 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 
00:17:50.145 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:50.145 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:50.145 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:50.145 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:50.145 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:50.145 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:50.145 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:50.145 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:50.145 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:50.145 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:50.145 22:46:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:50.402 22:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:50.402 "name": "raid_bdev1", 00:17:50.402 "uuid": "ccfc3617-e3bb-489d-8a88-d6fd452106a9", 00:17:50.402 "strip_size_kb": 0, 00:17:50.402 "state": "online", 00:17:50.402 "raid_level": "raid1", 00:17:50.402 "superblock": true, 00:17:50.402 "num_base_bdevs": 3, 00:17:50.402 "num_base_bdevs_discovered": 2, 00:17:50.402 "num_base_bdevs_operational": 2, 00:17:50.402 "base_bdevs_list": [ 00:17:50.402 { 00:17:50.402 "name": null, 00:17:50.402 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:50.402 "is_configured": false, 00:17:50.402 "data_offset": 2048, 00:17:50.402 "data_size": 63488 
00:17:50.402 }, 00:17:50.402 { 00:17:50.402 "name": "pt2", 00:17:50.402 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:50.402 "is_configured": true, 00:17:50.402 "data_offset": 2048, 00:17:50.402 "data_size": 63488 00:17:50.402 }, 00:17:50.402 { 00:17:50.402 "name": "pt3", 00:17:50.402 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:50.402 "is_configured": true, 00:17:50.402 "data_offset": 2048, 00:17:50.402 "data_size": 63488 00:17:50.402 } 00:17:50.402 ] 00:17:50.402 }' 00:17:50.402 22:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:50.402 22:46:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:51.010 22:46:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:51.268 [2024-07-15 22:46:36.014748] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:51.268 [2024-07-15 22:46:36.014773] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:51.268 [2024-07-15 22:46:36.014820] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:51.268 [2024-07-15 22:46:36.014871] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:51.268 [2024-07-15 22:46:36.014883] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe75c00 name raid_bdev1, state offline 00:17:51.268 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:51.268 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:17:51.525 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:17:51.525 22:46:36 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:17:51.525 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:17:51.525 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:51.525 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:51.782 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:17:51.782 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:51.782 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:51.782 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:17:51.782 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:51.782 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:17:51.782 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:17:51.782 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:52.040 [2024-07-15 22:46:36.868957] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:52.040 [2024-07-15 22:46:36.869002] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:52.040 [2024-07-15 22:46:36.869019] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcd3310 00:17:52.040 [2024-07-15 22:46:36.869032] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:52.040 
[2024-07-15 22:46:36.870632] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:52.040 [2024-07-15 22:46:36.870663] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:52.040 [2024-07-15 22:46:36.870733] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:52.040 [2024-07-15 22:46:36.870759] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:52.040 pt2 00:17:52.040 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:17:52.040 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:52.040 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:52.040 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:52.040 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:52.040 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:52.040 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:52.040 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:52.040 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:52.040 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:52.040 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:52.040 22:46:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:52.298 22:46:37 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:52.298 "name": "raid_bdev1", 00:17:52.298 "uuid": "ccfc3617-e3bb-489d-8a88-d6fd452106a9", 00:17:52.298 "strip_size_kb": 0, 00:17:52.298 "state": "configuring", 00:17:52.298 "raid_level": "raid1", 00:17:52.298 "superblock": true, 00:17:52.298 "num_base_bdevs": 3, 00:17:52.298 "num_base_bdevs_discovered": 1, 00:17:52.298 "num_base_bdevs_operational": 2, 00:17:52.298 "base_bdevs_list": [ 00:17:52.298 { 00:17:52.298 "name": null, 00:17:52.298 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:52.298 "is_configured": false, 00:17:52.298 "data_offset": 2048, 00:17:52.298 "data_size": 63488 00:17:52.298 }, 00:17:52.298 { 00:17:52.298 "name": "pt2", 00:17:52.298 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:52.298 "is_configured": true, 00:17:52.298 "data_offset": 2048, 00:17:52.298 "data_size": 63488 00:17:52.298 }, 00:17:52.298 { 00:17:52.298 "name": null, 00:17:52.298 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:52.298 "is_configured": false, 00:17:52.298 "data_offset": 2048, 00:17:52.298 "data_size": 63488 00:17:52.298 } 00:17:52.298 ] 00:17:52.298 }' 00:17:52.298 22:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:52.298 22:46:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:52.865 22:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:17:52.865 22:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:17:52.865 22:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:17:52.865 22:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:53.124 [2024-07-15 22:46:37.811459] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 
00:17:53.124 [2024-07-15 22:46:37.811508] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:53.124 [2024-07-15 22:46:37.811529] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcd1ec0 00:17:53.124 [2024-07-15 22:46:37.811541] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:53.124 [2024-07-15 22:46:37.811893] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:53.124 [2024-07-15 22:46:37.811912] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:53.124 [2024-07-15 22:46:37.811985] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:53.124 [2024-07-15 22:46:37.812004] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:53.124 [2024-07-15 22:46:37.812106] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe73cc0 00:17:53.124 [2024-07-15 22:46:37.812117] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:53.124 [2024-07-15 22:46:37.812283] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe746d0 00:17:53.124 [2024-07-15 22:46:37.812408] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe73cc0 00:17:53.124 [2024-07-15 22:46:37.812419] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe73cc0 00:17:53.124 [2024-07-15 22:46:37.812515] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:53.124 pt3 00:17:53.124 22:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:53.124 22:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:53.124 22:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:53.124 22:46:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:53.124 22:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:53.124 22:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:53.124 22:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:53.124 22:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:53.124 22:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:53.124 22:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:53.124 22:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:53.124 22:46:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:53.383 22:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:53.383 "name": "raid_bdev1", 00:17:53.383 "uuid": "ccfc3617-e3bb-489d-8a88-d6fd452106a9", 00:17:53.383 "strip_size_kb": 0, 00:17:53.383 "state": "online", 00:17:53.383 "raid_level": "raid1", 00:17:53.383 "superblock": true, 00:17:53.383 "num_base_bdevs": 3, 00:17:53.383 "num_base_bdevs_discovered": 2, 00:17:53.383 "num_base_bdevs_operational": 2, 00:17:53.383 "base_bdevs_list": [ 00:17:53.383 { 00:17:53.383 "name": null, 00:17:53.383 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.383 "is_configured": false, 00:17:53.383 "data_offset": 2048, 00:17:53.383 "data_size": 63488 00:17:53.383 }, 00:17:53.383 { 00:17:53.383 "name": "pt2", 00:17:53.383 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:53.383 "is_configured": true, 00:17:53.383 "data_offset": 2048, 00:17:53.383 "data_size": 63488 00:17:53.383 }, 00:17:53.383 { 
00:17:53.383 "name": "pt3", 00:17:53.383 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:53.383 "is_configured": true, 00:17:53.383 "data_offset": 2048, 00:17:53.383 "data_size": 63488 00:17:53.383 } 00:17:53.383 ] 00:17:53.383 }' 00:17:53.383 22:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:53.383 22:46:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:53.949 22:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:54.207 [2024-07-15 22:46:38.922408] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:54.207 [2024-07-15 22:46:38.922433] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:54.207 [2024-07-15 22:46:38.922491] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:54.207 [2024-07-15 22:46:38.922545] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:54.207 [2024-07-15 22:46:38.922557] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe73cc0 name raid_bdev1, state offline 00:17:54.207 22:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.207 22:46:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:17:54.466 22:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:17:54.466 22:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:17:54.466 22:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 00:17:54.466 22:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:17:54.466 22:46:39 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:54.466 22:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:54.725 [2024-07-15 22:46:39.483876] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:54.725 [2024-07-15 22:46:39.483920] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:54.725 [2024-07-15 22:46:39.483944] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcd1ec0 00:17:54.725 [2024-07-15 22:46:39.483957] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:54.725 [2024-07-15 22:46:39.485564] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:54.725 [2024-07-15 22:46:39.485594] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:54.725 [2024-07-15 22:46:39.485660] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:54.725 [2024-07-15 22:46:39.485686] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:54.725 [2024-07-15 22:46:39.485780] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:17:54.725 [2024-07-15 22:46:39.485794] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:54.725 [2024-07-15 22:46:39.485807] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe73f40 name raid_bdev1, state configuring 00:17:54.725 [2024-07-15 22:46:39.485830] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:54.725 pt1 00:17:54.725 22:46:39 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:17:54.725 22:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:17:54.725 22:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:54.725 22:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:54.725 22:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:54.725 22:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:54.725 22:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:54.725 22:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:54.725 22:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:54.725 22:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:54.725 22:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:54.725 22:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:54.725 22:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.983 22:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:54.983 "name": "raid_bdev1", 00:17:54.983 "uuid": "ccfc3617-e3bb-489d-8a88-d6fd452106a9", 00:17:54.983 "strip_size_kb": 0, 00:17:54.983 "state": "configuring", 00:17:54.983 "raid_level": "raid1", 00:17:54.983 "superblock": true, 00:17:54.983 "num_base_bdevs": 3, 00:17:54.983 "num_base_bdevs_discovered": 1, 00:17:54.983 "num_base_bdevs_operational": 2, 00:17:54.983 
"base_bdevs_list": [ 00:17:54.983 { 00:17:54.983 "name": null, 00:17:54.983 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.983 "is_configured": false, 00:17:54.983 "data_offset": 2048, 00:17:54.983 "data_size": 63488 00:17:54.983 }, 00:17:54.983 { 00:17:54.983 "name": "pt2", 00:17:54.983 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:54.983 "is_configured": true, 00:17:54.983 "data_offset": 2048, 00:17:54.983 "data_size": 63488 00:17:54.983 }, 00:17:54.983 { 00:17:54.983 "name": null, 00:17:54.983 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:54.983 "is_configured": false, 00:17:54.983 "data_offset": 2048, 00:17:54.983 "data_size": 63488 00:17:54.983 } 00:17:54.983 ] 00:17:54.983 }' 00:17:54.983 22:46:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:54.983 22:46:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:55.549 22:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:17:55.549 22:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:55.809 22:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:17:55.809 22:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:55.809 [2024-07-15 22:46:40.687088] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:55.809 [2024-07-15 22:46:40.687156] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:55.809 [2024-07-15 22:46:40.687177] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcd50c0 00:17:55.809 [2024-07-15 
22:46:40.687190] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:55.809 [2024-07-15 22:46:40.687572] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:55.809 [2024-07-15 22:46:40.687592] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:55.809 [2024-07-15 22:46:40.687664] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:55.809 [2024-07-15 22:46:40.687684] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:55.809 [2024-07-15 22:46:40.687794] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcd5a40 00:17:55.809 [2024-07-15 22:46:40.687805] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:55.809 [2024-07-15 22:46:40.687991] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe746c0 00:17:55.809 [2024-07-15 22:46:40.688123] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcd5a40 00:17:55.809 [2024-07-15 22:46:40.688133] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xcd5a40 00:17:55.809 [2024-07-15 22:46:40.688234] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:55.809 pt3 00:17:55.809 22:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:55.809 22:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:55.809 22:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:55.809 22:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:55.809 22:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:55.809 22:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:17:55.809 22:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:55.809 22:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:55.809 22:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:55.809 22:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:55.809 22:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:55.809 22:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:56.101 22:46:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:56.101 "name": "raid_bdev1", 00:17:56.101 "uuid": "ccfc3617-e3bb-489d-8a88-d6fd452106a9", 00:17:56.101 "strip_size_kb": 0, 00:17:56.101 "state": "online", 00:17:56.101 "raid_level": "raid1", 00:17:56.101 "superblock": true, 00:17:56.101 "num_base_bdevs": 3, 00:17:56.101 "num_base_bdevs_discovered": 2, 00:17:56.101 "num_base_bdevs_operational": 2, 00:17:56.101 "base_bdevs_list": [ 00:17:56.101 { 00:17:56.101 "name": null, 00:17:56.101 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:56.101 "is_configured": false, 00:17:56.101 "data_offset": 2048, 00:17:56.101 "data_size": 63488 00:17:56.101 }, 00:17:56.101 { 00:17:56.101 "name": "pt2", 00:17:56.101 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:56.101 "is_configured": true, 00:17:56.101 "data_offset": 2048, 00:17:56.101 "data_size": 63488 00:17:56.101 }, 00:17:56.101 { 00:17:56.101 "name": "pt3", 00:17:56.101 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:56.101 "is_configured": true, 00:17:56.101 "data_offset": 2048, 00:17:56.101 "data_size": 63488 00:17:56.101 } 00:17:56.101 ] 00:17:56.101 }' 00:17:56.101 22:46:40 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:56.101 22:46:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:57.037 22:46:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:17:57.037 22:46:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:57.295 22:46:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:17:57.295 22:46:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:57.295 22:46:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:17:57.553 [2024-07-15 22:46:42.223410] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:57.553 22:46:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' ccfc3617-e3bb-489d-8a88-d6fd452106a9 '!=' ccfc3617-e3bb-489d-8a88-d6fd452106a9 ']' 00:17:57.553 22:46:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2754851 00:17:57.553 22:46:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2754851 ']' 00:17:57.553 22:46:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2754851 00:17:57.553 22:46:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:17:57.553 22:46:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:57.553 22:46:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2754851 00:17:57.553 22:46:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:57.553 22:46:42 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:57.553 22:46:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2754851' 00:17:57.553 killing process with pid 2754851 00:17:57.553 22:46:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2754851 00:17:57.553 [2024-07-15 22:46:42.314887] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:57.553 [2024-07-15 22:46:42.314952] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:57.553 [2024-07-15 22:46:42.315010] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:57.553 [2024-07-15 22:46:42.315023] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcd5a40 name raid_bdev1, state offline 00:17:57.553 22:46:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2754851 00:17:57.553 [2024-07-15 22:46:42.344796] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:57.812 22:46:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:17:57.812 00:17:57.812 real 0m21.740s 00:17:57.812 user 0m39.607s 00:17:57.812 sys 0m3.975s 00:17:57.812 22:46:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:57.812 22:46:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:57.812 ************************************ 00:17:57.812 END TEST raid_superblock_test 00:17:57.812 ************************************ 00:17:57.812 22:46:42 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:57.812 22:46:42 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:17:57.812 22:46:42 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:57.812 22:46:42 bdev_raid -- common/autotest_common.sh@1105 -- # 
xtrace_disable 00:17:57.812 22:46:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:57.812 ************************************ 00:17:57.812 START TEST raid_read_error_test 00:17:57.812 ************************************ 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 read 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # 
local base_bdevs 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.adMv4r4r4U 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2758121 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2758121 /var/tmp/spdk-raid.sock 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2758121 ']' 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:17:57.812 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:57.812 22:46:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:58.072 [2024-07-15 22:46:42.722352] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:17:58.072 [2024-07-15 22:46:42.722420] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2758121 ] 00:17:58.072 [2024-07-15 22:46:42.852780] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:58.072 [2024-07-15 22:46:42.959711] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:58.331 [2024-07-15 22:46:43.024005] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:58.331 [2024-07-15 22:46:43.024041] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:58.897 22:46:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:58.897 22:46:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:58.897 22:46:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:58.897 22:46:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:59.156 BaseBdev1_malloc 00:17:59.156 22:46:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:59.414 true 00:17:59.414 22:46:44 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:59.674 [2024-07-15 22:46:44.377093] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:59.674 [2024-07-15 22:46:44.377140] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:59.674 [2024-07-15 22:46:44.377163] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16930d0 00:17:59.674 [2024-07-15 22:46:44.377176] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:59.674 [2024-07-15 22:46:44.379091] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:59.674 [2024-07-15 22:46:44.379123] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:59.674 BaseBdev1 00:17:59.674 22:46:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:59.674 22:46:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:59.933 BaseBdev2_malloc 00:17:59.934 22:46:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:00.192 true 00:18:00.192 22:46:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:00.452 [2024-07-15 22:46:45.112896] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:00.452 [2024-07-15 22:46:45.112946] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:00.452 [2024-07-15 22:46:45.112968] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1697910 00:18:00.452 [2024-07-15 22:46:45.112981] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:00.452 [2024-07-15 22:46:45.114531] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:00.452 [2024-07-15 22:46:45.114561] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:00.452 BaseBdev2 00:18:00.452 22:46:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:00.452 22:46:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:00.712 BaseBdev3_malloc 00:18:00.712 22:46:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:00.712 true 00:18:00.712 22:46:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:00.971 [2024-07-15 22:46:45.839425] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:00.971 [2024-07-15 22:46:45.839472] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:00.971 [2024-07-15 22:46:45.839495] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1699bd0 00:18:00.971 [2024-07-15 22:46:45.839507] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:00.971 [2024-07-15 22:46:45.841103] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:18:00.971 [2024-07-15 22:46:45.841134] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:00.971 BaseBdev3 00:18:00.971 22:46:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:18:01.230 [2024-07-15 22:46:46.080091] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:01.230 [2024-07-15 22:46:46.081449] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:01.230 [2024-07-15 22:46:46.081518] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:01.230 [2024-07-15 22:46:46.081731] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x169b280 00:18:01.230 [2024-07-15 22:46:46.081743] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:01.230 [2024-07-15 22:46:46.081956] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x169ae20 00:18:01.230 [2024-07-15 22:46:46.082111] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x169b280 00:18:01.230 [2024-07-15 22:46:46.082121] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x169b280 00:18:01.230 [2024-07-15 22:46:46.082227] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:01.230 22:46:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:01.231 22:46:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:01.231 22:46:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:01.231 22:46:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid1 00:18:01.231 22:46:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:01.231 22:46:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:01.231 22:46:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:01.231 22:46:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:01.231 22:46:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:01.231 22:46:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:01.231 22:46:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.231 22:46:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:01.490 22:46:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:01.490 "name": "raid_bdev1", 00:18:01.490 "uuid": "139d7da9-ce83-40ef-9d03-91c8a8611e50", 00:18:01.490 "strip_size_kb": 0, 00:18:01.490 "state": "online", 00:18:01.490 "raid_level": "raid1", 00:18:01.490 "superblock": true, 00:18:01.490 "num_base_bdevs": 3, 00:18:01.490 "num_base_bdevs_discovered": 3, 00:18:01.490 "num_base_bdevs_operational": 3, 00:18:01.490 "base_bdevs_list": [ 00:18:01.490 { 00:18:01.490 "name": "BaseBdev1", 00:18:01.490 "uuid": "2a4fa690-0376-549a-9692-f40aef613d08", 00:18:01.490 "is_configured": true, 00:18:01.490 "data_offset": 2048, 00:18:01.490 "data_size": 63488 00:18:01.490 }, 00:18:01.490 { 00:18:01.490 "name": "BaseBdev2", 00:18:01.490 "uuid": "b811603a-9d94-575c-b313-c01b8f6b7358", 00:18:01.490 "is_configured": true, 00:18:01.490 "data_offset": 2048, 00:18:01.490 "data_size": 63488 00:18:01.490 }, 00:18:01.490 { 00:18:01.490 "name": "BaseBdev3", 00:18:01.490 "uuid": 
"51e7ba84-1a27-5aa0-804e-dcd1ad63b7b7", 00:18:01.490 "is_configured": true, 00:18:01.490 "data_offset": 2048, 00:18:01.490 "data_size": 63488 00:18:01.490 } 00:18:01.490 ] 00:18:01.490 }' 00:18:01.490 22:46:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:01.490 22:46:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:02.427 22:46:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:18:02.427 22:46:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:02.427 [2024-07-15 22:46:47.131141] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14e8e00 00:18:03.365 22:46:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:18:03.626 22:46:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:18:03.626 22:46:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:18:03.626 22:46:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:18:03.626 22:46:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:18:03.626 22:46:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:03.626 22:46:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:03.626 22:46:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:03.626 22:46:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:03.626 22:46:48 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:03.626 22:46:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:03.626 22:46:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:03.626 22:46:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:03.626 22:46:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:03.626 22:46:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:03.626 22:46:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.626 22:46:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:03.885 22:46:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:03.885 "name": "raid_bdev1", 00:18:03.885 "uuid": "139d7da9-ce83-40ef-9d03-91c8a8611e50", 00:18:03.885 "strip_size_kb": 0, 00:18:03.885 "state": "online", 00:18:03.885 "raid_level": "raid1", 00:18:03.885 "superblock": true, 00:18:03.885 "num_base_bdevs": 3, 00:18:03.885 "num_base_bdevs_discovered": 3, 00:18:03.886 "num_base_bdevs_operational": 3, 00:18:03.886 "base_bdevs_list": [ 00:18:03.886 { 00:18:03.886 "name": "BaseBdev1", 00:18:03.886 "uuid": "2a4fa690-0376-549a-9692-f40aef613d08", 00:18:03.886 "is_configured": true, 00:18:03.886 "data_offset": 2048, 00:18:03.886 "data_size": 63488 00:18:03.886 }, 00:18:03.886 { 00:18:03.886 "name": "BaseBdev2", 00:18:03.886 "uuid": "b811603a-9d94-575c-b313-c01b8f6b7358", 00:18:03.886 "is_configured": true, 00:18:03.886 "data_offset": 2048, 00:18:03.886 "data_size": 63488 00:18:03.886 }, 00:18:03.886 { 00:18:03.886 "name": "BaseBdev3", 00:18:03.886 "uuid": "51e7ba84-1a27-5aa0-804e-dcd1ad63b7b7", 00:18:03.886 "is_configured": true, 
00:18:03.886 "data_offset": 2048, 00:18:03.886 "data_size": 63488 00:18:03.886 } 00:18:03.886 ] 00:18:03.886 }' 00:18:03.886 22:46:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:03.886 22:46:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:04.454 22:46:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:04.714 [2024-07-15 22:46:49.384164] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:04.714 [2024-07-15 22:46:49.384200] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:04.714 [2024-07-15 22:46:49.387365] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:04.714 [2024-07-15 22:46:49.387399] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:04.714 [2024-07-15 22:46:49.387496] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:04.714 [2024-07-15 22:46:49.387508] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x169b280 name raid_bdev1, state offline 00:18:04.714 0 00:18:04.714 22:46:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2758121 00:18:04.714 22:46:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2758121 ']' 00:18:04.714 22:46:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2758121 00:18:04.714 22:46:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:18:04.714 22:46:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:04.714 22:46:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2758121 00:18:04.714 22:46:49 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:04.714 22:46:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:04.714 22:46:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2758121' 00:18:04.714 killing process with pid 2758121 00:18:04.714 22:46:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2758121 00:18:04.714 [2024-07-15 22:46:49.469741] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:04.714 22:46:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2758121 00:18:04.714 [2024-07-15 22:46:49.490916] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:04.974 22:46:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.adMv4r4r4U 00:18:04.974 22:46:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:04.974 22:46:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:18:04.974 22:46:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:18:04.974 22:46:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:18:04.974 22:46:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:04.974 22:46:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:04.974 22:46:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:18:04.974 00:18:04.974 real 0m7.084s 00:18:04.974 user 0m11.178s 00:18:04.974 sys 0m1.303s 00:18:04.974 22:46:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:04.974 22:46:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:04.974 ************************************ 00:18:04.974 END TEST 
raid_read_error_test 00:18:04.974 ************************************ 00:18:04.974 22:46:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:04.974 22:46:49 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:18:04.974 22:46:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:04.974 22:46:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:04.974 22:46:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:04.974 ************************************ 00:18:04.974 START TEST raid_write_error_test 00:18:04.974 ************************************ 00:18:04.974 22:46:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 write 00:18:04.974 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.w4gwHbpZBq 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2759135 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2759135 /var/tmp/spdk-raid.sock 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@829 -- # '[' -z 2759135 ']' 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:04.975 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:04.975 22:46:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:05.234 [2024-07-15 22:46:49.890256] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:18:05.234 [2024-07-15 22:46:49.890315] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2759135 ] 00:18:05.234 [2024-07-15 22:46:50.004176] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:05.234 [2024-07-15 22:46:50.114229] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:05.492 [2024-07-15 22:46:50.171165] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:05.492 [2024-07-15 22:46:50.171201] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:06.059 22:46:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:06.059 22:46:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:06.059 22:46:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:06.059 
22:46:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:06.318 BaseBdev1_malloc 00:18:06.318 22:46:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:06.576 true 00:18:06.576 22:46:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:06.835 [2024-07-15 22:46:51.619348] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:06.835 [2024-07-15 22:46:51.619397] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:06.835 [2024-07-15 22:46:51.619417] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10090d0 00:18:06.835 [2024-07-15 22:46:51.619430] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:06.835 [2024-07-15 22:46:51.621213] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:06.835 [2024-07-15 22:46:51.621248] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:06.835 BaseBdev1 00:18:06.835 22:46:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:06.835 22:46:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:07.094 BaseBdev2_malloc 00:18:07.094 22:46:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_error_create BaseBdev2_malloc 00:18:07.353 true 00:18:07.353 22:46:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:07.610 [2024-07-15 22:46:52.357858] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:07.610 [2024-07-15 22:46:52.357904] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:07.610 [2024-07-15 22:46:52.357922] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x100d910 00:18:07.610 [2024-07-15 22:46:52.357942] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:07.610 [2024-07-15 22:46:52.359331] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:07.610 [2024-07-15 22:46:52.359360] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:07.610 BaseBdev2 00:18:07.610 22:46:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:07.610 22:46:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:07.868 BaseBdev3_malloc 00:18:07.869 22:46:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:08.435 true 00:18:08.435 22:46:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:08.694 [2024-07-15 22:46:53.377093] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 
00:18:08.694 [2024-07-15 22:46:53.377141] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:08.694 [2024-07-15 22:46:53.377161] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x100fbd0 00:18:08.694 [2024-07-15 22:46:53.377175] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:08.694 [2024-07-15 22:46:53.378627] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:08.694 [2024-07-15 22:46:53.378658] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:08.694 BaseBdev3 00:18:08.694 22:46:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:18:08.952 [2024-07-15 22:46:53.621761] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:08.952 [2024-07-15 22:46:53.623008] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:08.952 [2024-07-15 22:46:53.623079] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:08.952 [2024-07-15 22:46:53.623290] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1011280 00:18:08.952 [2024-07-15 22:46:53.623302] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:08.952 [2024-07-15 22:46:53.623495] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1010e20 00:18:08.952 [2024-07-15 22:46:53.623648] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1011280 00:18:08.952 [2024-07-15 22:46:53.623659] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1011280 00:18:08.952 [2024-07-15 22:46:53.623767] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:18:08.952 22:46:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:08.952 22:46:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:08.952 22:46:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:08.952 22:46:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:08.952 22:46:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:08.952 22:46:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:08.952 22:46:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:08.952 22:46:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:08.952 22:46:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:08.952 22:46:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:08.952 22:46:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:08.952 22:46:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:09.211 22:46:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:09.211 "name": "raid_bdev1", 00:18:09.211 "uuid": "858fc6e5-5370-49f7-8a6e-4072536fedc5", 00:18:09.211 "strip_size_kb": 0, 00:18:09.211 "state": "online", 00:18:09.211 "raid_level": "raid1", 00:18:09.211 "superblock": true, 00:18:09.211 "num_base_bdevs": 3, 00:18:09.211 "num_base_bdevs_discovered": 3, 00:18:09.211 "num_base_bdevs_operational": 3, 00:18:09.211 "base_bdevs_list": [ 00:18:09.211 { 00:18:09.211 "name": "BaseBdev1", 
00:18:09.211 "uuid": "779e953f-8e17-537b-bf10-0e4958c7104e", 00:18:09.211 "is_configured": true, 00:18:09.211 "data_offset": 2048, 00:18:09.211 "data_size": 63488 00:18:09.211 }, 00:18:09.211 { 00:18:09.211 "name": "BaseBdev2", 00:18:09.211 "uuid": "b48da194-1c4f-59f4-a512-63da6af37798", 00:18:09.211 "is_configured": true, 00:18:09.211 "data_offset": 2048, 00:18:09.211 "data_size": 63488 00:18:09.211 }, 00:18:09.211 { 00:18:09.211 "name": "BaseBdev3", 00:18:09.211 "uuid": "10f9e6b0-43d8-59de-9082-2813f6cbe5e1", 00:18:09.211 "is_configured": true, 00:18:09.211 "data_offset": 2048, 00:18:09.211 "data_size": 63488 00:18:09.211 } 00:18:09.211 ] 00:18:09.211 }' 00:18:09.211 22:46:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:09.211 22:46:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:09.777 22:46:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:18:09.777 22:46:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:09.777 [2024-07-15 22:46:54.644760] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe5ee00 00:18:10.714 22:46:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:18:10.974 [2024-07-15 22:46:55.769138] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:18:10.974 [2024-07-15 22:46:55.769202] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:10.974 [2024-07-15 22:46:55.769397] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xe5ee00 00:18:10.974 22:46:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # 
local expected_num_base_bdevs 00:18:10.974 22:46:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:18:10.974 22:46:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:18:10.974 22:46:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:18:10.974 22:46:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:10.974 22:46:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:10.974 22:46:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:10.974 22:46:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:10.974 22:46:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:10.974 22:46:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:10.974 22:46:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:10.974 22:46:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:10.974 22:46:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:10.974 22:46:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:10.974 22:46:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:10.974 22:46:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:11.233 22:46:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:11.233 "name": "raid_bdev1", 00:18:11.233 "uuid": "858fc6e5-5370-49f7-8a6e-4072536fedc5", 
00:18:11.233 "strip_size_kb": 0, 00:18:11.233 "state": "online", 00:18:11.233 "raid_level": "raid1", 00:18:11.233 "superblock": true, 00:18:11.233 "num_base_bdevs": 3, 00:18:11.233 "num_base_bdevs_discovered": 2, 00:18:11.233 "num_base_bdevs_operational": 2, 00:18:11.233 "base_bdevs_list": [ 00:18:11.233 { 00:18:11.233 "name": null, 00:18:11.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:11.233 "is_configured": false, 00:18:11.233 "data_offset": 2048, 00:18:11.233 "data_size": 63488 00:18:11.233 }, 00:18:11.233 { 00:18:11.233 "name": "BaseBdev2", 00:18:11.233 "uuid": "b48da194-1c4f-59f4-a512-63da6af37798", 00:18:11.233 "is_configured": true, 00:18:11.233 "data_offset": 2048, 00:18:11.233 "data_size": 63488 00:18:11.233 }, 00:18:11.233 { 00:18:11.233 "name": "BaseBdev3", 00:18:11.233 "uuid": "10f9e6b0-43d8-59de-9082-2813f6cbe5e1", 00:18:11.233 "is_configured": true, 00:18:11.233 "data_offset": 2048, 00:18:11.233 "data_size": 63488 00:18:11.233 } 00:18:11.233 ] 00:18:11.233 }' 00:18:11.233 22:46:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:11.233 22:46:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:12.171 22:46:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:12.739 [2024-07-15 22:46:57.403239] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:12.739 [2024-07-15 22:46:57.403283] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:12.739 [2024-07-15 22:46:57.406431] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:12.739 [2024-07-15 22:46:57.406465] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:12.739 [2024-07-15 22:46:57.406539] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 
0, going to free all in destruct 00:18:12.739 [2024-07-15 22:46:57.406552] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1011280 name raid_bdev1, state offline 00:18:12.739 0 00:18:12.739 22:46:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2759135 00:18:12.739 22:46:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2759135 ']' 00:18:12.739 22:46:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2759135 00:18:12.739 22:46:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:18:12.739 22:46:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:12.739 22:46:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2759135 00:18:12.739 22:46:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:12.739 22:46:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:12.739 22:46:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2759135' 00:18:12.739 killing process with pid 2759135 00:18:12.739 22:46:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2759135 00:18:12.739 [2024-07-15 22:46:57.488269] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:12.739 22:46:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2759135 00:18:12.739 [2024-07-15 22:46:57.512015] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:12.999 22:46:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.w4gwHbpZBq 00:18:12.999 22:46:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:12.999 22:46:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk 
'{print $6}' 00:18:12.999 22:46:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:18:12.999 22:46:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:18:12.999 22:46:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:12.999 22:46:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:12.999 22:46:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:18:12.999 00:18:12.999 real 0m7.927s 00:18:12.999 user 0m12.863s 00:18:12.999 sys 0m1.314s 00:18:12.999 22:46:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:12.999 22:46:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:12.999 ************************************ 00:18:12.999 END TEST raid_write_error_test 00:18:12.999 ************************************ 00:18:12.999 22:46:57 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:12.999 22:46:57 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:18:12.999 22:46:57 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:18:12.999 22:46:57 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:18:12.999 22:46:57 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:12.999 22:46:57 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:12.999 22:46:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:12.999 ************************************ 00:18:12.999 START TEST raid_state_function_test 00:18:12.999 ************************************ 00:18:12.999 22:46:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 false 00:18:12.999 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:18:12.999 
22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:18:12.999 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:18:12.999 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:12.999 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:12.999 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:12.999 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:12.999 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:12.999 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:12.999 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 
-- # local base_bdevs 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2760245 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2760245' 00:18:13.000 Process raid pid: 2760245 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2760245 /var/tmp/spdk-raid.sock 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2760245 ']' 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 
00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:13.000 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:13.000 22:46:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:13.000 [2024-07-15 22:46:57.896111] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:18:13.000 [2024-07-15 22:46:57.896177] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:13.336 [2024-07-15 22:46:58.027565] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:13.336 [2024-07-15 22:46:58.130034] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:13.336 [2024-07-15 22:46:58.198936] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:13.336 [2024-07-15 22:46:58.198970] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:14.282 22:46:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:14.282 22:46:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:18:14.282 22:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:14.850 [2024-07-15 22:46:59.580378] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:14.850 [2024-07-15 
22:46:59.580423] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:14.850 [2024-07-15 22:46:59.580434] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:14.850 [2024-07-15 22:46:59.580446] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:14.850 [2024-07-15 22:46:59.580455] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:14.850 [2024-07-15 22:46:59.580466] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:14.850 [2024-07-15 22:46:59.580480] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:14.850 [2024-07-15 22:46:59.580491] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:14.850 22:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:14.850 22:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:14.850 22:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:14.850 22:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:14.850 22:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:14.850 22:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:14.850 22:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:14.850 22:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:14.850 22:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:14.850 22:46:59 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:14.850 22:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:14.850 22:46:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:15.418 22:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:15.418 "name": "Existed_Raid", 00:18:15.418 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:15.418 "strip_size_kb": 64, 00:18:15.418 "state": "configuring", 00:18:15.418 "raid_level": "raid0", 00:18:15.418 "superblock": false, 00:18:15.418 "num_base_bdevs": 4, 00:18:15.418 "num_base_bdevs_discovered": 0, 00:18:15.418 "num_base_bdevs_operational": 4, 00:18:15.418 "base_bdevs_list": [ 00:18:15.418 { 00:18:15.418 "name": "BaseBdev1", 00:18:15.418 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:15.418 "is_configured": false, 00:18:15.418 "data_offset": 0, 00:18:15.418 "data_size": 0 00:18:15.418 }, 00:18:15.418 { 00:18:15.418 "name": "BaseBdev2", 00:18:15.418 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:15.418 "is_configured": false, 00:18:15.418 "data_offset": 0, 00:18:15.418 "data_size": 0 00:18:15.418 }, 00:18:15.418 { 00:18:15.418 "name": "BaseBdev3", 00:18:15.418 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:15.418 "is_configured": false, 00:18:15.418 "data_offset": 0, 00:18:15.418 "data_size": 0 00:18:15.418 }, 00:18:15.418 { 00:18:15.418 "name": "BaseBdev4", 00:18:15.418 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:15.418 "is_configured": false, 00:18:15.418 "data_offset": 0, 00:18:15.418 "data_size": 0 00:18:15.418 } 00:18:15.418 ] 00:18:15.418 }' 00:18:15.418 22:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:15.418 22:47:00 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:18:15.986 22:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:16.245 [2024-07-15 22:47:00.963883] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:16.245 [2024-07-15 22:47:00.963918] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbdeaa0 name Existed_Raid, state configuring 00:18:16.245 22:47:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:16.504 [2024-07-15 22:47:01.212550] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:16.504 [2024-07-15 22:47:01.212582] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:16.504 [2024-07-15 22:47:01.212592] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:16.504 [2024-07-15 22:47:01.212604] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:16.504 [2024-07-15 22:47:01.212613] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:16.504 [2024-07-15 22:47:01.212637] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:16.504 [2024-07-15 22:47:01.212647] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:16.504 [2024-07-15 22:47:01.212658] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:16.504 22:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:16.764 [2024-07-15 22:47:01.471113] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:16.764 BaseBdev1 00:18:16.764 22:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:16.764 22:47:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:16.764 22:47:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:16.764 22:47:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:16.764 22:47:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:16.764 22:47:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:16.764 22:47:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:17.023 22:47:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:17.283 [ 00:18:17.283 { 00:18:17.283 "name": "BaseBdev1", 00:18:17.283 "aliases": [ 00:18:17.283 "843202d2-c9fb-4e3b-bc4d-2f5b22a9eed4" 00:18:17.283 ], 00:18:17.283 "product_name": "Malloc disk", 00:18:17.283 "block_size": 512, 00:18:17.283 "num_blocks": 65536, 00:18:17.283 "uuid": "843202d2-c9fb-4e3b-bc4d-2f5b22a9eed4", 00:18:17.283 "assigned_rate_limits": { 00:18:17.283 "rw_ios_per_sec": 0, 00:18:17.283 "rw_mbytes_per_sec": 0, 00:18:17.283 "r_mbytes_per_sec": 0, 00:18:17.283 "w_mbytes_per_sec": 0 00:18:17.283 }, 00:18:17.283 "claimed": true, 00:18:17.283 "claim_type": "exclusive_write", 00:18:17.283 "zoned": false, 00:18:17.283 "supported_io_types": { 00:18:17.283 "read": 
true, 00:18:17.283 "write": true, 00:18:17.283 "unmap": true, 00:18:17.283 "flush": true, 00:18:17.283 "reset": true, 00:18:17.283 "nvme_admin": false, 00:18:17.283 "nvme_io": false, 00:18:17.283 "nvme_io_md": false, 00:18:17.283 "write_zeroes": true, 00:18:17.283 "zcopy": true, 00:18:17.283 "get_zone_info": false, 00:18:17.283 "zone_management": false, 00:18:17.283 "zone_append": false, 00:18:17.283 "compare": false, 00:18:17.283 "compare_and_write": false, 00:18:17.283 "abort": true, 00:18:17.283 "seek_hole": false, 00:18:17.283 "seek_data": false, 00:18:17.283 "copy": true, 00:18:17.283 "nvme_iov_md": false 00:18:17.283 }, 00:18:17.283 "memory_domains": [ 00:18:17.283 { 00:18:17.283 "dma_device_id": "system", 00:18:17.283 "dma_device_type": 1 00:18:17.283 }, 00:18:17.283 { 00:18:17.283 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:17.283 "dma_device_type": 2 00:18:17.283 } 00:18:17.283 ], 00:18:17.283 "driver_specific": {} 00:18:17.283 } 00:18:17.283 ] 00:18:17.283 22:47:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:17.283 22:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:17.283 22:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:17.283 22:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:17.283 22:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:17.283 22:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:17.283 22:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:17.283 22:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:17.283 22:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # 
local num_base_bdevs 00:18:17.283 22:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:17.283 22:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:17.283 22:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:17.283 22:47:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:17.541 22:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:17.541 "name": "Existed_Raid", 00:18:17.541 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:17.541 "strip_size_kb": 64, 00:18:17.541 "state": "configuring", 00:18:17.541 "raid_level": "raid0", 00:18:17.541 "superblock": false, 00:18:17.541 "num_base_bdevs": 4, 00:18:17.541 "num_base_bdevs_discovered": 1, 00:18:17.541 "num_base_bdevs_operational": 4, 00:18:17.541 "base_bdevs_list": [ 00:18:17.541 { 00:18:17.541 "name": "BaseBdev1", 00:18:17.541 "uuid": "843202d2-c9fb-4e3b-bc4d-2f5b22a9eed4", 00:18:17.541 "is_configured": true, 00:18:17.541 "data_offset": 0, 00:18:17.541 "data_size": 65536 00:18:17.541 }, 00:18:17.541 { 00:18:17.541 "name": "BaseBdev2", 00:18:17.541 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:17.541 "is_configured": false, 00:18:17.541 "data_offset": 0, 00:18:17.541 "data_size": 0 00:18:17.541 }, 00:18:17.541 { 00:18:17.541 "name": "BaseBdev3", 00:18:17.541 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:17.541 "is_configured": false, 00:18:17.541 "data_offset": 0, 00:18:17.541 "data_size": 0 00:18:17.541 }, 00:18:17.541 { 00:18:17.541 "name": "BaseBdev4", 00:18:17.541 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:17.541 "is_configured": false, 00:18:17.541 "data_offset": 0, 00:18:17.541 "data_size": 0 00:18:17.541 } 00:18:17.541 ] 00:18:17.541 }' 
00:18:17.541 22:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:17.541 22:47:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:18.108 22:47:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:18.366 [2024-07-15 22:47:03.023203] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:18.366 [2024-07-15 22:47:03.023244] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbde310 name Existed_Raid, state configuring 00:18:18.366 22:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:18.366 [2024-07-15 22:47:03.271894] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:18.366 [2024-07-15 22:47:03.273333] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:18.366 [2024-07-15 22:47:03.273364] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:18.366 [2024-07-15 22:47:03.273375] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:18.366 [2024-07-15 22:47:03.273387] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:18.366 [2024-07-15 22:47:03.273396] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:18.366 [2024-07-15 22:47:03.273407] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:18.624 22:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:18:18.624 22:47:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:18.624 22:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:18.624 22:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:18.624 22:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:18.624 22:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:18.624 22:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:18.624 22:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:18.624 22:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:18.624 22:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:18.624 22:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:18.624 22:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:18.624 22:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:18.624 22:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:18.882 22:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:18.882 "name": "Existed_Raid", 00:18:18.882 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:18.882 "strip_size_kb": 64, 00:18:18.882 "state": "configuring", 00:18:18.882 "raid_level": "raid0", 00:18:18.882 "superblock": false, 00:18:18.882 "num_base_bdevs": 4, 00:18:18.882 
"num_base_bdevs_discovered": 1, 00:18:18.882 "num_base_bdevs_operational": 4, 00:18:18.882 "base_bdevs_list": [ 00:18:18.882 { 00:18:18.882 "name": "BaseBdev1", 00:18:18.882 "uuid": "843202d2-c9fb-4e3b-bc4d-2f5b22a9eed4", 00:18:18.882 "is_configured": true, 00:18:18.882 "data_offset": 0, 00:18:18.882 "data_size": 65536 00:18:18.882 }, 00:18:18.882 { 00:18:18.882 "name": "BaseBdev2", 00:18:18.882 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:18.882 "is_configured": false, 00:18:18.882 "data_offset": 0, 00:18:18.882 "data_size": 0 00:18:18.882 }, 00:18:18.882 { 00:18:18.882 "name": "BaseBdev3", 00:18:18.882 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:18.882 "is_configured": false, 00:18:18.882 "data_offset": 0, 00:18:18.882 "data_size": 0 00:18:18.882 }, 00:18:18.882 { 00:18:18.882 "name": "BaseBdev4", 00:18:18.882 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:18.882 "is_configured": false, 00:18:18.882 "data_offset": 0, 00:18:18.882 "data_size": 0 00:18:18.882 } 00:18:18.882 ] 00:18:18.882 }' 00:18:18.882 22:47:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:18.882 22:47:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:19.457 22:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:19.715 [2024-07-15 22:47:04.370180] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:19.715 BaseBdev2 00:18:19.715 22:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:19.715 22:47:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:19.715 22:47:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:19.715 22:47:04 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:19.715 22:47:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:19.715 22:47:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:19.715 22:47:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:19.974 22:47:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:19.974 [ 00:18:19.974 { 00:18:19.974 "name": "BaseBdev2", 00:18:19.974 "aliases": [ 00:18:19.974 "42f56c6a-9b80-4fdd-a980-17631c601dc7" 00:18:19.974 ], 00:18:19.974 "product_name": "Malloc disk", 00:18:19.974 "block_size": 512, 00:18:19.974 "num_blocks": 65536, 00:18:19.974 "uuid": "42f56c6a-9b80-4fdd-a980-17631c601dc7", 00:18:19.974 "assigned_rate_limits": { 00:18:19.974 "rw_ios_per_sec": 0, 00:18:19.974 "rw_mbytes_per_sec": 0, 00:18:19.974 "r_mbytes_per_sec": 0, 00:18:19.974 "w_mbytes_per_sec": 0 00:18:19.974 }, 00:18:19.974 "claimed": true, 00:18:19.974 "claim_type": "exclusive_write", 00:18:19.974 "zoned": false, 00:18:19.974 "supported_io_types": { 00:18:19.974 "read": true, 00:18:19.974 "write": true, 00:18:19.974 "unmap": true, 00:18:19.974 "flush": true, 00:18:19.974 "reset": true, 00:18:19.974 "nvme_admin": false, 00:18:19.974 "nvme_io": false, 00:18:19.974 "nvme_io_md": false, 00:18:19.974 "write_zeroes": true, 00:18:19.974 "zcopy": true, 00:18:19.974 "get_zone_info": false, 00:18:19.974 "zone_management": false, 00:18:19.974 "zone_append": false, 00:18:19.974 "compare": false, 00:18:19.974 "compare_and_write": false, 00:18:19.974 "abort": true, 00:18:19.974 "seek_hole": false, 00:18:19.974 "seek_data": false, 00:18:19.974 "copy": 
true, 00:18:19.974 "nvme_iov_md": false 00:18:19.974 }, 00:18:19.974 "memory_domains": [ 00:18:19.974 { 00:18:19.974 "dma_device_id": "system", 00:18:19.974 "dma_device_type": 1 00:18:19.974 }, 00:18:19.974 { 00:18:19.974 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.974 "dma_device_type": 2 00:18:19.974 } 00:18:19.974 ], 00:18:19.974 "driver_specific": {} 00:18:19.974 } 00:18:19.974 ] 00:18:20.233 22:47:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:20.233 22:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:20.233 22:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:20.233 22:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:20.233 22:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:20.233 22:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:20.233 22:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:20.233 22:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:20.233 22:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:20.233 22:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:20.233 22:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:20.233 22:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:20.233 22:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:20.233 22:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:20.233 22:47:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:20.233 22:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:20.233 "name": "Existed_Raid", 00:18:20.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:20.233 "strip_size_kb": 64, 00:18:20.233 "state": "configuring", 00:18:20.233 "raid_level": "raid0", 00:18:20.233 "superblock": false, 00:18:20.233 "num_base_bdevs": 4, 00:18:20.233 "num_base_bdevs_discovered": 2, 00:18:20.233 "num_base_bdevs_operational": 4, 00:18:20.233 "base_bdevs_list": [ 00:18:20.233 { 00:18:20.233 "name": "BaseBdev1", 00:18:20.233 "uuid": "843202d2-c9fb-4e3b-bc4d-2f5b22a9eed4", 00:18:20.233 "is_configured": true, 00:18:20.233 "data_offset": 0, 00:18:20.233 "data_size": 65536 00:18:20.233 }, 00:18:20.233 { 00:18:20.233 "name": "BaseBdev2", 00:18:20.233 "uuid": "42f56c6a-9b80-4fdd-a980-17631c601dc7", 00:18:20.233 "is_configured": true, 00:18:20.233 "data_offset": 0, 00:18:20.233 "data_size": 65536 00:18:20.233 }, 00:18:20.233 { 00:18:20.233 "name": "BaseBdev3", 00:18:20.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:20.233 "is_configured": false, 00:18:20.233 "data_offset": 0, 00:18:20.233 "data_size": 0 00:18:20.233 }, 00:18:20.233 { 00:18:20.233 "name": "BaseBdev4", 00:18:20.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:20.233 "is_configured": false, 00:18:20.233 "data_offset": 0, 00:18:20.233 "data_size": 0 00:18:20.233 } 00:18:20.233 ] 00:18:20.233 }' 00:18:20.233 22:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:20.233 22:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:21.163 22:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:21.163 [2024-07-15 22:47:05.965862] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:21.163 BaseBdev3 00:18:21.163 22:47:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:21.163 22:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:21.163 22:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:21.163 22:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:21.163 22:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:21.163 22:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:21.163 22:47:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:21.421 22:47:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:21.986 [ 00:18:21.986 { 00:18:21.986 "name": "BaseBdev3", 00:18:21.986 "aliases": [ 00:18:21.986 "040d481d-c9f0-40b8-b36d-57e76d984ebf" 00:18:21.986 ], 00:18:21.986 "product_name": "Malloc disk", 00:18:21.986 "block_size": 512, 00:18:21.986 "num_blocks": 65536, 00:18:21.986 "uuid": "040d481d-c9f0-40b8-b36d-57e76d984ebf", 00:18:21.986 "assigned_rate_limits": { 00:18:21.986 "rw_ios_per_sec": 0, 00:18:21.986 "rw_mbytes_per_sec": 0, 00:18:21.986 "r_mbytes_per_sec": 0, 00:18:21.986 "w_mbytes_per_sec": 0 00:18:21.986 }, 00:18:21.986 "claimed": true, 00:18:21.986 "claim_type": "exclusive_write", 00:18:21.986 "zoned": 
false, 00:18:21.986 "supported_io_types": { 00:18:21.986 "read": true, 00:18:21.986 "write": true, 00:18:21.986 "unmap": true, 00:18:21.986 "flush": true, 00:18:21.986 "reset": true, 00:18:21.986 "nvme_admin": false, 00:18:21.986 "nvme_io": false, 00:18:21.986 "nvme_io_md": false, 00:18:21.986 "write_zeroes": true, 00:18:21.986 "zcopy": true, 00:18:21.986 "get_zone_info": false, 00:18:21.986 "zone_management": false, 00:18:21.986 "zone_append": false, 00:18:21.986 "compare": false, 00:18:21.986 "compare_and_write": false, 00:18:21.986 "abort": true, 00:18:21.986 "seek_hole": false, 00:18:21.986 "seek_data": false, 00:18:21.986 "copy": true, 00:18:21.986 "nvme_iov_md": false 00:18:21.986 }, 00:18:21.986 "memory_domains": [ 00:18:21.986 { 00:18:21.986 "dma_device_id": "system", 00:18:21.986 "dma_device_type": 1 00:18:21.986 }, 00:18:21.986 { 00:18:21.986 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:21.986 "dma_device_type": 2 00:18:21.986 } 00:18:21.986 ], 00:18:21.986 "driver_specific": {} 00:18:21.986 } 00:18:21.986 ] 00:18:21.986 22:47:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:21.986 22:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:21.986 22:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:21.986 22:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:21.986 22:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:21.986 22:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:21.986 22:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:21.986 22:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:21.986 22:47:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:21.986 22:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:21.986 22:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:21.986 22:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:21.986 22:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:21.986 22:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:21.986 22:47:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:22.244 22:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:22.244 "name": "Existed_Raid", 00:18:22.244 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:22.244 "strip_size_kb": 64, 00:18:22.244 "state": "configuring", 00:18:22.244 "raid_level": "raid0", 00:18:22.244 "superblock": false, 00:18:22.244 "num_base_bdevs": 4, 00:18:22.244 "num_base_bdevs_discovered": 3, 00:18:22.244 "num_base_bdevs_operational": 4, 00:18:22.244 "base_bdevs_list": [ 00:18:22.244 { 00:18:22.244 "name": "BaseBdev1", 00:18:22.244 "uuid": "843202d2-c9fb-4e3b-bc4d-2f5b22a9eed4", 00:18:22.244 "is_configured": true, 00:18:22.244 "data_offset": 0, 00:18:22.244 "data_size": 65536 00:18:22.244 }, 00:18:22.244 { 00:18:22.244 "name": "BaseBdev2", 00:18:22.244 "uuid": "42f56c6a-9b80-4fdd-a980-17631c601dc7", 00:18:22.244 "is_configured": true, 00:18:22.244 "data_offset": 0, 00:18:22.244 "data_size": 65536 00:18:22.244 }, 00:18:22.244 { 00:18:22.244 "name": "BaseBdev3", 00:18:22.244 "uuid": "040d481d-c9f0-40b8-b36d-57e76d984ebf", 00:18:22.244 "is_configured": true, 00:18:22.244 "data_offset": 0, 
00:18:22.244 "data_size": 65536 00:18:22.244 }, 00:18:22.244 { 00:18:22.244 "name": "BaseBdev4", 00:18:22.244 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:22.244 "is_configured": false, 00:18:22.244 "data_offset": 0, 00:18:22.244 "data_size": 0 00:18:22.244 } 00:18:22.244 ] 00:18:22.244 }' 00:18:22.244 22:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:22.244 22:47:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:22.811 22:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:23.070 [2024-07-15 22:47:07.834279] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:23.070 [2024-07-15 22:47:07.834318] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xbdf350 00:18:23.070 [2024-07-15 22:47:07.834327] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:18:23.070 [2024-07-15 22:47:07.834578] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbdf020 00:18:23.070 [2024-07-15 22:47:07.834698] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbdf350 00:18:23.070 [2024-07-15 22:47:07.834708] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xbdf350 00:18:23.070 [2024-07-15 22:47:07.834873] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:23.070 BaseBdev4 00:18:23.070 22:47:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:18:23.070 22:47:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:23.070 22:47:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:23.070 22:47:07 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:23.070 22:47:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:23.070 22:47:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:23.070 22:47:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:23.328 22:47:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:23.587 [ 00:18:23.587 { 00:18:23.587 "name": "BaseBdev4", 00:18:23.587 "aliases": [ 00:18:23.587 "f7f129ea-b5f4-4305-94ca-4bb10b922a49" 00:18:23.587 ], 00:18:23.587 "product_name": "Malloc disk", 00:18:23.587 "block_size": 512, 00:18:23.587 "num_blocks": 65536, 00:18:23.587 "uuid": "f7f129ea-b5f4-4305-94ca-4bb10b922a49", 00:18:23.587 "assigned_rate_limits": { 00:18:23.587 "rw_ios_per_sec": 0, 00:18:23.587 "rw_mbytes_per_sec": 0, 00:18:23.587 "r_mbytes_per_sec": 0, 00:18:23.587 "w_mbytes_per_sec": 0 00:18:23.587 }, 00:18:23.587 "claimed": true, 00:18:23.587 "claim_type": "exclusive_write", 00:18:23.587 "zoned": false, 00:18:23.587 "supported_io_types": { 00:18:23.587 "read": true, 00:18:23.587 "write": true, 00:18:23.587 "unmap": true, 00:18:23.587 "flush": true, 00:18:23.587 "reset": true, 00:18:23.587 "nvme_admin": false, 00:18:23.587 "nvme_io": false, 00:18:23.587 "nvme_io_md": false, 00:18:23.587 "write_zeroes": true, 00:18:23.587 "zcopy": true, 00:18:23.587 "get_zone_info": false, 00:18:23.587 "zone_management": false, 00:18:23.587 "zone_append": false, 00:18:23.587 "compare": false, 00:18:23.587 "compare_and_write": false, 00:18:23.587 "abort": true, 00:18:23.587 "seek_hole": false, 00:18:23.587 "seek_data": false, 00:18:23.587 "copy": 
true, 00:18:23.587 "nvme_iov_md": false 00:18:23.587 }, 00:18:23.587 "memory_domains": [ 00:18:23.587 { 00:18:23.587 "dma_device_id": "system", 00:18:23.587 "dma_device_type": 1 00:18:23.587 }, 00:18:23.587 { 00:18:23.587 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:23.587 "dma_device_type": 2 00:18:23.587 } 00:18:23.587 ], 00:18:23.587 "driver_specific": {} 00:18:23.587 } 00:18:23.587 ] 00:18:23.587 22:47:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:23.587 22:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:23.587 22:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:23.587 22:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:23.587 22:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:23.587 22:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:23.588 22:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:23.588 22:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:23.588 22:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:23.588 22:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:23.588 22:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:23.588 22:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:23.588 22:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:23.588 22:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:23.588 22:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:23.846 22:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:23.846 "name": "Existed_Raid", 00:18:23.846 "uuid": "53aaa096-29eb-4020-b4ed-02aaeba99553", 00:18:23.846 "strip_size_kb": 64, 00:18:23.846 "state": "online", 00:18:23.846 "raid_level": "raid0", 00:18:23.846 "superblock": false, 00:18:23.846 "num_base_bdevs": 4, 00:18:23.846 "num_base_bdevs_discovered": 4, 00:18:23.846 "num_base_bdevs_operational": 4, 00:18:23.846 "base_bdevs_list": [ 00:18:23.846 { 00:18:23.846 "name": "BaseBdev1", 00:18:23.846 "uuid": "843202d2-c9fb-4e3b-bc4d-2f5b22a9eed4", 00:18:23.846 "is_configured": true, 00:18:23.846 "data_offset": 0, 00:18:23.846 "data_size": 65536 00:18:23.846 }, 00:18:23.846 { 00:18:23.846 "name": "BaseBdev2", 00:18:23.846 "uuid": "42f56c6a-9b80-4fdd-a980-17631c601dc7", 00:18:23.846 "is_configured": true, 00:18:23.846 "data_offset": 0, 00:18:23.846 "data_size": 65536 00:18:23.846 }, 00:18:23.846 { 00:18:23.846 "name": "BaseBdev3", 00:18:23.846 "uuid": "040d481d-c9f0-40b8-b36d-57e76d984ebf", 00:18:23.846 "is_configured": true, 00:18:23.846 "data_offset": 0, 00:18:23.846 "data_size": 65536 00:18:23.846 }, 00:18:23.846 { 00:18:23.846 "name": "BaseBdev4", 00:18:23.846 "uuid": "f7f129ea-b5f4-4305-94ca-4bb10b922a49", 00:18:23.846 "is_configured": true, 00:18:23.846 "data_offset": 0, 00:18:23.846 "data_size": 65536 00:18:23.846 } 00:18:23.846 ] 00:18:23.846 }' 00:18:23.846 22:47:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:23.846 22:47:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:24.413 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties 
Existed_Raid 00:18:24.413 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:24.413 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:24.413 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:24.413 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:24.413 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:24.413 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:24.413 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:24.413 [2024-07-15 22:47:09.290491] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:24.413 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:24.413 "name": "Existed_Raid", 00:18:24.413 "aliases": [ 00:18:24.413 "53aaa096-29eb-4020-b4ed-02aaeba99553" 00:18:24.413 ], 00:18:24.413 "product_name": "Raid Volume", 00:18:24.413 "block_size": 512, 00:18:24.413 "num_blocks": 262144, 00:18:24.413 "uuid": "53aaa096-29eb-4020-b4ed-02aaeba99553", 00:18:24.413 "assigned_rate_limits": { 00:18:24.413 "rw_ios_per_sec": 0, 00:18:24.413 "rw_mbytes_per_sec": 0, 00:18:24.413 "r_mbytes_per_sec": 0, 00:18:24.413 "w_mbytes_per_sec": 0 00:18:24.413 }, 00:18:24.413 "claimed": false, 00:18:24.413 "zoned": false, 00:18:24.413 "supported_io_types": { 00:18:24.413 "read": true, 00:18:24.413 "write": true, 00:18:24.413 "unmap": true, 00:18:24.413 "flush": true, 00:18:24.413 "reset": true, 00:18:24.413 "nvme_admin": false, 00:18:24.413 "nvme_io": false, 00:18:24.413 "nvme_io_md": false, 00:18:24.413 "write_zeroes": true, 00:18:24.413 "zcopy": false, 00:18:24.413 
"get_zone_info": false, 00:18:24.413 "zone_management": false, 00:18:24.413 "zone_append": false, 00:18:24.413 "compare": false, 00:18:24.413 "compare_and_write": false, 00:18:24.413 "abort": false, 00:18:24.413 "seek_hole": false, 00:18:24.413 "seek_data": false, 00:18:24.413 "copy": false, 00:18:24.413 "nvme_iov_md": false 00:18:24.413 }, 00:18:24.413 "memory_domains": [ 00:18:24.413 { 00:18:24.413 "dma_device_id": "system", 00:18:24.413 "dma_device_type": 1 00:18:24.413 }, 00:18:24.413 { 00:18:24.413 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:24.413 "dma_device_type": 2 00:18:24.413 }, 00:18:24.413 { 00:18:24.413 "dma_device_id": "system", 00:18:24.413 "dma_device_type": 1 00:18:24.413 }, 00:18:24.413 { 00:18:24.413 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:24.413 "dma_device_type": 2 00:18:24.413 }, 00:18:24.413 { 00:18:24.413 "dma_device_id": "system", 00:18:24.413 "dma_device_type": 1 00:18:24.413 }, 00:18:24.413 { 00:18:24.413 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:24.413 "dma_device_type": 2 00:18:24.413 }, 00:18:24.413 { 00:18:24.413 "dma_device_id": "system", 00:18:24.413 "dma_device_type": 1 00:18:24.413 }, 00:18:24.413 { 00:18:24.413 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:24.413 "dma_device_type": 2 00:18:24.413 } 00:18:24.413 ], 00:18:24.413 "driver_specific": { 00:18:24.413 "raid": { 00:18:24.413 "uuid": "53aaa096-29eb-4020-b4ed-02aaeba99553", 00:18:24.413 "strip_size_kb": 64, 00:18:24.413 "state": "online", 00:18:24.413 "raid_level": "raid0", 00:18:24.413 "superblock": false, 00:18:24.413 "num_base_bdevs": 4, 00:18:24.413 "num_base_bdevs_discovered": 4, 00:18:24.413 "num_base_bdevs_operational": 4, 00:18:24.413 "base_bdevs_list": [ 00:18:24.413 { 00:18:24.413 "name": "BaseBdev1", 00:18:24.413 "uuid": "843202d2-c9fb-4e3b-bc4d-2f5b22a9eed4", 00:18:24.413 "is_configured": true, 00:18:24.413 "data_offset": 0, 00:18:24.413 "data_size": 65536 00:18:24.413 }, 00:18:24.413 { 00:18:24.413 "name": "BaseBdev2", 00:18:24.413 "uuid": 
"42f56c6a-9b80-4fdd-a980-17631c601dc7", 00:18:24.413 "is_configured": true, 00:18:24.413 "data_offset": 0, 00:18:24.413 "data_size": 65536 00:18:24.413 }, 00:18:24.413 { 00:18:24.413 "name": "BaseBdev3", 00:18:24.413 "uuid": "040d481d-c9f0-40b8-b36d-57e76d984ebf", 00:18:24.413 "is_configured": true, 00:18:24.413 "data_offset": 0, 00:18:24.413 "data_size": 65536 00:18:24.413 }, 00:18:24.413 { 00:18:24.413 "name": "BaseBdev4", 00:18:24.413 "uuid": "f7f129ea-b5f4-4305-94ca-4bb10b922a49", 00:18:24.413 "is_configured": true, 00:18:24.413 "data_offset": 0, 00:18:24.413 "data_size": 65536 00:18:24.413 } 00:18:24.413 ] 00:18:24.413 } 00:18:24.413 } 00:18:24.413 }' 00:18:24.413 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:24.672 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:24.672 BaseBdev2 00:18:24.672 BaseBdev3 00:18:24.672 BaseBdev4' 00:18:24.672 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:24.672 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:24.672 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:24.930 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:24.930 "name": "BaseBdev1", 00:18:24.930 "aliases": [ 00:18:24.930 "843202d2-c9fb-4e3b-bc4d-2f5b22a9eed4" 00:18:24.930 ], 00:18:24.930 "product_name": "Malloc disk", 00:18:24.930 "block_size": 512, 00:18:24.930 "num_blocks": 65536, 00:18:24.930 "uuid": "843202d2-c9fb-4e3b-bc4d-2f5b22a9eed4", 00:18:24.930 "assigned_rate_limits": { 00:18:24.930 "rw_ios_per_sec": 0, 00:18:24.930 "rw_mbytes_per_sec": 0, 00:18:24.930 "r_mbytes_per_sec": 0, 
00:18:24.930 "w_mbytes_per_sec": 0 00:18:24.930 }, 00:18:24.930 "claimed": true, 00:18:24.930 "claim_type": "exclusive_write", 00:18:24.930 "zoned": false, 00:18:24.930 "supported_io_types": { 00:18:24.930 "read": true, 00:18:24.930 "write": true, 00:18:24.930 "unmap": true, 00:18:24.930 "flush": true, 00:18:24.930 "reset": true, 00:18:24.930 "nvme_admin": false, 00:18:24.930 "nvme_io": false, 00:18:24.930 "nvme_io_md": false, 00:18:24.930 "write_zeroes": true, 00:18:24.930 "zcopy": true, 00:18:24.930 "get_zone_info": false, 00:18:24.930 "zone_management": false, 00:18:24.930 "zone_append": false, 00:18:24.930 "compare": false, 00:18:24.930 "compare_and_write": false, 00:18:24.930 "abort": true, 00:18:24.930 "seek_hole": false, 00:18:24.930 "seek_data": false, 00:18:24.930 "copy": true, 00:18:24.930 "nvme_iov_md": false 00:18:24.931 }, 00:18:24.931 "memory_domains": [ 00:18:24.931 { 00:18:24.931 "dma_device_id": "system", 00:18:24.931 "dma_device_type": 1 00:18:24.931 }, 00:18:24.931 { 00:18:24.931 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:24.931 "dma_device_type": 2 00:18:24.931 } 00:18:24.931 ], 00:18:24.931 "driver_specific": {} 00:18:24.931 }' 00:18:24.931 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:24.931 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:24.931 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:24.931 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:24.931 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:24.931 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:24.931 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:24.931 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:18:25.188 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:25.188 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:25.188 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:25.188 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:25.188 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:25.188 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:25.188 22:47:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:25.446 22:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:25.446 "name": "BaseBdev2", 00:18:25.446 "aliases": [ 00:18:25.446 "42f56c6a-9b80-4fdd-a980-17631c601dc7" 00:18:25.446 ], 00:18:25.446 "product_name": "Malloc disk", 00:18:25.446 "block_size": 512, 00:18:25.446 "num_blocks": 65536, 00:18:25.446 "uuid": "42f56c6a-9b80-4fdd-a980-17631c601dc7", 00:18:25.446 "assigned_rate_limits": { 00:18:25.446 "rw_ios_per_sec": 0, 00:18:25.446 "rw_mbytes_per_sec": 0, 00:18:25.446 "r_mbytes_per_sec": 0, 00:18:25.446 "w_mbytes_per_sec": 0 00:18:25.446 }, 00:18:25.446 "claimed": true, 00:18:25.446 "claim_type": "exclusive_write", 00:18:25.446 "zoned": false, 00:18:25.446 "supported_io_types": { 00:18:25.446 "read": true, 00:18:25.446 "write": true, 00:18:25.446 "unmap": true, 00:18:25.446 "flush": true, 00:18:25.446 "reset": true, 00:18:25.446 "nvme_admin": false, 00:18:25.446 "nvme_io": false, 00:18:25.446 "nvme_io_md": false, 00:18:25.446 "write_zeroes": true, 00:18:25.446 "zcopy": true, 00:18:25.446 "get_zone_info": false, 00:18:25.446 "zone_management": false, 00:18:25.446 "zone_append": false, 00:18:25.446 
"compare": false, 00:18:25.446 "compare_and_write": false, 00:18:25.446 "abort": true, 00:18:25.446 "seek_hole": false, 00:18:25.446 "seek_data": false, 00:18:25.446 "copy": true, 00:18:25.446 "nvme_iov_md": false 00:18:25.447 }, 00:18:25.447 "memory_domains": [ 00:18:25.447 { 00:18:25.447 "dma_device_id": "system", 00:18:25.447 "dma_device_type": 1 00:18:25.447 }, 00:18:25.447 { 00:18:25.447 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:25.447 "dma_device_type": 2 00:18:25.447 } 00:18:25.447 ], 00:18:25.447 "driver_specific": {} 00:18:25.447 }' 00:18:25.447 22:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:25.447 22:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:25.447 22:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:25.447 22:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:25.447 22:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:25.720 22:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:25.720 22:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:25.720 22:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:25.720 22:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:25.720 22:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:25.720 22:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:25.720 22:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:25.720 22:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:25.721 22:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:25.721 22:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:25.980 22:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:25.980 "name": "BaseBdev3", 00:18:25.980 "aliases": [ 00:18:25.980 "040d481d-c9f0-40b8-b36d-57e76d984ebf" 00:18:25.980 ], 00:18:25.980 "product_name": "Malloc disk", 00:18:25.980 "block_size": 512, 00:18:25.980 "num_blocks": 65536, 00:18:25.980 "uuid": "040d481d-c9f0-40b8-b36d-57e76d984ebf", 00:18:25.980 "assigned_rate_limits": { 00:18:25.980 "rw_ios_per_sec": 0, 00:18:25.980 "rw_mbytes_per_sec": 0, 00:18:25.980 "r_mbytes_per_sec": 0, 00:18:25.980 "w_mbytes_per_sec": 0 00:18:25.980 }, 00:18:25.980 "claimed": true, 00:18:25.980 "claim_type": "exclusive_write", 00:18:25.980 "zoned": false, 00:18:25.980 "supported_io_types": { 00:18:25.980 "read": true, 00:18:25.980 "write": true, 00:18:25.980 "unmap": true, 00:18:25.980 "flush": true, 00:18:25.980 "reset": true, 00:18:25.980 "nvme_admin": false, 00:18:25.980 "nvme_io": false, 00:18:25.980 "nvme_io_md": false, 00:18:25.980 "write_zeroes": true, 00:18:25.980 "zcopy": true, 00:18:25.980 "get_zone_info": false, 00:18:25.980 "zone_management": false, 00:18:25.980 "zone_append": false, 00:18:25.980 "compare": false, 00:18:25.980 "compare_and_write": false, 00:18:25.980 "abort": true, 00:18:25.980 "seek_hole": false, 00:18:25.980 "seek_data": false, 00:18:25.980 "copy": true, 00:18:25.980 "nvme_iov_md": false 00:18:25.980 }, 00:18:25.980 "memory_domains": [ 00:18:25.980 { 00:18:25.980 "dma_device_id": "system", 00:18:25.980 "dma_device_type": 1 00:18:25.980 }, 00:18:25.980 { 00:18:25.980 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:25.980 "dma_device_type": 2 00:18:25.980 } 00:18:25.980 ], 00:18:25.980 "driver_specific": {} 00:18:25.980 }' 00:18:25.980 22:47:10 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:25.980 22:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:26.238 22:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:26.238 22:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:26.238 22:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:26.238 22:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:26.238 22:47:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:26.238 22:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:26.238 22:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:26.238 22:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:26.238 22:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:26.497 22:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:26.497 22:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:26.497 22:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:26.497 22:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:26.755 22:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:26.755 "name": "BaseBdev4", 00:18:26.755 "aliases": [ 00:18:26.755 "f7f129ea-b5f4-4305-94ca-4bb10b922a49" 00:18:26.755 ], 00:18:26.755 "product_name": "Malloc disk", 00:18:26.755 "block_size": 512, 00:18:26.755 "num_blocks": 65536, 00:18:26.755 "uuid": "f7f129ea-b5f4-4305-94ca-4bb10b922a49", 
00:18:26.755 "assigned_rate_limits": { 00:18:26.755 "rw_ios_per_sec": 0, 00:18:26.755 "rw_mbytes_per_sec": 0, 00:18:26.755 "r_mbytes_per_sec": 0, 00:18:26.755 "w_mbytes_per_sec": 0 00:18:26.755 }, 00:18:26.755 "claimed": true, 00:18:26.755 "claim_type": "exclusive_write", 00:18:26.755 "zoned": false, 00:18:26.755 "supported_io_types": { 00:18:26.755 "read": true, 00:18:26.755 "write": true, 00:18:26.755 "unmap": true, 00:18:26.755 "flush": true, 00:18:26.755 "reset": true, 00:18:26.755 "nvme_admin": false, 00:18:26.755 "nvme_io": false, 00:18:26.755 "nvme_io_md": false, 00:18:26.755 "write_zeroes": true, 00:18:26.755 "zcopy": true, 00:18:26.755 "get_zone_info": false, 00:18:26.755 "zone_management": false, 00:18:26.755 "zone_append": false, 00:18:26.755 "compare": false, 00:18:26.755 "compare_and_write": false, 00:18:26.755 "abort": true, 00:18:26.755 "seek_hole": false, 00:18:26.755 "seek_data": false, 00:18:26.755 "copy": true, 00:18:26.755 "nvme_iov_md": false 00:18:26.755 }, 00:18:26.755 "memory_domains": [ 00:18:26.755 { 00:18:26.755 "dma_device_id": "system", 00:18:26.755 "dma_device_type": 1 00:18:26.755 }, 00:18:26.755 { 00:18:26.755 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:26.755 "dma_device_type": 2 00:18:26.755 } 00:18:26.755 ], 00:18:26.755 "driver_specific": {} 00:18:26.755 }' 00:18:26.756 22:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:26.756 22:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:26.756 22:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:26.756 22:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:26.756 22:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:26.756 22:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:26.756 22:47:11 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:26.756 22:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:27.015 22:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:27.015 22:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:27.015 22:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:27.015 22:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:27.015 22:47:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:27.583 [2024-07-15 22:47:12.258102] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:27.583 [2024-07-15 22:47:12.258131] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:27.583 [2024-07-15 22:47:12.258180] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:27.583 22:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:27.583 22:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:18:27.583 22:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:27.583 22:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:27.583 22:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:18:27.583 22:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:18:27.583 22:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:27.583 22:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=offline 00:18:27.583 22:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:27.583 22:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:27.583 22:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:27.583 22:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:27.583 22:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:27.583 22:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:27.583 22:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:27.583 22:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:27.583 22:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:27.842 22:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:27.842 "name": "Existed_Raid", 00:18:27.842 "uuid": "53aaa096-29eb-4020-b4ed-02aaeba99553", 00:18:27.842 "strip_size_kb": 64, 00:18:27.842 "state": "offline", 00:18:27.842 "raid_level": "raid0", 00:18:27.842 "superblock": false, 00:18:27.842 "num_base_bdevs": 4, 00:18:27.842 "num_base_bdevs_discovered": 3, 00:18:27.842 "num_base_bdevs_operational": 3, 00:18:27.842 "base_bdevs_list": [ 00:18:27.842 { 00:18:27.842 "name": null, 00:18:27.842 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.842 "is_configured": false, 00:18:27.842 "data_offset": 0, 00:18:27.842 "data_size": 65536 00:18:27.842 }, 00:18:27.842 { 00:18:27.842 "name": "BaseBdev2", 00:18:27.842 "uuid": "42f56c6a-9b80-4fdd-a980-17631c601dc7", 00:18:27.842 "is_configured": true, 
00:18:27.842 "data_offset": 0, 00:18:27.842 "data_size": 65536 00:18:27.842 }, 00:18:27.842 { 00:18:27.842 "name": "BaseBdev3", 00:18:27.842 "uuid": "040d481d-c9f0-40b8-b36d-57e76d984ebf", 00:18:27.842 "is_configured": true, 00:18:27.842 "data_offset": 0, 00:18:27.842 "data_size": 65536 00:18:27.842 }, 00:18:27.842 { 00:18:27.842 "name": "BaseBdev4", 00:18:27.842 "uuid": "f7f129ea-b5f4-4305-94ca-4bb10b922a49", 00:18:27.842 "is_configured": true, 00:18:27.842 "data_offset": 0, 00:18:27.842 "data_size": 65536 00:18:27.842 } 00:18:27.842 ] 00:18:27.842 }' 00:18:27.842 22:47:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:27.842 22:47:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:28.779 22:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:28.779 22:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:28.779 22:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:28.779 22:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:28.779 22:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:28.779 22:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:28.779 22:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:29.038 [2024-07-15 22:47:13.860285] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:29.038 22:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:29.038 22:47:13 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:29.038 22:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.038 22:47:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:29.296 22:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:29.296 22:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:29.296 22:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:29.555 [2024-07-15 22:47:14.369749] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:29.555 22:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:29.555 22:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:29.555 22:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.555 22:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:29.816 22:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:29.816 22:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:29.816 22:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:30.125 [2024-07-15 22:47:14.923677] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:30.125 
[2024-07-15 22:47:14.923723] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbdf350 name Existed_Raid, state offline 00:18:30.126 22:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:30.126 22:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:30.126 22:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:30.126 22:47:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:30.393 22:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:30.393 22:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:30.393 22:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:18:30.393 22:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:30.393 22:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:30.393 22:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:30.961 BaseBdev2 00:18:30.961 22:47:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:30.961 22:47:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:30.961 22:47:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:30.961 22:47:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:30.961 22:47:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:30.961 22:47:15 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:30.961 22:47:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:31.221 22:47:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:31.480 [ 00:18:31.480 { 00:18:31.480 "name": "BaseBdev2", 00:18:31.480 "aliases": [ 00:18:31.480 "35fbfc34-f2e8-47a4-874f-9449a786257f" 00:18:31.480 ], 00:18:31.480 "product_name": "Malloc disk", 00:18:31.480 "block_size": 512, 00:18:31.480 "num_blocks": 65536, 00:18:31.480 "uuid": "35fbfc34-f2e8-47a4-874f-9449a786257f", 00:18:31.480 "assigned_rate_limits": { 00:18:31.480 "rw_ios_per_sec": 0, 00:18:31.480 "rw_mbytes_per_sec": 0, 00:18:31.480 "r_mbytes_per_sec": 0, 00:18:31.480 "w_mbytes_per_sec": 0 00:18:31.480 }, 00:18:31.480 "claimed": false, 00:18:31.480 "zoned": false, 00:18:31.480 "supported_io_types": { 00:18:31.480 "read": true, 00:18:31.480 "write": true, 00:18:31.480 "unmap": true, 00:18:31.480 "flush": true, 00:18:31.480 "reset": true, 00:18:31.480 "nvme_admin": false, 00:18:31.480 "nvme_io": false, 00:18:31.480 "nvme_io_md": false, 00:18:31.480 "write_zeroes": true, 00:18:31.480 "zcopy": true, 00:18:31.480 "get_zone_info": false, 00:18:31.480 "zone_management": false, 00:18:31.480 "zone_append": false, 00:18:31.480 "compare": false, 00:18:31.480 "compare_and_write": false, 00:18:31.480 "abort": true, 00:18:31.480 "seek_hole": false, 00:18:31.480 "seek_data": false, 00:18:31.480 "copy": true, 00:18:31.480 "nvme_iov_md": false 00:18:31.480 }, 00:18:31.480 "memory_domains": [ 00:18:31.480 { 00:18:31.480 "dma_device_id": "system", 00:18:31.480 "dma_device_type": 1 00:18:31.480 }, 00:18:31.480 { 00:18:31.480 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:18:31.480 "dma_device_type": 2 00:18:31.480 } 00:18:31.480 ], 00:18:31.480 "driver_specific": {} 00:18:31.480 } 00:18:31.480 ] 00:18:31.480 22:47:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:31.480 22:47:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:31.480 22:47:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:31.480 22:47:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:31.740 BaseBdev3 00:18:31.740 22:47:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:31.740 22:47:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:31.740 22:47:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:31.740 22:47:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:31.740 22:47:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:31.740 22:47:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:31.740 22:47:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:31.999 22:47:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:32.258 [ 00:18:32.258 { 00:18:32.258 "name": "BaseBdev3", 00:18:32.258 "aliases": [ 00:18:32.258 "d7cdcb26-c58a-4d63-b4ff-dcf9e4474822" 00:18:32.258 ], 00:18:32.258 "product_name": "Malloc 
disk", 00:18:32.258 "block_size": 512, 00:18:32.258 "num_blocks": 65536, 00:18:32.258 "uuid": "d7cdcb26-c58a-4d63-b4ff-dcf9e4474822", 00:18:32.258 "assigned_rate_limits": { 00:18:32.258 "rw_ios_per_sec": 0, 00:18:32.258 "rw_mbytes_per_sec": 0, 00:18:32.258 "r_mbytes_per_sec": 0, 00:18:32.258 "w_mbytes_per_sec": 0 00:18:32.258 }, 00:18:32.258 "claimed": false, 00:18:32.258 "zoned": false, 00:18:32.258 "supported_io_types": { 00:18:32.258 "read": true, 00:18:32.258 "write": true, 00:18:32.258 "unmap": true, 00:18:32.258 "flush": true, 00:18:32.258 "reset": true, 00:18:32.258 "nvme_admin": false, 00:18:32.258 "nvme_io": false, 00:18:32.258 "nvme_io_md": false, 00:18:32.258 "write_zeroes": true, 00:18:32.258 "zcopy": true, 00:18:32.258 "get_zone_info": false, 00:18:32.258 "zone_management": false, 00:18:32.258 "zone_append": false, 00:18:32.258 "compare": false, 00:18:32.258 "compare_and_write": false, 00:18:32.258 "abort": true, 00:18:32.258 "seek_hole": false, 00:18:32.258 "seek_data": false, 00:18:32.258 "copy": true, 00:18:32.258 "nvme_iov_md": false 00:18:32.258 }, 00:18:32.258 "memory_domains": [ 00:18:32.258 { 00:18:32.258 "dma_device_id": "system", 00:18:32.258 "dma_device_type": 1 00:18:32.258 }, 00:18:32.258 { 00:18:32.258 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:32.258 "dma_device_type": 2 00:18:32.258 } 00:18:32.258 ], 00:18:32.258 "driver_specific": {} 00:18:32.258 } 00:18:32.258 ] 00:18:32.258 22:47:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:32.258 22:47:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:32.258 22:47:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:32.258 22:47:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:32.518 BaseBdev4 00:18:32.518 22:47:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:18:32.518 22:47:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:32.518 22:47:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:32.518 22:47:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:32.518 22:47:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:32.518 22:47:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:32.518 22:47:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:33.086 22:47:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:33.346 [ 00:18:33.346 { 00:18:33.346 "name": "BaseBdev4", 00:18:33.346 "aliases": [ 00:18:33.346 "376f2fc1-bca2-4d32-9cea-dd1db437be97" 00:18:33.346 ], 00:18:33.346 "product_name": "Malloc disk", 00:18:33.346 "block_size": 512, 00:18:33.346 "num_blocks": 65536, 00:18:33.346 "uuid": "376f2fc1-bca2-4d32-9cea-dd1db437be97", 00:18:33.346 "assigned_rate_limits": { 00:18:33.346 "rw_ios_per_sec": 0, 00:18:33.346 "rw_mbytes_per_sec": 0, 00:18:33.346 "r_mbytes_per_sec": 0, 00:18:33.346 "w_mbytes_per_sec": 0 00:18:33.346 }, 00:18:33.346 "claimed": false, 00:18:33.346 "zoned": false, 00:18:33.346 "supported_io_types": { 00:18:33.346 "read": true, 00:18:33.346 "write": true, 00:18:33.346 "unmap": true, 00:18:33.346 "flush": true, 00:18:33.346 "reset": true, 00:18:33.346 "nvme_admin": false, 00:18:33.346 "nvme_io": false, 00:18:33.346 "nvme_io_md": false, 00:18:33.346 "write_zeroes": true, 00:18:33.346 "zcopy": true, 
00:18:33.346 "get_zone_info": false, 00:18:33.346 "zone_management": false, 00:18:33.346 "zone_append": false, 00:18:33.346 "compare": false, 00:18:33.346 "compare_and_write": false, 00:18:33.346 "abort": true, 00:18:33.346 "seek_hole": false, 00:18:33.346 "seek_data": false, 00:18:33.346 "copy": true, 00:18:33.346 "nvme_iov_md": false 00:18:33.346 }, 00:18:33.346 "memory_domains": [ 00:18:33.346 { 00:18:33.346 "dma_device_id": "system", 00:18:33.346 "dma_device_type": 1 00:18:33.346 }, 00:18:33.346 { 00:18:33.346 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:33.346 "dma_device_type": 2 00:18:33.346 } 00:18:33.346 ], 00:18:33.346 "driver_specific": {} 00:18:33.346 } 00:18:33.346 ] 00:18:33.346 22:47:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:33.346 22:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:33.346 22:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:33.346 22:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:33.915 [2024-07-15 22:47:18.630582] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:33.915 [2024-07-15 22:47:18.630624] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:33.915 [2024-07-15 22:47:18.630644] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:33.915 [2024-07-15 22:47:18.632022] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:33.915 [2024-07-15 22:47:18.632065] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:33.915 22:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # 
verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:33.915 22:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:33.915 22:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:33.915 22:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:33.915 22:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:33.915 22:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:33.915 22:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:33.915 22:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:33.915 22:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:33.915 22:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:33.915 22:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:33.915 22:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:34.174 22:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:34.174 "name": "Existed_Raid", 00:18:34.174 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:34.174 "strip_size_kb": 64, 00:18:34.174 "state": "configuring", 00:18:34.174 "raid_level": "raid0", 00:18:34.174 "superblock": false, 00:18:34.174 "num_base_bdevs": 4, 00:18:34.174 "num_base_bdevs_discovered": 3, 00:18:34.174 "num_base_bdevs_operational": 4, 00:18:34.174 "base_bdevs_list": [ 00:18:34.174 { 00:18:34.174 "name": "BaseBdev1", 00:18:34.174 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:18:34.175 "is_configured": false, 00:18:34.175 "data_offset": 0, 00:18:34.175 "data_size": 0 00:18:34.175 }, 00:18:34.175 { 00:18:34.175 "name": "BaseBdev2", 00:18:34.175 "uuid": "35fbfc34-f2e8-47a4-874f-9449a786257f", 00:18:34.175 "is_configured": true, 00:18:34.175 "data_offset": 0, 00:18:34.175 "data_size": 65536 00:18:34.175 }, 00:18:34.175 { 00:18:34.175 "name": "BaseBdev3", 00:18:34.175 "uuid": "d7cdcb26-c58a-4d63-b4ff-dcf9e4474822", 00:18:34.175 "is_configured": true, 00:18:34.175 "data_offset": 0, 00:18:34.175 "data_size": 65536 00:18:34.175 }, 00:18:34.175 { 00:18:34.175 "name": "BaseBdev4", 00:18:34.175 "uuid": "376f2fc1-bca2-4d32-9cea-dd1db437be97", 00:18:34.175 "is_configured": true, 00:18:34.175 "data_offset": 0, 00:18:34.175 "data_size": 65536 00:18:34.175 } 00:18:34.175 ] 00:18:34.175 }' 00:18:34.175 22:47:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:34.175 22:47:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:34.742 22:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:35.002 [2024-07-15 22:47:19.673310] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:35.002 22:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:35.002 22:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:35.002 22:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:35.002 22:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:35.002 22:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:35.002 
22:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:35.002 22:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:35.002 22:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:35.002 22:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:35.002 22:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:35.002 22:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:35.002 22:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:35.261 22:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:35.261 "name": "Existed_Raid", 00:18:35.261 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:35.261 "strip_size_kb": 64, 00:18:35.261 "state": "configuring", 00:18:35.261 "raid_level": "raid0", 00:18:35.261 "superblock": false, 00:18:35.261 "num_base_bdevs": 4, 00:18:35.261 "num_base_bdevs_discovered": 2, 00:18:35.261 "num_base_bdevs_operational": 4, 00:18:35.261 "base_bdevs_list": [ 00:18:35.261 { 00:18:35.261 "name": "BaseBdev1", 00:18:35.261 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:35.261 "is_configured": false, 00:18:35.261 "data_offset": 0, 00:18:35.261 "data_size": 0 00:18:35.261 }, 00:18:35.261 { 00:18:35.261 "name": null, 00:18:35.261 "uuid": "35fbfc34-f2e8-47a4-874f-9449a786257f", 00:18:35.261 "is_configured": false, 00:18:35.261 "data_offset": 0, 00:18:35.261 "data_size": 65536 00:18:35.261 }, 00:18:35.261 { 00:18:35.261 "name": "BaseBdev3", 00:18:35.261 "uuid": "d7cdcb26-c58a-4d63-b4ff-dcf9e4474822", 00:18:35.261 "is_configured": true, 00:18:35.261 "data_offset": 0, 
00:18:35.261 "data_size": 65536 00:18:35.261 }, 00:18:35.261 { 00:18:35.261 "name": "BaseBdev4", 00:18:35.261 "uuid": "376f2fc1-bca2-4d32-9cea-dd1db437be97", 00:18:35.261 "is_configured": true, 00:18:35.261 "data_offset": 0, 00:18:35.261 "data_size": 65536 00:18:35.261 } 00:18:35.261 ] 00:18:35.261 }' 00:18:35.261 22:47:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:35.261 22:47:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:36.198 22:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:36.198 22:47:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:36.457 22:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:36.457 22:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:36.457 [2024-07-15 22:47:21.354393] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:36.457 BaseBdev1 00:18:36.717 22:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:36.717 22:47:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:36.717 22:47:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:36.717 22:47:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:36.717 22:47:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:36.717 22:47:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:36.717 
22:47:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:36.976 22:47:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:37.236 [ 00:18:37.236 { 00:18:37.236 "name": "BaseBdev1", 00:18:37.236 "aliases": [ 00:18:37.236 "f40738c4-2e4c-4311-b3e4-c59d5347d141" 00:18:37.236 ], 00:18:37.236 "product_name": "Malloc disk", 00:18:37.236 "block_size": 512, 00:18:37.236 "num_blocks": 65536, 00:18:37.236 "uuid": "f40738c4-2e4c-4311-b3e4-c59d5347d141", 00:18:37.236 "assigned_rate_limits": { 00:18:37.236 "rw_ios_per_sec": 0, 00:18:37.236 "rw_mbytes_per_sec": 0, 00:18:37.236 "r_mbytes_per_sec": 0, 00:18:37.236 "w_mbytes_per_sec": 0 00:18:37.236 }, 00:18:37.236 "claimed": true, 00:18:37.236 "claim_type": "exclusive_write", 00:18:37.236 "zoned": false, 00:18:37.236 "supported_io_types": { 00:18:37.236 "read": true, 00:18:37.236 "write": true, 00:18:37.236 "unmap": true, 00:18:37.236 "flush": true, 00:18:37.236 "reset": true, 00:18:37.236 "nvme_admin": false, 00:18:37.236 "nvme_io": false, 00:18:37.236 "nvme_io_md": false, 00:18:37.236 "write_zeroes": true, 00:18:37.236 "zcopy": true, 00:18:37.236 "get_zone_info": false, 00:18:37.236 "zone_management": false, 00:18:37.236 "zone_append": false, 00:18:37.236 "compare": false, 00:18:37.236 "compare_and_write": false, 00:18:37.236 "abort": true, 00:18:37.236 "seek_hole": false, 00:18:37.236 "seek_data": false, 00:18:37.236 "copy": true, 00:18:37.236 "nvme_iov_md": false 00:18:37.236 }, 00:18:37.236 "memory_domains": [ 00:18:37.236 { 00:18:37.236 "dma_device_id": "system", 00:18:37.236 "dma_device_type": 1 00:18:37.236 }, 00:18:37.236 { 00:18:37.236 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:37.236 "dma_device_type": 2 00:18:37.236 } 
00:18:37.236 ], 00:18:37.236 "driver_specific": {} 00:18:37.236 } 00:18:37.236 ] 00:18:37.236 22:47:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:37.236 22:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:37.236 22:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:37.236 22:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:37.236 22:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:37.236 22:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:37.236 22:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:37.236 22:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:37.236 22:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:37.236 22:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:37.236 22:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:37.236 22:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.236 22:47:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:37.495 22:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:37.495 "name": "Existed_Raid", 00:18:37.495 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:37.495 "strip_size_kb": 64, 00:18:37.495 "state": "configuring", 00:18:37.495 "raid_level": "raid0", 00:18:37.495 
"superblock": false, 00:18:37.495 "num_base_bdevs": 4, 00:18:37.495 "num_base_bdevs_discovered": 3, 00:18:37.495 "num_base_bdevs_operational": 4, 00:18:37.495 "base_bdevs_list": [ 00:18:37.495 { 00:18:37.495 "name": "BaseBdev1", 00:18:37.495 "uuid": "f40738c4-2e4c-4311-b3e4-c59d5347d141", 00:18:37.495 "is_configured": true, 00:18:37.495 "data_offset": 0, 00:18:37.495 "data_size": 65536 00:18:37.495 }, 00:18:37.495 { 00:18:37.495 "name": null, 00:18:37.495 "uuid": "35fbfc34-f2e8-47a4-874f-9449a786257f", 00:18:37.495 "is_configured": false, 00:18:37.495 "data_offset": 0, 00:18:37.495 "data_size": 65536 00:18:37.495 }, 00:18:37.495 { 00:18:37.495 "name": "BaseBdev3", 00:18:37.495 "uuid": "d7cdcb26-c58a-4d63-b4ff-dcf9e4474822", 00:18:37.495 "is_configured": true, 00:18:37.495 "data_offset": 0, 00:18:37.495 "data_size": 65536 00:18:37.495 }, 00:18:37.495 { 00:18:37.495 "name": "BaseBdev4", 00:18:37.495 "uuid": "376f2fc1-bca2-4d32-9cea-dd1db437be97", 00:18:37.495 "is_configured": true, 00:18:37.495 "data_offset": 0, 00:18:37.495 "data_size": 65536 00:18:37.495 } 00:18:37.495 ] 00:18:37.495 }' 00:18:37.495 22:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:37.495 22:47:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:38.062 22:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:38.062 22:47:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:38.628 22:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:38.628 22:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:39.194 [2024-07-15 
22:47:23.897157] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:39.194 22:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:39.194 22:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:39.194 22:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:39.194 22:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:39.194 22:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:39.194 22:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:39.194 22:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:39.194 22:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:39.194 22:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:39.194 22:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:39.194 22:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:39.194 22:47:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:39.452 22:47:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:39.452 "name": "Existed_Raid", 00:18:39.452 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:39.452 "strip_size_kb": 64, 00:18:39.452 "state": "configuring", 00:18:39.452 "raid_level": "raid0", 00:18:39.452 "superblock": false, 00:18:39.452 "num_base_bdevs": 4, 00:18:39.452 "num_base_bdevs_discovered": 2, 
00:18:39.452 "num_base_bdevs_operational": 4, 00:18:39.452 "base_bdevs_list": [ 00:18:39.452 { 00:18:39.452 "name": "BaseBdev1", 00:18:39.452 "uuid": "f40738c4-2e4c-4311-b3e4-c59d5347d141", 00:18:39.452 "is_configured": true, 00:18:39.452 "data_offset": 0, 00:18:39.452 "data_size": 65536 00:18:39.452 }, 00:18:39.452 { 00:18:39.452 "name": null, 00:18:39.452 "uuid": "35fbfc34-f2e8-47a4-874f-9449a786257f", 00:18:39.452 "is_configured": false, 00:18:39.452 "data_offset": 0, 00:18:39.452 "data_size": 65536 00:18:39.452 }, 00:18:39.452 { 00:18:39.452 "name": null, 00:18:39.452 "uuid": "d7cdcb26-c58a-4d63-b4ff-dcf9e4474822", 00:18:39.452 "is_configured": false, 00:18:39.452 "data_offset": 0, 00:18:39.452 "data_size": 65536 00:18:39.452 }, 00:18:39.452 { 00:18:39.452 "name": "BaseBdev4", 00:18:39.452 "uuid": "376f2fc1-bca2-4d32-9cea-dd1db437be97", 00:18:39.452 "is_configured": true, 00:18:39.452 "data_offset": 0, 00:18:39.452 "data_size": 65536 00:18:39.452 } 00:18:39.452 ] 00:18:39.452 }' 00:18:39.452 22:47:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:39.452 22:47:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:40.385 22:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:40.385 22:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:40.642 22:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:40.642 22:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:40.900 [2024-07-15 22:47:25.557594] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 
00:18:40.900 22:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:40.900 22:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:40.900 22:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:40.900 22:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:40.900 22:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:40.900 22:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:40.900 22:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:40.900 22:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:40.900 22:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:40.900 22:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:40.900 22:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:40.900 22:47:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:41.465 22:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:41.465 "name": "Existed_Raid", 00:18:41.465 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:41.465 "strip_size_kb": 64, 00:18:41.465 "state": "configuring", 00:18:41.465 "raid_level": "raid0", 00:18:41.465 "superblock": false, 00:18:41.465 "num_base_bdevs": 4, 00:18:41.465 "num_base_bdevs_discovered": 3, 00:18:41.465 "num_base_bdevs_operational": 4, 00:18:41.465 "base_bdevs_list": [ 
00:18:41.465 { 00:18:41.465 "name": "BaseBdev1", 00:18:41.465 "uuid": "f40738c4-2e4c-4311-b3e4-c59d5347d141", 00:18:41.465 "is_configured": true, 00:18:41.465 "data_offset": 0, 00:18:41.465 "data_size": 65536 00:18:41.465 }, 00:18:41.465 { 00:18:41.465 "name": null, 00:18:41.465 "uuid": "35fbfc34-f2e8-47a4-874f-9449a786257f", 00:18:41.465 "is_configured": false, 00:18:41.465 "data_offset": 0, 00:18:41.465 "data_size": 65536 00:18:41.465 }, 00:18:41.465 { 00:18:41.465 "name": "BaseBdev3", 00:18:41.465 "uuid": "d7cdcb26-c58a-4d63-b4ff-dcf9e4474822", 00:18:41.465 "is_configured": true, 00:18:41.465 "data_offset": 0, 00:18:41.465 "data_size": 65536 00:18:41.465 }, 00:18:41.465 { 00:18:41.465 "name": "BaseBdev4", 00:18:41.465 "uuid": "376f2fc1-bca2-4d32-9cea-dd1db437be97", 00:18:41.465 "is_configured": true, 00:18:41.465 "data_offset": 0, 00:18:41.465 "data_size": 65536 00:18:41.465 } 00:18:41.465 ] 00:18:41.465 }' 00:18:41.465 22:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:41.465 22:47:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:42.401 22:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:42.401 22:47:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:42.401 22:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:42.401 22:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:42.968 [2024-07-15 22:47:27.695298] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:42.968 22:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state 
Existed_Raid configuring raid0 64 4 00:18:42.968 22:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:42.968 22:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:42.968 22:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:42.968 22:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:42.968 22:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:42.968 22:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:42.968 22:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:42.968 22:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:42.968 22:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:42.968 22:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:42.968 22:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:43.227 22:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:43.227 "name": "Existed_Raid", 00:18:43.227 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:43.227 "strip_size_kb": 64, 00:18:43.227 "state": "configuring", 00:18:43.227 "raid_level": "raid0", 00:18:43.227 "superblock": false, 00:18:43.227 "num_base_bdevs": 4, 00:18:43.227 "num_base_bdevs_discovered": 2, 00:18:43.227 "num_base_bdevs_operational": 4, 00:18:43.228 "base_bdevs_list": [ 00:18:43.228 { 00:18:43.228 "name": null, 00:18:43.228 "uuid": "f40738c4-2e4c-4311-b3e4-c59d5347d141", 
00:18:43.228 "is_configured": false, 00:18:43.228 "data_offset": 0, 00:18:43.228 "data_size": 65536 00:18:43.228 }, 00:18:43.228 { 00:18:43.228 "name": null, 00:18:43.228 "uuid": "35fbfc34-f2e8-47a4-874f-9449a786257f", 00:18:43.228 "is_configured": false, 00:18:43.228 "data_offset": 0, 00:18:43.228 "data_size": 65536 00:18:43.228 }, 00:18:43.228 { 00:18:43.228 "name": "BaseBdev3", 00:18:43.228 "uuid": "d7cdcb26-c58a-4d63-b4ff-dcf9e4474822", 00:18:43.228 "is_configured": true, 00:18:43.228 "data_offset": 0, 00:18:43.228 "data_size": 65536 00:18:43.228 }, 00:18:43.228 { 00:18:43.228 "name": "BaseBdev4", 00:18:43.228 "uuid": "376f2fc1-bca2-4d32-9cea-dd1db437be97", 00:18:43.228 "is_configured": true, 00:18:43.228 "data_offset": 0, 00:18:43.228 "data_size": 65536 00:18:43.228 } 00:18:43.228 ] 00:18:43.228 }' 00:18:43.228 22:47:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:43.228 22:47:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:43.796 22:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.796 22:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:44.055 22:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:44.055 22:47:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:44.623 [2024-07-15 22:47:29.316062] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:44.623 22:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:44.623 22:47:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:44.623 22:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:44.623 22:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:44.623 22:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:44.623 22:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:44.624 22:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:44.624 22:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:44.624 22:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:44.624 22:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:44.624 22:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.624 22:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:44.883 22:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:44.883 "name": "Existed_Raid", 00:18:44.883 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:44.883 "strip_size_kb": 64, 00:18:44.883 "state": "configuring", 00:18:44.883 "raid_level": "raid0", 00:18:44.883 "superblock": false, 00:18:44.883 "num_base_bdevs": 4, 00:18:44.883 "num_base_bdevs_discovered": 3, 00:18:44.883 "num_base_bdevs_operational": 4, 00:18:44.883 "base_bdevs_list": [ 00:18:44.883 { 00:18:44.883 "name": null, 00:18:44.883 "uuid": "f40738c4-2e4c-4311-b3e4-c59d5347d141", 00:18:44.883 "is_configured": false, 00:18:44.883 "data_offset": 0, 
00:18:44.883 "data_size": 65536 00:18:44.883 }, 00:18:44.883 { 00:18:44.883 "name": "BaseBdev2", 00:18:44.883 "uuid": "35fbfc34-f2e8-47a4-874f-9449a786257f", 00:18:44.883 "is_configured": true, 00:18:44.883 "data_offset": 0, 00:18:44.883 "data_size": 65536 00:18:44.883 }, 00:18:44.883 { 00:18:44.883 "name": "BaseBdev3", 00:18:44.883 "uuid": "d7cdcb26-c58a-4d63-b4ff-dcf9e4474822", 00:18:44.883 "is_configured": true, 00:18:44.883 "data_offset": 0, 00:18:44.883 "data_size": 65536 00:18:44.883 }, 00:18:44.883 { 00:18:44.883 "name": "BaseBdev4", 00:18:44.883 "uuid": "376f2fc1-bca2-4d32-9cea-dd1db437be97", 00:18:44.883 "is_configured": true, 00:18:44.883 "data_offset": 0, 00:18:44.883 "data_size": 65536 00:18:44.883 } 00:18:44.883 ] 00:18:44.883 }' 00:18:44.883 22:47:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:44.883 22:47:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:45.451 22:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:45.451 22:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:45.710 22:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:45.710 22:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:45.710 22:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:45.969 22:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u f40738c4-2e4c-4311-b3e4-c59d5347d141 00:18:46.228 
[2024-07-15 22:47:30.912911] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:46.228 [2024-07-15 22:47:30.912961] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xbe3040 00:18:46.228 [2024-07-15 22:47:30.912971] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:18:46.228 [2024-07-15 22:47:30.913172] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbdea70 00:18:46.228 [2024-07-15 22:47:30.913292] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbe3040 00:18:46.228 [2024-07-15 22:47:30.913302] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xbe3040 00:18:46.228 [2024-07-15 22:47:30.913468] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:46.228 NewBaseBdev 00:18:46.228 22:47:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:46.228 22:47:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:18:46.228 22:47:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:46.228 22:47:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:46.228 22:47:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:46.228 22:47:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:46.228 22:47:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:46.796 22:47:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 
00:18:46.796 [ 00:18:46.796 { 00:18:46.796 "name": "NewBaseBdev", 00:18:46.796 "aliases": [ 00:18:46.796 "f40738c4-2e4c-4311-b3e4-c59d5347d141" 00:18:46.796 ], 00:18:46.796 "product_name": "Malloc disk", 00:18:46.796 "block_size": 512, 00:18:46.796 "num_blocks": 65536, 00:18:46.796 "uuid": "f40738c4-2e4c-4311-b3e4-c59d5347d141", 00:18:46.796 "assigned_rate_limits": { 00:18:46.796 "rw_ios_per_sec": 0, 00:18:46.796 "rw_mbytes_per_sec": 0, 00:18:46.796 "r_mbytes_per_sec": 0, 00:18:46.796 "w_mbytes_per_sec": 0 00:18:46.796 }, 00:18:46.796 "claimed": true, 00:18:46.796 "claim_type": "exclusive_write", 00:18:46.796 "zoned": false, 00:18:46.796 "supported_io_types": { 00:18:46.796 "read": true, 00:18:46.796 "write": true, 00:18:46.796 "unmap": true, 00:18:46.796 "flush": true, 00:18:46.796 "reset": true, 00:18:46.796 "nvme_admin": false, 00:18:46.796 "nvme_io": false, 00:18:46.796 "nvme_io_md": false, 00:18:46.796 "write_zeroes": true, 00:18:46.796 "zcopy": true, 00:18:46.796 "get_zone_info": false, 00:18:46.796 "zone_management": false, 00:18:46.796 "zone_append": false, 00:18:46.796 "compare": false, 00:18:46.796 "compare_and_write": false, 00:18:46.796 "abort": true, 00:18:46.796 "seek_hole": false, 00:18:46.796 "seek_data": false, 00:18:46.796 "copy": true, 00:18:46.796 "nvme_iov_md": false 00:18:46.796 }, 00:18:46.796 "memory_domains": [ 00:18:46.796 { 00:18:46.796 "dma_device_id": "system", 00:18:46.796 "dma_device_type": 1 00:18:46.796 }, 00:18:46.796 { 00:18:46.796 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:46.796 "dma_device_type": 2 00:18:46.796 } 00:18:46.796 ], 00:18:46.796 "driver_specific": {} 00:18:46.796 } 00:18:46.796 ] 00:18:47.055 22:47:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:47.055 22:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:47.055 22:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # 
local raid_bdev_name=Existed_Raid 00:18:47.055 22:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:47.055 22:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:47.055 22:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:47.055 22:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:47.055 22:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:47.055 22:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:47.055 22:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:47.055 22:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:47.055 22:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.055 22:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:47.316 22:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:47.316 "name": "Existed_Raid", 00:18:47.316 "uuid": "7acdcf5f-2746-4db5-a3ab-bdf795d5b37d", 00:18:47.316 "strip_size_kb": 64, 00:18:47.316 "state": "online", 00:18:47.316 "raid_level": "raid0", 00:18:47.316 "superblock": false, 00:18:47.316 "num_base_bdevs": 4, 00:18:47.316 "num_base_bdevs_discovered": 4, 00:18:47.316 "num_base_bdevs_operational": 4, 00:18:47.316 "base_bdevs_list": [ 00:18:47.316 { 00:18:47.316 "name": "NewBaseBdev", 00:18:47.316 "uuid": "f40738c4-2e4c-4311-b3e4-c59d5347d141", 00:18:47.316 "is_configured": true, 00:18:47.316 "data_offset": 0, 00:18:47.316 "data_size": 65536 00:18:47.316 }, 00:18:47.316 { 
00:18:47.316 "name": "BaseBdev2", 00:18:47.316 "uuid": "35fbfc34-f2e8-47a4-874f-9449a786257f", 00:18:47.316 "is_configured": true, 00:18:47.316 "data_offset": 0, 00:18:47.316 "data_size": 65536 00:18:47.316 }, 00:18:47.316 { 00:18:47.316 "name": "BaseBdev3", 00:18:47.316 "uuid": "d7cdcb26-c58a-4d63-b4ff-dcf9e4474822", 00:18:47.316 "is_configured": true, 00:18:47.316 "data_offset": 0, 00:18:47.316 "data_size": 65536 00:18:47.316 }, 00:18:47.316 { 00:18:47.316 "name": "BaseBdev4", 00:18:47.316 "uuid": "376f2fc1-bca2-4d32-9cea-dd1db437be97", 00:18:47.316 "is_configured": true, 00:18:47.316 "data_offset": 0, 00:18:47.316 "data_size": 65536 00:18:47.316 } 00:18:47.316 ] 00:18:47.316 }' 00:18:47.316 22:47:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:47.316 22:47:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:47.925 22:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:47.925 22:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:47.925 22:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:47.925 22:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:47.925 22:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:47.925 22:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:47.925 22:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:47.925 22:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:47.925 [2024-07-15 22:47:32.802277] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 
00:18:47.925 22:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:47.925 "name": "Existed_Raid", 00:18:47.925 "aliases": [ 00:18:47.925 "7acdcf5f-2746-4db5-a3ab-bdf795d5b37d" 00:18:47.925 ], 00:18:47.925 "product_name": "Raid Volume", 00:18:47.925 "block_size": 512, 00:18:47.925 "num_blocks": 262144, 00:18:47.925 "uuid": "7acdcf5f-2746-4db5-a3ab-bdf795d5b37d", 00:18:47.925 "assigned_rate_limits": { 00:18:47.925 "rw_ios_per_sec": 0, 00:18:47.925 "rw_mbytes_per_sec": 0, 00:18:47.925 "r_mbytes_per_sec": 0, 00:18:47.925 "w_mbytes_per_sec": 0 00:18:47.925 }, 00:18:47.925 "claimed": false, 00:18:47.925 "zoned": false, 00:18:47.925 "supported_io_types": { 00:18:47.925 "read": true, 00:18:47.925 "write": true, 00:18:47.925 "unmap": true, 00:18:47.925 "flush": true, 00:18:47.925 "reset": true, 00:18:47.925 "nvme_admin": false, 00:18:47.925 "nvme_io": false, 00:18:47.925 "nvme_io_md": false, 00:18:47.925 "write_zeroes": true, 00:18:47.925 "zcopy": false, 00:18:47.925 "get_zone_info": false, 00:18:47.925 "zone_management": false, 00:18:47.925 "zone_append": false, 00:18:47.925 "compare": false, 00:18:47.925 "compare_and_write": false, 00:18:47.925 "abort": false, 00:18:47.925 "seek_hole": false, 00:18:47.925 "seek_data": false, 00:18:47.925 "copy": false, 00:18:47.925 "nvme_iov_md": false 00:18:47.925 }, 00:18:47.925 "memory_domains": [ 00:18:47.925 { 00:18:47.925 "dma_device_id": "system", 00:18:47.925 "dma_device_type": 1 00:18:47.925 }, 00:18:47.925 { 00:18:47.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:47.925 "dma_device_type": 2 00:18:47.925 }, 00:18:47.925 { 00:18:47.925 "dma_device_id": "system", 00:18:47.925 "dma_device_type": 1 00:18:47.925 }, 00:18:47.925 { 00:18:47.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:47.925 "dma_device_type": 2 00:18:47.925 }, 00:18:47.925 { 00:18:47.925 "dma_device_id": "system", 00:18:47.925 "dma_device_type": 1 00:18:47.925 }, 00:18:47.925 { 00:18:47.925 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:18:47.925 "dma_device_type": 2 00:18:47.925 }, 00:18:47.925 { 00:18:47.925 "dma_device_id": "system", 00:18:47.925 "dma_device_type": 1 00:18:47.925 }, 00:18:47.925 { 00:18:47.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:47.925 "dma_device_type": 2 00:18:47.925 } 00:18:47.925 ], 00:18:47.925 "driver_specific": { 00:18:47.925 "raid": { 00:18:47.925 "uuid": "7acdcf5f-2746-4db5-a3ab-bdf795d5b37d", 00:18:47.925 "strip_size_kb": 64, 00:18:47.925 "state": "online", 00:18:47.925 "raid_level": "raid0", 00:18:47.925 "superblock": false, 00:18:47.925 "num_base_bdevs": 4, 00:18:47.925 "num_base_bdevs_discovered": 4, 00:18:47.925 "num_base_bdevs_operational": 4, 00:18:47.925 "base_bdevs_list": [ 00:18:47.925 { 00:18:47.925 "name": "NewBaseBdev", 00:18:47.925 "uuid": "f40738c4-2e4c-4311-b3e4-c59d5347d141", 00:18:47.925 "is_configured": true, 00:18:47.925 "data_offset": 0, 00:18:47.925 "data_size": 65536 00:18:47.925 }, 00:18:47.925 { 00:18:47.925 "name": "BaseBdev2", 00:18:47.925 "uuid": "35fbfc34-f2e8-47a4-874f-9449a786257f", 00:18:47.925 "is_configured": true, 00:18:47.925 "data_offset": 0, 00:18:47.925 "data_size": 65536 00:18:47.925 }, 00:18:47.925 { 00:18:47.925 "name": "BaseBdev3", 00:18:47.925 "uuid": "d7cdcb26-c58a-4d63-b4ff-dcf9e4474822", 00:18:47.925 "is_configured": true, 00:18:47.925 "data_offset": 0, 00:18:47.925 "data_size": 65536 00:18:47.925 }, 00:18:47.925 { 00:18:47.925 "name": "BaseBdev4", 00:18:47.925 "uuid": "376f2fc1-bca2-4d32-9cea-dd1db437be97", 00:18:47.925 "is_configured": true, 00:18:47.925 "data_offset": 0, 00:18:47.925 "data_size": 65536 00:18:47.925 } 00:18:47.925 ] 00:18:47.925 } 00:18:47.925 } 00:18:47.925 }' 00:18:47.925 22:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:48.184 22:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:48.184 
BaseBdev2 00:18:48.184 BaseBdev3 00:18:48.184 BaseBdev4' 00:18:48.184 22:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:48.184 22:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:48.184 22:47:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:48.442 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:48.442 "name": "NewBaseBdev", 00:18:48.442 "aliases": [ 00:18:48.442 "f40738c4-2e4c-4311-b3e4-c59d5347d141" 00:18:48.442 ], 00:18:48.442 "product_name": "Malloc disk", 00:18:48.442 "block_size": 512, 00:18:48.442 "num_blocks": 65536, 00:18:48.442 "uuid": "f40738c4-2e4c-4311-b3e4-c59d5347d141", 00:18:48.442 "assigned_rate_limits": { 00:18:48.442 "rw_ios_per_sec": 0, 00:18:48.442 "rw_mbytes_per_sec": 0, 00:18:48.442 "r_mbytes_per_sec": 0, 00:18:48.442 "w_mbytes_per_sec": 0 00:18:48.442 }, 00:18:48.442 "claimed": true, 00:18:48.442 "claim_type": "exclusive_write", 00:18:48.442 "zoned": false, 00:18:48.442 "supported_io_types": { 00:18:48.442 "read": true, 00:18:48.442 "write": true, 00:18:48.442 "unmap": true, 00:18:48.442 "flush": true, 00:18:48.442 "reset": true, 00:18:48.442 "nvme_admin": false, 00:18:48.442 "nvme_io": false, 00:18:48.442 "nvme_io_md": false, 00:18:48.442 "write_zeroes": true, 00:18:48.442 "zcopy": true, 00:18:48.442 "get_zone_info": false, 00:18:48.442 "zone_management": false, 00:18:48.442 "zone_append": false, 00:18:48.442 "compare": false, 00:18:48.442 "compare_and_write": false, 00:18:48.442 "abort": true, 00:18:48.442 "seek_hole": false, 00:18:48.442 "seek_data": false, 00:18:48.442 "copy": true, 00:18:48.442 "nvme_iov_md": false 00:18:48.442 }, 00:18:48.442 "memory_domains": [ 00:18:48.442 { 00:18:48.442 "dma_device_id": "system", 00:18:48.442 "dma_device_type": 1 
00:18:48.442 }, 00:18:48.442 { 00:18:48.442 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:48.442 "dma_device_type": 2 00:18:48.442 } 00:18:48.442 ], 00:18:48.442 "driver_specific": {} 00:18:48.442 }' 00:18:48.442 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:48.442 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:48.442 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:48.442 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:48.442 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:48.442 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:48.442 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:48.442 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:48.700 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:48.700 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:48.700 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:48.700 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:48.700 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:48.700 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:48.700 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:48.957 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:48.957 "name": "BaseBdev2", 
00:18:48.957 "aliases": [ 00:18:48.957 "35fbfc34-f2e8-47a4-874f-9449a786257f" 00:18:48.957 ], 00:18:48.957 "product_name": "Malloc disk", 00:18:48.957 "block_size": 512, 00:18:48.957 "num_blocks": 65536, 00:18:48.957 "uuid": "35fbfc34-f2e8-47a4-874f-9449a786257f", 00:18:48.957 "assigned_rate_limits": { 00:18:48.957 "rw_ios_per_sec": 0, 00:18:48.957 "rw_mbytes_per_sec": 0, 00:18:48.957 "r_mbytes_per_sec": 0, 00:18:48.957 "w_mbytes_per_sec": 0 00:18:48.957 }, 00:18:48.957 "claimed": true, 00:18:48.957 "claim_type": "exclusive_write", 00:18:48.957 "zoned": false, 00:18:48.957 "supported_io_types": { 00:18:48.957 "read": true, 00:18:48.957 "write": true, 00:18:48.957 "unmap": true, 00:18:48.957 "flush": true, 00:18:48.957 "reset": true, 00:18:48.957 "nvme_admin": false, 00:18:48.957 "nvme_io": false, 00:18:48.957 "nvme_io_md": false, 00:18:48.957 "write_zeroes": true, 00:18:48.957 "zcopy": true, 00:18:48.957 "get_zone_info": false, 00:18:48.957 "zone_management": false, 00:18:48.957 "zone_append": false, 00:18:48.957 "compare": false, 00:18:48.957 "compare_and_write": false, 00:18:48.957 "abort": true, 00:18:48.957 "seek_hole": false, 00:18:48.957 "seek_data": false, 00:18:48.957 "copy": true, 00:18:48.957 "nvme_iov_md": false 00:18:48.957 }, 00:18:48.957 "memory_domains": [ 00:18:48.957 { 00:18:48.957 "dma_device_id": "system", 00:18:48.957 "dma_device_type": 1 00:18:48.957 }, 00:18:48.957 { 00:18:48.957 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:48.957 "dma_device_type": 2 00:18:48.957 } 00:18:48.957 ], 00:18:48.957 "driver_specific": {} 00:18:48.957 }' 00:18:48.957 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:48.957 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:48.957 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:48.957 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:18:48.957 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:49.215 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:49.215 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:49.215 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:49.215 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:49.215 22:47:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:49.215 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:49.215 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:49.215 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:49.215 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:49.215 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:49.474 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:49.474 "name": "BaseBdev3", 00:18:49.474 "aliases": [ 00:18:49.474 "d7cdcb26-c58a-4d63-b4ff-dcf9e4474822" 00:18:49.474 ], 00:18:49.474 "product_name": "Malloc disk", 00:18:49.474 "block_size": 512, 00:18:49.474 "num_blocks": 65536, 00:18:49.474 "uuid": "d7cdcb26-c58a-4d63-b4ff-dcf9e4474822", 00:18:49.474 "assigned_rate_limits": { 00:18:49.474 "rw_ios_per_sec": 0, 00:18:49.474 "rw_mbytes_per_sec": 0, 00:18:49.474 "r_mbytes_per_sec": 0, 00:18:49.474 "w_mbytes_per_sec": 0 00:18:49.474 }, 00:18:49.474 "claimed": true, 00:18:49.474 "claim_type": "exclusive_write", 00:18:49.474 "zoned": false, 00:18:49.474 "supported_io_types": { 00:18:49.474 
"read": true, 00:18:49.474 "write": true, 00:18:49.474 "unmap": true, 00:18:49.474 "flush": true, 00:18:49.474 "reset": true, 00:18:49.474 "nvme_admin": false, 00:18:49.474 "nvme_io": false, 00:18:49.474 "nvme_io_md": false, 00:18:49.474 "write_zeroes": true, 00:18:49.474 "zcopy": true, 00:18:49.474 "get_zone_info": false, 00:18:49.474 "zone_management": false, 00:18:49.474 "zone_append": false, 00:18:49.474 "compare": false, 00:18:49.474 "compare_and_write": false, 00:18:49.474 "abort": true, 00:18:49.474 "seek_hole": false, 00:18:49.474 "seek_data": false, 00:18:49.474 "copy": true, 00:18:49.474 "nvme_iov_md": false 00:18:49.474 }, 00:18:49.474 "memory_domains": [ 00:18:49.474 { 00:18:49.474 "dma_device_id": "system", 00:18:49.474 "dma_device_type": 1 00:18:49.474 }, 00:18:49.474 { 00:18:49.474 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:49.474 "dma_device_type": 2 00:18:49.474 } 00:18:49.474 ], 00:18:49.474 "driver_specific": {} 00:18:49.474 }' 00:18:49.474 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:49.474 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:49.733 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:49.733 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:49.733 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:49.733 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:49.733 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:49.733 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:49.733 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:49.733 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:18:49.733 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:49.992 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:49.992 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:49.992 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:49.992 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:49.992 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:49.992 "name": "BaseBdev4", 00:18:49.992 "aliases": [ 00:18:49.992 "376f2fc1-bca2-4d32-9cea-dd1db437be97" 00:18:49.992 ], 00:18:49.992 "product_name": "Malloc disk", 00:18:49.992 "block_size": 512, 00:18:49.992 "num_blocks": 65536, 00:18:49.992 "uuid": "376f2fc1-bca2-4d32-9cea-dd1db437be97", 00:18:49.992 "assigned_rate_limits": { 00:18:49.992 "rw_ios_per_sec": 0, 00:18:49.992 "rw_mbytes_per_sec": 0, 00:18:49.992 "r_mbytes_per_sec": 0, 00:18:49.992 "w_mbytes_per_sec": 0 00:18:49.992 }, 00:18:49.992 "claimed": true, 00:18:49.992 "claim_type": "exclusive_write", 00:18:49.992 "zoned": false, 00:18:49.992 "supported_io_types": { 00:18:49.992 "read": true, 00:18:49.992 "write": true, 00:18:49.992 "unmap": true, 00:18:49.992 "flush": true, 00:18:49.992 "reset": true, 00:18:49.992 "nvme_admin": false, 00:18:49.992 "nvme_io": false, 00:18:49.992 "nvme_io_md": false, 00:18:49.992 "write_zeroes": true, 00:18:49.992 "zcopy": true, 00:18:49.992 "get_zone_info": false, 00:18:49.992 "zone_management": false, 00:18:49.992 "zone_append": false, 00:18:49.992 "compare": false, 00:18:49.992 "compare_and_write": false, 00:18:49.992 "abort": true, 00:18:49.992 "seek_hole": false, 00:18:49.992 "seek_data": false, 00:18:49.992 "copy": true, 00:18:49.992 "nvme_iov_md": 
false 00:18:49.992 }, 00:18:49.992 "memory_domains": [ 00:18:49.992 { 00:18:49.992 "dma_device_id": "system", 00:18:49.992 "dma_device_type": 1 00:18:49.992 }, 00:18:49.992 { 00:18:49.992 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:49.992 "dma_device_type": 2 00:18:49.992 } 00:18:49.992 ], 00:18:49.992 "driver_specific": {} 00:18:49.992 }' 00:18:49.992 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:50.250 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:50.250 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:50.250 22:47:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:50.250 22:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:50.250 22:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:50.250 22:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:50.250 22:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:50.250 22:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:50.250 22:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:50.509 22:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:50.509 22:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:50.509 22:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:50.768 [2024-07-15 22:47:35.461093] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:50.768 [2024-07-15 22:47:35.461119] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:18:50.768 [2024-07-15 22:47:35.461169] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:50.768 [2024-07-15 22:47:35.461230] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:50.768 [2024-07-15 22:47:35.461242] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbe3040 name Existed_Raid, state offline 00:18:50.768 22:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2760245 00:18:50.768 22:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2760245 ']' 00:18:50.768 22:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2760245 00:18:50.768 22:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:18:50.768 22:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:50.768 22:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2760245 00:18:50.768 22:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:50.768 22:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:50.768 22:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2760245' 00:18:50.768 killing process with pid 2760245 00:18:50.768 22:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2760245 00:18:50.768 [2024-07-15 22:47:35.526502] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:50.768 22:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2760245 00:18:50.768 [2024-07-15 22:47:35.563380] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:51.027 
22:47:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:18:51.027 00:18:51.027 real 0m37.941s 00:18:51.027 user 1m9.978s 00:18:51.027 sys 0m6.485s 00:18:51.027 22:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:51.027 22:47:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:51.027 ************************************ 00:18:51.027 END TEST raid_state_function_test 00:18:51.027 ************************************ 00:18:51.027 22:47:35 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:51.027 22:47:35 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:18:51.027 22:47:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:51.027 22:47:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:51.027 22:47:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:51.027 ************************************ 00:18:51.027 START TEST raid_state_function_test_sb 00:18:51.027 ************************************ 00:18:51.027 22:47:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 true 00:18:51.027 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:51.028 22:47:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:51.028 
22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2765813 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2765813' 00:18:51.028 Process raid pid: 2765813 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2765813 /var/tmp/spdk-raid.sock 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2765813 ']' 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:51.028 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:51.028 22:47:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:51.028 [2024-07-15 22:47:35.919710] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:18:51.028 [2024-07-15 22:47:35.919776] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:51.287 [2024-07-15 22:47:36.065182] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:51.545 [2024-07-15 22:47:36.204152] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:51.545 [2024-07-15 22:47:36.275536] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:51.545 [2024-07-15 22:47:36.275573] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:52.111 22:47:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:52.111 22:47:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:18:52.111 22:47:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:52.369 [2024-07-15 22:47:37.155935] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:52.370 [2024-07-15 22:47:37.155975] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:52.370 [2024-07-15 22:47:37.155987] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:52.370 [2024-07-15 22:47:37.155998] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:52.370 [2024-07-15 22:47:37.156007] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:52.370 [2024-07-15 22:47:37.156018] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:52.370 [2024-07-15 22:47:37.156027] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:52.370 [2024-07-15 22:47:37.156039] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:52.370 22:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:52.370 22:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:52.370 22:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:52.370 22:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:52.370 22:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:52.370 22:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:52.370 22:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:52.370 22:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:52.370 22:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:52.370 22:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:52.370 22:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:18:52.370 22:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:52.628 22:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:52.628 "name": "Existed_Raid", 00:18:52.628 "uuid": "bbb2f71b-db39-4de1-b56b-2f7971551271", 00:18:52.628 "strip_size_kb": 64, 00:18:52.628 "state": "configuring", 00:18:52.628 "raid_level": "raid0", 00:18:52.628 "superblock": true, 00:18:52.628 "num_base_bdevs": 4, 00:18:52.628 "num_base_bdevs_discovered": 0, 00:18:52.628 "num_base_bdevs_operational": 4, 00:18:52.628 "base_bdevs_list": [ 00:18:52.628 { 00:18:52.628 "name": "BaseBdev1", 00:18:52.628 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:52.628 "is_configured": false, 00:18:52.628 "data_offset": 0, 00:18:52.628 "data_size": 0 00:18:52.628 }, 00:18:52.628 { 00:18:52.628 "name": "BaseBdev2", 00:18:52.628 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:52.628 "is_configured": false, 00:18:52.628 "data_offset": 0, 00:18:52.628 "data_size": 0 00:18:52.628 }, 00:18:52.628 { 00:18:52.628 "name": "BaseBdev3", 00:18:52.628 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:52.628 "is_configured": false, 00:18:52.628 "data_offset": 0, 00:18:52.628 "data_size": 0 00:18:52.628 }, 00:18:52.628 { 00:18:52.628 "name": "BaseBdev4", 00:18:52.628 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:52.628 "is_configured": false, 00:18:52.628 "data_offset": 0, 00:18:52.628 "data_size": 0 00:18:52.628 } 00:18:52.628 ] 00:18:52.628 }' 00:18:52.628 22:47:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:52.628 22:47:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:53.193 22:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete 
Existed_Raid 00:18:53.450 [2024-07-15 22:47:38.230634] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:53.450 [2024-07-15 22:47:38.230664] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x198caa0 name Existed_Raid, state configuring 00:18:53.450 22:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:53.709 [2024-07-15 22:47:38.411139] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:53.709 [2024-07-15 22:47:38.411163] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:53.709 [2024-07-15 22:47:38.411173] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:53.709 [2024-07-15 22:47:38.411185] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:53.709 [2024-07-15 22:47:38.411193] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:53.709 [2024-07-15 22:47:38.411204] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:53.709 [2024-07-15 22:47:38.411213] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:53.709 [2024-07-15 22:47:38.411224] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:53.709 22:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:53.709 [2024-07-15 22:47:38.601573] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:53.709 BaseBdev1 00:18:53.967 22:47:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:53.967 22:47:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:53.967 22:47:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:53.967 22:47:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:53.967 22:47:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:53.967 22:47:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:53.967 22:47:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:53.967 22:47:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:54.243 [ 00:18:54.243 { 00:18:54.243 "name": "BaseBdev1", 00:18:54.243 "aliases": [ 00:18:54.243 "b776235e-581a-435b-8ebf-0476258e037e" 00:18:54.243 ], 00:18:54.243 "product_name": "Malloc disk", 00:18:54.243 "block_size": 512, 00:18:54.243 "num_blocks": 65536, 00:18:54.243 "uuid": "b776235e-581a-435b-8ebf-0476258e037e", 00:18:54.243 "assigned_rate_limits": { 00:18:54.243 "rw_ios_per_sec": 0, 00:18:54.243 "rw_mbytes_per_sec": 0, 00:18:54.243 "r_mbytes_per_sec": 0, 00:18:54.243 "w_mbytes_per_sec": 0 00:18:54.243 }, 00:18:54.243 "claimed": true, 00:18:54.243 "claim_type": "exclusive_write", 00:18:54.243 "zoned": false, 00:18:54.243 "supported_io_types": { 00:18:54.243 "read": true, 00:18:54.243 "write": true, 00:18:54.243 "unmap": true, 00:18:54.243 "flush": true, 00:18:54.243 "reset": true, 00:18:54.243 "nvme_admin": false, 00:18:54.243 "nvme_io": false, 00:18:54.243 "nvme_io_md": 
false, 00:18:54.243 "write_zeroes": true, 00:18:54.243 "zcopy": true, 00:18:54.243 "get_zone_info": false, 00:18:54.243 "zone_management": false, 00:18:54.243 "zone_append": false, 00:18:54.243 "compare": false, 00:18:54.243 "compare_and_write": false, 00:18:54.243 "abort": true, 00:18:54.243 "seek_hole": false, 00:18:54.243 "seek_data": false, 00:18:54.243 "copy": true, 00:18:54.243 "nvme_iov_md": false 00:18:54.243 }, 00:18:54.243 "memory_domains": [ 00:18:54.243 { 00:18:54.243 "dma_device_id": "system", 00:18:54.243 "dma_device_type": 1 00:18:54.243 }, 00:18:54.243 { 00:18:54.243 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.243 "dma_device_type": 2 00:18:54.243 } 00:18:54.243 ], 00:18:54.243 "driver_specific": {} 00:18:54.243 } 00:18:54.243 ] 00:18:54.243 22:47:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:54.243 22:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:54.243 22:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:54.244 22:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:54.244 22:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:54.244 22:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:54.244 22:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:54.244 22:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:54.244 22:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:54.244 22:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:54.244 22:47:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:54.244 22:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:54.244 22:47:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:54.501 22:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:54.501 "name": "Existed_Raid", 00:18:54.501 "uuid": "1dd00d6d-a481-4b2f-aecf-99084c2c87db", 00:18:54.501 "strip_size_kb": 64, 00:18:54.501 "state": "configuring", 00:18:54.501 "raid_level": "raid0", 00:18:54.501 "superblock": true, 00:18:54.501 "num_base_bdevs": 4, 00:18:54.501 "num_base_bdevs_discovered": 1, 00:18:54.501 "num_base_bdevs_operational": 4, 00:18:54.501 "base_bdevs_list": [ 00:18:54.501 { 00:18:54.501 "name": "BaseBdev1", 00:18:54.501 "uuid": "b776235e-581a-435b-8ebf-0476258e037e", 00:18:54.501 "is_configured": true, 00:18:54.501 "data_offset": 2048, 00:18:54.501 "data_size": 63488 00:18:54.501 }, 00:18:54.501 { 00:18:54.501 "name": "BaseBdev2", 00:18:54.501 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:54.501 "is_configured": false, 00:18:54.501 "data_offset": 0, 00:18:54.501 "data_size": 0 00:18:54.501 }, 00:18:54.501 { 00:18:54.501 "name": "BaseBdev3", 00:18:54.501 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:54.501 "is_configured": false, 00:18:54.501 "data_offset": 0, 00:18:54.501 "data_size": 0 00:18:54.501 }, 00:18:54.501 { 00:18:54.501 "name": "BaseBdev4", 00:18:54.501 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:54.501 "is_configured": false, 00:18:54.501 "data_offset": 0, 00:18:54.501 "data_size": 0 00:18:54.501 } 00:18:54.501 ] 00:18:54.501 }' 00:18:54.501 22:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:54.501 22:47:39 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:55.066 22:47:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:55.324 [2024-07-15 22:47:40.061450] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:55.324 [2024-07-15 22:47:40.061491] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x198c310 name Existed_Raid, state configuring 00:18:55.324 22:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:55.887 [2024-07-15 22:47:40.562812] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:55.887 [2024-07-15 22:47:40.564318] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:55.887 [2024-07-15 22:47:40.564352] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:55.887 [2024-07-15 22:47:40.564363] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:55.887 [2024-07-15 22:47:40.564374] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:55.887 [2024-07-15 22:47:40.564383] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:55.887 [2024-07-15 22:47:40.564395] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:55.887 22:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:18:55.887 22:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:55.887 22:47:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:55.887 22:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:55.887 22:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:55.887 22:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:55.887 22:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:55.887 22:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:55.887 22:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:55.887 22:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:55.887 22:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:55.887 22:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:55.887 22:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:55.887 22:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:56.144 22:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:56.144 "name": "Existed_Raid", 00:18:56.144 "uuid": "28507820-993d-4d89-979b-45314c28ebef", 00:18:56.144 "strip_size_kb": 64, 00:18:56.144 "state": "configuring", 00:18:56.144 "raid_level": "raid0", 00:18:56.144 "superblock": true, 00:18:56.144 "num_base_bdevs": 4, 00:18:56.144 "num_base_bdevs_discovered": 1, 00:18:56.144 "num_base_bdevs_operational": 4, 00:18:56.144 
"base_bdevs_list": [ 00:18:56.144 { 00:18:56.144 "name": "BaseBdev1", 00:18:56.144 "uuid": "b776235e-581a-435b-8ebf-0476258e037e", 00:18:56.144 "is_configured": true, 00:18:56.144 "data_offset": 2048, 00:18:56.144 "data_size": 63488 00:18:56.144 }, 00:18:56.144 { 00:18:56.144 "name": "BaseBdev2", 00:18:56.144 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:56.144 "is_configured": false, 00:18:56.144 "data_offset": 0, 00:18:56.144 "data_size": 0 00:18:56.144 }, 00:18:56.144 { 00:18:56.144 "name": "BaseBdev3", 00:18:56.144 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:56.144 "is_configured": false, 00:18:56.144 "data_offset": 0, 00:18:56.144 "data_size": 0 00:18:56.144 }, 00:18:56.144 { 00:18:56.144 "name": "BaseBdev4", 00:18:56.144 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:56.144 "is_configured": false, 00:18:56.144 "data_offset": 0, 00:18:56.144 "data_size": 0 00:18:56.144 } 00:18:56.144 ] 00:18:56.144 }' 00:18:56.144 22:47:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:56.144 22:47:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:56.708 22:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:56.992 [2024-07-15 22:47:41.654321] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:56.992 BaseBdev2 00:18:56.992 22:47:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:56.992 22:47:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:56.992 22:47:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:56.992 22:47:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:56.992 
22:47:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:56.992 22:47:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:56.992 22:47:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:57.250 22:47:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:57.250 [ 00:18:57.250 { 00:18:57.250 "name": "BaseBdev2", 00:18:57.250 "aliases": [ 00:18:57.250 "475fa52f-04a7-456c-8ecb-5e4bb0df78a3" 00:18:57.250 ], 00:18:57.250 "product_name": "Malloc disk", 00:18:57.250 "block_size": 512, 00:18:57.250 "num_blocks": 65536, 00:18:57.250 "uuid": "475fa52f-04a7-456c-8ecb-5e4bb0df78a3", 00:18:57.250 "assigned_rate_limits": { 00:18:57.250 "rw_ios_per_sec": 0, 00:18:57.250 "rw_mbytes_per_sec": 0, 00:18:57.250 "r_mbytes_per_sec": 0, 00:18:57.250 "w_mbytes_per_sec": 0 00:18:57.250 }, 00:18:57.250 "claimed": true, 00:18:57.250 "claim_type": "exclusive_write", 00:18:57.250 "zoned": false, 00:18:57.250 "supported_io_types": { 00:18:57.250 "read": true, 00:18:57.250 "write": true, 00:18:57.250 "unmap": true, 00:18:57.250 "flush": true, 00:18:57.250 "reset": true, 00:18:57.250 "nvme_admin": false, 00:18:57.250 "nvme_io": false, 00:18:57.250 "nvme_io_md": false, 00:18:57.250 "write_zeroes": true, 00:18:57.250 "zcopy": true, 00:18:57.250 "get_zone_info": false, 00:18:57.250 "zone_management": false, 00:18:57.250 "zone_append": false, 00:18:57.250 "compare": false, 00:18:57.250 "compare_and_write": false, 00:18:57.250 "abort": true, 00:18:57.250 "seek_hole": false, 00:18:57.250 "seek_data": false, 00:18:57.250 "copy": true, 00:18:57.250 "nvme_iov_md": false 00:18:57.250 }, 00:18:57.250 
"memory_domains": [ 00:18:57.250 { 00:18:57.250 "dma_device_id": "system", 00:18:57.250 "dma_device_type": 1 00:18:57.250 }, 00:18:57.250 { 00:18:57.250 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:57.250 "dma_device_type": 2 00:18:57.250 } 00:18:57.250 ], 00:18:57.250 "driver_specific": {} 00:18:57.250 } 00:18:57.250 ] 00:18:57.508 22:47:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:57.508 22:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:57.508 22:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:57.508 22:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:57.508 22:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:57.508 22:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:57.508 22:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:57.508 22:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:57.508 22:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:57.508 22:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:57.508 22:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:57.508 22:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:57.508 22:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:57.508 22:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:57.508 22:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:57.508 22:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:57.508 "name": "Existed_Raid", 00:18:57.508 "uuid": "28507820-993d-4d89-979b-45314c28ebef", 00:18:57.508 "strip_size_kb": 64, 00:18:57.508 "state": "configuring", 00:18:57.508 "raid_level": "raid0", 00:18:57.508 "superblock": true, 00:18:57.508 "num_base_bdevs": 4, 00:18:57.508 "num_base_bdevs_discovered": 2, 00:18:57.508 "num_base_bdevs_operational": 4, 00:18:57.508 "base_bdevs_list": [ 00:18:57.508 { 00:18:57.508 "name": "BaseBdev1", 00:18:57.508 "uuid": "b776235e-581a-435b-8ebf-0476258e037e", 00:18:57.508 "is_configured": true, 00:18:57.508 "data_offset": 2048, 00:18:57.508 "data_size": 63488 00:18:57.508 }, 00:18:57.508 { 00:18:57.508 "name": "BaseBdev2", 00:18:57.508 "uuid": "475fa52f-04a7-456c-8ecb-5e4bb0df78a3", 00:18:57.508 "is_configured": true, 00:18:57.508 "data_offset": 2048, 00:18:57.508 "data_size": 63488 00:18:57.508 }, 00:18:57.508 { 00:18:57.508 "name": "BaseBdev3", 00:18:57.508 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:57.508 "is_configured": false, 00:18:57.508 "data_offset": 0, 00:18:57.508 "data_size": 0 00:18:57.508 }, 00:18:57.508 { 00:18:57.508 "name": "BaseBdev4", 00:18:57.508 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:57.508 "is_configured": false, 00:18:57.508 "data_offset": 0, 00:18:57.508 "data_size": 0 00:18:57.508 } 00:18:57.508 ] 00:18:57.508 }' 00:18:57.508 22:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:57.508 22:47:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:58.439 22:47:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:58.439 [2024-07-15 22:47:43.221889] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:58.439 BaseBdev3 00:18:58.439 22:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:58.439 22:47:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:58.439 22:47:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:58.439 22:47:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:58.439 22:47:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:58.439 22:47:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:58.439 22:47:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:58.696 22:47:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:58.953 [ 00:18:58.953 { 00:18:58.953 "name": "BaseBdev3", 00:18:58.953 "aliases": [ 00:18:58.953 "514f1821-008d-4af7-99c7-0a633b63c864" 00:18:58.953 ], 00:18:58.953 "product_name": "Malloc disk", 00:18:58.953 "block_size": 512, 00:18:58.953 "num_blocks": 65536, 00:18:58.953 "uuid": "514f1821-008d-4af7-99c7-0a633b63c864", 00:18:58.953 "assigned_rate_limits": { 00:18:58.953 "rw_ios_per_sec": 0, 00:18:58.953 "rw_mbytes_per_sec": 0, 00:18:58.953 "r_mbytes_per_sec": 0, 00:18:58.953 "w_mbytes_per_sec": 0 00:18:58.953 }, 00:18:58.953 "claimed": true, 00:18:58.953 "claim_type": "exclusive_write", 00:18:58.953 "zoned": false, 00:18:58.953 "supported_io_types": { 
00:18:58.953 "read": true, 00:18:58.953 "write": true, 00:18:58.953 "unmap": true, 00:18:58.953 "flush": true, 00:18:58.953 "reset": true, 00:18:58.953 "nvme_admin": false, 00:18:58.953 "nvme_io": false, 00:18:58.953 "nvme_io_md": false, 00:18:58.953 "write_zeroes": true, 00:18:58.953 "zcopy": true, 00:18:58.953 "get_zone_info": false, 00:18:58.953 "zone_management": false, 00:18:58.953 "zone_append": false, 00:18:58.953 "compare": false, 00:18:58.953 "compare_and_write": false, 00:18:58.953 "abort": true, 00:18:58.953 "seek_hole": false, 00:18:58.953 "seek_data": false, 00:18:58.953 "copy": true, 00:18:58.953 "nvme_iov_md": false 00:18:58.953 }, 00:18:58.953 "memory_domains": [ 00:18:58.953 { 00:18:58.953 "dma_device_id": "system", 00:18:58.953 "dma_device_type": 1 00:18:58.953 }, 00:18:58.953 { 00:18:58.953 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:58.953 "dma_device_type": 2 00:18:58.953 } 00:18:58.953 ], 00:18:58.953 "driver_specific": {} 00:18:58.953 } 00:18:58.953 ] 00:18:58.953 22:47:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:58.953 22:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:58.953 22:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:58.953 22:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:58.953 22:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:58.953 22:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:58.953 22:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:58.953 22:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:58.953 22:47:43 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:58.953 22:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:58.953 22:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:58.953 22:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:58.953 22:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:58.953 22:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:58.953 22:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:59.210 22:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:59.210 "name": "Existed_Raid", 00:18:59.210 "uuid": "28507820-993d-4d89-979b-45314c28ebef", 00:18:59.210 "strip_size_kb": 64, 00:18:59.210 "state": "configuring", 00:18:59.210 "raid_level": "raid0", 00:18:59.210 "superblock": true, 00:18:59.210 "num_base_bdevs": 4, 00:18:59.210 "num_base_bdevs_discovered": 3, 00:18:59.210 "num_base_bdevs_operational": 4, 00:18:59.210 "base_bdevs_list": [ 00:18:59.210 { 00:18:59.210 "name": "BaseBdev1", 00:18:59.210 "uuid": "b776235e-581a-435b-8ebf-0476258e037e", 00:18:59.210 "is_configured": true, 00:18:59.210 "data_offset": 2048, 00:18:59.210 "data_size": 63488 00:18:59.210 }, 00:18:59.210 { 00:18:59.210 "name": "BaseBdev2", 00:18:59.210 "uuid": "475fa52f-04a7-456c-8ecb-5e4bb0df78a3", 00:18:59.210 "is_configured": true, 00:18:59.210 "data_offset": 2048, 00:18:59.210 "data_size": 63488 00:18:59.210 }, 00:18:59.210 { 00:18:59.210 "name": "BaseBdev3", 00:18:59.210 "uuid": "514f1821-008d-4af7-99c7-0a633b63c864", 00:18:59.210 "is_configured": true, 00:18:59.210 "data_offset": 2048, 00:18:59.210 
"data_size": 63488 00:18:59.210 }, 00:18:59.210 { 00:18:59.210 "name": "BaseBdev4", 00:18:59.210 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:59.210 "is_configured": false, 00:18:59.210 "data_offset": 0, 00:18:59.210 "data_size": 0 00:18:59.210 } 00:18:59.210 ] 00:18:59.210 }' 00:18:59.210 22:47:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:59.210 22:47:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:59.774 22:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:00.032 [2024-07-15 22:47:44.801435] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:00.032 [2024-07-15 22:47:44.801608] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x198d350 00:19:00.032 [2024-07-15 22:47:44.801626] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:00.032 [2024-07-15 22:47:44.801800] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x198d020 00:19:00.032 [2024-07-15 22:47:44.801915] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x198d350 00:19:00.032 [2024-07-15 22:47:44.801935] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x198d350 00:19:00.032 [2024-07-15 22:47:44.802024] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:00.032 BaseBdev4 00:19:00.032 22:47:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:00.032 22:47:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:00.032 22:47:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:00.032 22:47:44 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:00.032 22:47:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:00.032 22:47:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:00.032 22:47:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:00.289 22:47:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:00.547 [ 00:19:00.547 { 00:19:00.547 "name": "BaseBdev4", 00:19:00.547 "aliases": [ 00:19:00.547 "a6290f0d-5e8c-492a-8197-8b4dddd69e7c" 00:19:00.547 ], 00:19:00.547 "product_name": "Malloc disk", 00:19:00.547 "block_size": 512, 00:19:00.547 "num_blocks": 65536, 00:19:00.547 "uuid": "a6290f0d-5e8c-492a-8197-8b4dddd69e7c", 00:19:00.547 "assigned_rate_limits": { 00:19:00.547 "rw_ios_per_sec": 0, 00:19:00.547 "rw_mbytes_per_sec": 0, 00:19:00.547 "r_mbytes_per_sec": 0, 00:19:00.547 "w_mbytes_per_sec": 0 00:19:00.547 }, 00:19:00.547 "claimed": true, 00:19:00.547 "claim_type": "exclusive_write", 00:19:00.547 "zoned": false, 00:19:00.547 "supported_io_types": { 00:19:00.547 "read": true, 00:19:00.547 "write": true, 00:19:00.547 "unmap": true, 00:19:00.547 "flush": true, 00:19:00.547 "reset": true, 00:19:00.547 "nvme_admin": false, 00:19:00.547 "nvme_io": false, 00:19:00.547 "nvme_io_md": false, 00:19:00.547 "write_zeroes": true, 00:19:00.547 "zcopy": true, 00:19:00.547 "get_zone_info": false, 00:19:00.547 "zone_management": false, 00:19:00.547 "zone_append": false, 00:19:00.547 "compare": false, 00:19:00.547 "compare_and_write": false, 00:19:00.547 "abort": true, 00:19:00.547 "seek_hole": false, 00:19:00.547 "seek_data": false, 
00:19:00.547 "copy": true, 00:19:00.547 "nvme_iov_md": false 00:19:00.547 }, 00:19:00.547 "memory_domains": [ 00:19:00.547 { 00:19:00.547 "dma_device_id": "system", 00:19:00.547 "dma_device_type": 1 00:19:00.547 }, 00:19:00.547 { 00:19:00.547 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:00.547 "dma_device_type": 2 00:19:00.547 } 00:19:00.547 ], 00:19:00.547 "driver_specific": {} 00:19:00.547 } 00:19:00.547 ] 00:19:00.547 22:47:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:00.547 22:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:00.547 22:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:00.547 22:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:19:00.547 22:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:00.547 22:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:00.547 22:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:00.547 22:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:00.547 22:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:00.547 22:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:00.547 22:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:00.547 22:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:00.547 22:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:00.547 22:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:00.547 22:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:00.805 22:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:00.805 "name": "Existed_Raid", 00:19:00.805 "uuid": "28507820-993d-4d89-979b-45314c28ebef", 00:19:00.805 "strip_size_kb": 64, 00:19:00.805 "state": "online", 00:19:00.805 "raid_level": "raid0", 00:19:00.805 "superblock": true, 00:19:00.805 "num_base_bdevs": 4, 00:19:00.805 "num_base_bdevs_discovered": 4, 00:19:00.805 "num_base_bdevs_operational": 4, 00:19:00.805 "base_bdevs_list": [ 00:19:00.805 { 00:19:00.805 "name": "BaseBdev1", 00:19:00.805 "uuid": "b776235e-581a-435b-8ebf-0476258e037e", 00:19:00.805 "is_configured": true, 00:19:00.805 "data_offset": 2048, 00:19:00.805 "data_size": 63488 00:19:00.805 }, 00:19:00.805 { 00:19:00.805 "name": "BaseBdev2", 00:19:00.805 "uuid": "475fa52f-04a7-456c-8ecb-5e4bb0df78a3", 00:19:00.805 "is_configured": true, 00:19:00.805 "data_offset": 2048, 00:19:00.805 "data_size": 63488 00:19:00.805 }, 00:19:00.805 { 00:19:00.805 "name": "BaseBdev3", 00:19:00.805 "uuid": "514f1821-008d-4af7-99c7-0a633b63c864", 00:19:00.805 "is_configured": true, 00:19:00.805 "data_offset": 2048, 00:19:00.805 "data_size": 63488 00:19:00.805 }, 00:19:00.805 { 00:19:00.805 "name": "BaseBdev4", 00:19:00.805 "uuid": "a6290f0d-5e8c-492a-8197-8b4dddd69e7c", 00:19:00.805 "is_configured": true, 00:19:00.805 "data_offset": 2048, 00:19:00.805 "data_size": 63488 00:19:00.805 } 00:19:00.805 ] 00:19:00.805 }' 00:19:00.805 22:47:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:00.805 22:47:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:01.371 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # 
verify_raid_bdev_properties Existed_Raid 00:19:01.371 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:01.371 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:01.371 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:01.371 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:01.371 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:01.371 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:01.371 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:01.630 [2024-07-15 22:47:46.365918] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:01.630 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:01.630 "name": "Existed_Raid", 00:19:01.630 "aliases": [ 00:19:01.630 "28507820-993d-4d89-979b-45314c28ebef" 00:19:01.630 ], 00:19:01.630 "product_name": "Raid Volume", 00:19:01.630 "block_size": 512, 00:19:01.630 "num_blocks": 253952, 00:19:01.630 "uuid": "28507820-993d-4d89-979b-45314c28ebef", 00:19:01.630 "assigned_rate_limits": { 00:19:01.630 "rw_ios_per_sec": 0, 00:19:01.630 "rw_mbytes_per_sec": 0, 00:19:01.630 "r_mbytes_per_sec": 0, 00:19:01.630 "w_mbytes_per_sec": 0 00:19:01.630 }, 00:19:01.630 "claimed": false, 00:19:01.630 "zoned": false, 00:19:01.630 "supported_io_types": { 00:19:01.630 "read": true, 00:19:01.630 "write": true, 00:19:01.630 "unmap": true, 00:19:01.630 "flush": true, 00:19:01.630 "reset": true, 00:19:01.630 "nvme_admin": false, 00:19:01.630 "nvme_io": false, 00:19:01.630 "nvme_io_md": false, 00:19:01.630 
"write_zeroes": true, 00:19:01.630 "zcopy": false, 00:19:01.630 "get_zone_info": false, 00:19:01.630 "zone_management": false, 00:19:01.630 "zone_append": false, 00:19:01.630 "compare": false, 00:19:01.630 "compare_and_write": false, 00:19:01.630 "abort": false, 00:19:01.630 "seek_hole": false, 00:19:01.630 "seek_data": false, 00:19:01.630 "copy": false, 00:19:01.630 "nvme_iov_md": false 00:19:01.630 }, 00:19:01.630 "memory_domains": [ 00:19:01.630 { 00:19:01.630 "dma_device_id": "system", 00:19:01.630 "dma_device_type": 1 00:19:01.630 }, 00:19:01.630 { 00:19:01.630 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:01.630 "dma_device_type": 2 00:19:01.630 }, 00:19:01.630 { 00:19:01.630 "dma_device_id": "system", 00:19:01.630 "dma_device_type": 1 00:19:01.630 }, 00:19:01.630 { 00:19:01.630 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:01.630 "dma_device_type": 2 00:19:01.630 }, 00:19:01.630 { 00:19:01.630 "dma_device_id": "system", 00:19:01.630 "dma_device_type": 1 00:19:01.630 }, 00:19:01.630 { 00:19:01.630 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:01.630 "dma_device_type": 2 00:19:01.630 }, 00:19:01.630 { 00:19:01.630 "dma_device_id": "system", 00:19:01.630 "dma_device_type": 1 00:19:01.630 }, 00:19:01.630 { 00:19:01.630 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:01.630 "dma_device_type": 2 00:19:01.630 } 00:19:01.630 ], 00:19:01.630 "driver_specific": { 00:19:01.630 "raid": { 00:19:01.630 "uuid": "28507820-993d-4d89-979b-45314c28ebef", 00:19:01.630 "strip_size_kb": 64, 00:19:01.630 "state": "online", 00:19:01.630 "raid_level": "raid0", 00:19:01.630 "superblock": true, 00:19:01.630 "num_base_bdevs": 4, 00:19:01.630 "num_base_bdevs_discovered": 4, 00:19:01.630 "num_base_bdevs_operational": 4, 00:19:01.630 "base_bdevs_list": [ 00:19:01.630 { 00:19:01.630 "name": "BaseBdev1", 00:19:01.630 "uuid": "b776235e-581a-435b-8ebf-0476258e037e", 00:19:01.630 "is_configured": true, 00:19:01.630 "data_offset": 2048, 00:19:01.630 "data_size": 63488 00:19:01.630 }, 
00:19:01.630 { 00:19:01.630 "name": "BaseBdev2", 00:19:01.630 "uuid": "475fa52f-04a7-456c-8ecb-5e4bb0df78a3", 00:19:01.630 "is_configured": true, 00:19:01.630 "data_offset": 2048, 00:19:01.630 "data_size": 63488 00:19:01.630 }, 00:19:01.630 { 00:19:01.630 "name": "BaseBdev3", 00:19:01.630 "uuid": "514f1821-008d-4af7-99c7-0a633b63c864", 00:19:01.630 "is_configured": true, 00:19:01.630 "data_offset": 2048, 00:19:01.630 "data_size": 63488 00:19:01.630 }, 00:19:01.630 { 00:19:01.630 "name": "BaseBdev4", 00:19:01.630 "uuid": "a6290f0d-5e8c-492a-8197-8b4dddd69e7c", 00:19:01.630 "is_configured": true, 00:19:01.630 "data_offset": 2048, 00:19:01.630 "data_size": 63488 00:19:01.630 } 00:19:01.630 ] 00:19:01.630 } 00:19:01.630 } 00:19:01.630 }' 00:19:01.630 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:01.630 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:01.630 BaseBdev2 00:19:01.630 BaseBdev3 00:19:01.630 BaseBdev4' 00:19:01.630 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:01.631 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:01.631 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:01.889 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:01.889 "name": "BaseBdev1", 00:19:01.889 "aliases": [ 00:19:01.889 "b776235e-581a-435b-8ebf-0476258e037e" 00:19:01.889 ], 00:19:01.889 "product_name": "Malloc disk", 00:19:01.889 "block_size": 512, 00:19:01.889 "num_blocks": 65536, 00:19:01.889 "uuid": "b776235e-581a-435b-8ebf-0476258e037e", 00:19:01.889 "assigned_rate_limits": { 00:19:01.889 
"rw_ios_per_sec": 0, 00:19:01.889 "rw_mbytes_per_sec": 0, 00:19:01.889 "r_mbytes_per_sec": 0, 00:19:01.889 "w_mbytes_per_sec": 0 00:19:01.889 }, 00:19:01.889 "claimed": true, 00:19:01.889 "claim_type": "exclusive_write", 00:19:01.889 "zoned": false, 00:19:01.889 "supported_io_types": { 00:19:01.889 "read": true, 00:19:01.889 "write": true, 00:19:01.889 "unmap": true, 00:19:01.889 "flush": true, 00:19:01.889 "reset": true, 00:19:01.889 "nvme_admin": false, 00:19:01.889 "nvme_io": false, 00:19:01.889 "nvme_io_md": false, 00:19:01.889 "write_zeroes": true, 00:19:01.889 "zcopy": true, 00:19:01.889 "get_zone_info": false, 00:19:01.889 "zone_management": false, 00:19:01.889 "zone_append": false, 00:19:01.889 "compare": false, 00:19:01.889 "compare_and_write": false, 00:19:01.889 "abort": true, 00:19:01.889 "seek_hole": false, 00:19:01.889 "seek_data": false, 00:19:01.889 "copy": true, 00:19:01.889 "nvme_iov_md": false 00:19:01.889 }, 00:19:01.889 "memory_domains": [ 00:19:01.889 { 00:19:01.889 "dma_device_id": "system", 00:19:01.889 "dma_device_type": 1 00:19:01.889 }, 00:19:01.889 { 00:19:01.889 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:01.889 "dma_device_type": 2 00:19:01.889 } 00:19:01.889 ], 00:19:01.889 "driver_specific": {} 00:19:01.889 }' 00:19:01.889 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:01.889 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:01.889 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:01.889 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:02.147 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:02.147 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:02.147 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:19:02.147 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:02.147 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:02.147 22:47:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:02.147 22:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:02.147 22:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:02.147 22:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:02.147 22:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:02.147 22:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:02.405 22:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:02.405 "name": "BaseBdev2", 00:19:02.405 "aliases": [ 00:19:02.405 "475fa52f-04a7-456c-8ecb-5e4bb0df78a3" 00:19:02.405 ], 00:19:02.405 "product_name": "Malloc disk", 00:19:02.405 "block_size": 512, 00:19:02.405 "num_blocks": 65536, 00:19:02.405 "uuid": "475fa52f-04a7-456c-8ecb-5e4bb0df78a3", 00:19:02.405 "assigned_rate_limits": { 00:19:02.405 "rw_ios_per_sec": 0, 00:19:02.405 "rw_mbytes_per_sec": 0, 00:19:02.405 "r_mbytes_per_sec": 0, 00:19:02.405 "w_mbytes_per_sec": 0 00:19:02.405 }, 00:19:02.405 "claimed": true, 00:19:02.405 "claim_type": "exclusive_write", 00:19:02.405 "zoned": false, 00:19:02.405 "supported_io_types": { 00:19:02.405 "read": true, 00:19:02.405 "write": true, 00:19:02.405 "unmap": true, 00:19:02.405 "flush": true, 00:19:02.405 "reset": true, 00:19:02.405 "nvme_admin": false, 00:19:02.405 "nvme_io": false, 00:19:02.405 "nvme_io_md": false, 00:19:02.405 "write_zeroes": true, 
00:19:02.405 "zcopy": true, 00:19:02.405 "get_zone_info": false, 00:19:02.405 "zone_management": false, 00:19:02.405 "zone_append": false, 00:19:02.405 "compare": false, 00:19:02.405 "compare_and_write": false, 00:19:02.405 "abort": true, 00:19:02.405 "seek_hole": false, 00:19:02.405 "seek_data": false, 00:19:02.406 "copy": true, 00:19:02.406 "nvme_iov_md": false 00:19:02.406 }, 00:19:02.406 "memory_domains": [ 00:19:02.406 { 00:19:02.406 "dma_device_id": "system", 00:19:02.406 "dma_device_type": 1 00:19:02.406 }, 00:19:02.406 { 00:19:02.406 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:02.406 "dma_device_type": 2 00:19:02.406 } 00:19:02.406 ], 00:19:02.406 "driver_specific": {} 00:19:02.406 }' 00:19:02.406 22:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:02.664 22:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:02.664 22:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:02.664 22:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:02.664 22:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:02.664 22:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:02.664 22:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:02.664 22:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:02.924 22:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:02.924 22:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:02.924 22:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:02.924 22:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:02.924 22:47:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:02.924 22:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:02.924 22:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:03.184 22:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:03.184 "name": "BaseBdev3", 00:19:03.184 "aliases": [ 00:19:03.184 "514f1821-008d-4af7-99c7-0a633b63c864" 00:19:03.184 ], 00:19:03.184 "product_name": "Malloc disk", 00:19:03.184 "block_size": 512, 00:19:03.184 "num_blocks": 65536, 00:19:03.184 "uuid": "514f1821-008d-4af7-99c7-0a633b63c864", 00:19:03.184 "assigned_rate_limits": { 00:19:03.184 "rw_ios_per_sec": 0, 00:19:03.184 "rw_mbytes_per_sec": 0, 00:19:03.184 "r_mbytes_per_sec": 0, 00:19:03.184 "w_mbytes_per_sec": 0 00:19:03.184 }, 00:19:03.184 "claimed": true, 00:19:03.184 "claim_type": "exclusive_write", 00:19:03.184 "zoned": false, 00:19:03.184 "supported_io_types": { 00:19:03.184 "read": true, 00:19:03.184 "write": true, 00:19:03.184 "unmap": true, 00:19:03.184 "flush": true, 00:19:03.184 "reset": true, 00:19:03.184 "nvme_admin": false, 00:19:03.184 "nvme_io": false, 00:19:03.184 "nvme_io_md": false, 00:19:03.184 "write_zeroes": true, 00:19:03.184 "zcopy": true, 00:19:03.184 "get_zone_info": false, 00:19:03.184 "zone_management": false, 00:19:03.184 "zone_append": false, 00:19:03.184 "compare": false, 00:19:03.184 "compare_and_write": false, 00:19:03.184 "abort": true, 00:19:03.184 "seek_hole": false, 00:19:03.184 "seek_data": false, 00:19:03.184 "copy": true, 00:19:03.184 "nvme_iov_md": false 00:19:03.184 }, 00:19:03.184 "memory_domains": [ 00:19:03.184 { 00:19:03.184 "dma_device_id": "system", 00:19:03.184 "dma_device_type": 1 00:19:03.184 }, 00:19:03.184 { 00:19:03.184 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:19:03.184 "dma_device_type": 2 00:19:03.184 } 00:19:03.184 ], 00:19:03.184 "driver_specific": {} 00:19:03.184 }' 00:19:03.184 22:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:03.184 22:47:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:03.184 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:03.184 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:03.184 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:03.443 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:03.443 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:03.443 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:03.443 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:03.443 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:03.443 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:03.443 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:03.443 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:03.443 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:03.443 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:03.703 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:03.703 "name": "BaseBdev4", 00:19:03.703 
"aliases": [ 00:19:03.703 "a6290f0d-5e8c-492a-8197-8b4dddd69e7c" 00:19:03.703 ], 00:19:03.703 "product_name": "Malloc disk", 00:19:03.703 "block_size": 512, 00:19:03.703 "num_blocks": 65536, 00:19:03.703 "uuid": "a6290f0d-5e8c-492a-8197-8b4dddd69e7c", 00:19:03.703 "assigned_rate_limits": { 00:19:03.703 "rw_ios_per_sec": 0, 00:19:03.703 "rw_mbytes_per_sec": 0, 00:19:03.703 "r_mbytes_per_sec": 0, 00:19:03.703 "w_mbytes_per_sec": 0 00:19:03.703 }, 00:19:03.703 "claimed": true, 00:19:03.703 "claim_type": "exclusive_write", 00:19:03.703 "zoned": false, 00:19:03.703 "supported_io_types": { 00:19:03.703 "read": true, 00:19:03.703 "write": true, 00:19:03.703 "unmap": true, 00:19:03.703 "flush": true, 00:19:03.703 "reset": true, 00:19:03.703 "nvme_admin": false, 00:19:03.703 "nvme_io": false, 00:19:03.703 "nvme_io_md": false, 00:19:03.703 "write_zeroes": true, 00:19:03.703 "zcopy": true, 00:19:03.703 "get_zone_info": false, 00:19:03.703 "zone_management": false, 00:19:03.703 "zone_append": false, 00:19:03.703 "compare": false, 00:19:03.703 "compare_and_write": false, 00:19:03.703 "abort": true, 00:19:03.703 "seek_hole": false, 00:19:03.703 "seek_data": false, 00:19:03.703 "copy": true, 00:19:03.703 "nvme_iov_md": false 00:19:03.703 }, 00:19:03.703 "memory_domains": [ 00:19:03.703 { 00:19:03.703 "dma_device_id": "system", 00:19:03.703 "dma_device_type": 1 00:19:03.703 }, 00:19:03.703 { 00:19:03.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:03.703 "dma_device_type": 2 00:19:03.703 } 00:19:03.703 ], 00:19:03.703 "driver_specific": {} 00:19:03.703 }' 00:19:03.703 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:03.703 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:03.994 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:03.994 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:19:03.994 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:03.994 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:03.994 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:03.994 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:03.994 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:03.994 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:03.994 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:04.252 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:04.252 22:47:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:04.252 [2024-07-15 22:47:49.157053] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:04.252 [2024-07-15 22:47:49.157079] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:04.252 [2024-07-15 22:47:49.157126] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:04.511 22:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:04.511 22:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:19:04.511 22:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:04.511 22:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:19:04.511 22:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:19:04.511 22:47:49 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:19:04.511 22:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:04.511 22:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:19:04.511 22:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:04.511 22:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:04.511 22:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:04.511 22:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:04.512 22:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:04.512 22:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:04.512 22:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:04.512 22:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:04.512 22:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:04.770 22:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:04.770 "name": "Existed_Raid", 00:19:04.770 "uuid": "28507820-993d-4d89-979b-45314c28ebef", 00:19:04.770 "strip_size_kb": 64, 00:19:04.770 "state": "offline", 00:19:04.770 "raid_level": "raid0", 00:19:04.770 "superblock": true, 00:19:04.770 "num_base_bdevs": 4, 00:19:04.770 "num_base_bdevs_discovered": 3, 00:19:04.770 "num_base_bdevs_operational": 3, 00:19:04.770 "base_bdevs_list": [ 
00:19:04.770 { 00:19:04.770 "name": null, 00:19:04.770 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:04.770 "is_configured": false, 00:19:04.770 "data_offset": 2048, 00:19:04.770 "data_size": 63488 00:19:04.770 }, 00:19:04.770 { 00:19:04.770 "name": "BaseBdev2", 00:19:04.770 "uuid": "475fa52f-04a7-456c-8ecb-5e4bb0df78a3", 00:19:04.770 "is_configured": true, 00:19:04.770 "data_offset": 2048, 00:19:04.770 "data_size": 63488 00:19:04.770 }, 00:19:04.770 { 00:19:04.770 "name": "BaseBdev3", 00:19:04.770 "uuid": "514f1821-008d-4af7-99c7-0a633b63c864", 00:19:04.770 "is_configured": true, 00:19:04.770 "data_offset": 2048, 00:19:04.770 "data_size": 63488 00:19:04.770 }, 00:19:04.770 { 00:19:04.770 "name": "BaseBdev4", 00:19:04.770 "uuid": "a6290f0d-5e8c-492a-8197-8b4dddd69e7c", 00:19:04.770 "is_configured": true, 00:19:04.770 "data_offset": 2048, 00:19:04.770 "data_size": 63488 00:19:04.770 } 00:19:04.770 ] 00:19:04.770 }' 00:19:04.771 22:47:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:04.771 22:47:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:05.337 22:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:05.337 22:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:05.337 22:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.337 22:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:05.595 22:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:05.595 22:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:05.595 22:47:50 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:05.853 [2024-07-15 22:47:50.529825] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:05.853 22:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:05.853 22:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:05.853 22:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.853 22:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:06.111 22:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:06.111 22:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:06.111 22:47:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:06.369 [2024-07-15 22:47:51.035631] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:06.369 22:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:06.369 22:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:06.369 22:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:06.369 22:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:06.628 22:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:06.628 
22:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:06.628 22:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:06.628 [2024-07-15 22:47:51.528818] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:06.628 [2024-07-15 22:47:51.528862] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x198d350 name Existed_Raid, state offline 00:19:06.886 22:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:06.886 22:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:06.886 22:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:06.886 22:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.145 22:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:07.145 22:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:07.145 22:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:19:07.145 22:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:19:07.145 22:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:07.145 22:47:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:07.403 BaseBdev2 00:19:07.662 22:47:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev 
BaseBdev2 00:19:07.662 22:47:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:07.662 22:47:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:07.662 22:47:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:07.662 22:47:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:07.662 22:47:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:07.662 22:47:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:07.662 22:47:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:07.920 [ 00:19:07.920 { 00:19:07.920 "name": "BaseBdev2", 00:19:07.920 "aliases": [ 00:19:07.920 "8f61b86a-6132-4cba-999e-829d22d0ed68" 00:19:07.920 ], 00:19:07.920 "product_name": "Malloc disk", 00:19:07.920 "block_size": 512, 00:19:07.920 "num_blocks": 65536, 00:19:07.920 "uuid": "8f61b86a-6132-4cba-999e-829d22d0ed68", 00:19:07.920 "assigned_rate_limits": { 00:19:07.920 "rw_ios_per_sec": 0, 00:19:07.920 "rw_mbytes_per_sec": 0, 00:19:07.920 "r_mbytes_per_sec": 0, 00:19:07.920 "w_mbytes_per_sec": 0 00:19:07.920 }, 00:19:07.920 "claimed": false, 00:19:07.920 "zoned": false, 00:19:07.920 "supported_io_types": { 00:19:07.920 "read": true, 00:19:07.920 "write": true, 00:19:07.920 "unmap": true, 00:19:07.920 "flush": true, 00:19:07.920 "reset": true, 00:19:07.920 "nvme_admin": false, 00:19:07.920 "nvme_io": false, 00:19:07.920 "nvme_io_md": false, 00:19:07.920 "write_zeroes": true, 00:19:07.920 "zcopy": true, 00:19:07.920 "get_zone_info": false, 00:19:07.921 
"zone_management": false, 00:19:07.921 "zone_append": false, 00:19:07.921 "compare": false, 00:19:07.921 "compare_and_write": false, 00:19:07.921 "abort": true, 00:19:07.921 "seek_hole": false, 00:19:07.921 "seek_data": false, 00:19:07.921 "copy": true, 00:19:07.921 "nvme_iov_md": false 00:19:07.921 }, 00:19:07.921 "memory_domains": [ 00:19:07.921 { 00:19:07.921 "dma_device_id": "system", 00:19:07.921 "dma_device_type": 1 00:19:07.921 }, 00:19:07.921 { 00:19:07.921 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:07.921 "dma_device_type": 2 00:19:07.921 } 00:19:07.921 ], 00:19:07.921 "driver_specific": {} 00:19:07.921 } 00:19:07.921 ] 00:19:07.921 22:47:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:07.921 22:47:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:07.921 22:47:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:07.921 22:47:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:08.489 BaseBdev3 00:19:08.489 22:47:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:08.489 22:47:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:08.489 22:47:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:08.489 22:47:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:08.489 22:47:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:08.489 22:47:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:08.489 22:47:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:08.748 22:47:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:09.005 [ 00:19:09.005 { 00:19:09.005 "name": "BaseBdev3", 00:19:09.005 "aliases": [ 00:19:09.005 "e7ae74c5-66be-4eaa-a632-9c1295830def" 00:19:09.005 ], 00:19:09.005 "product_name": "Malloc disk", 00:19:09.005 "block_size": 512, 00:19:09.005 "num_blocks": 65536, 00:19:09.005 "uuid": "e7ae74c5-66be-4eaa-a632-9c1295830def", 00:19:09.005 "assigned_rate_limits": { 00:19:09.005 "rw_ios_per_sec": 0, 00:19:09.005 "rw_mbytes_per_sec": 0, 00:19:09.005 "r_mbytes_per_sec": 0, 00:19:09.005 "w_mbytes_per_sec": 0 00:19:09.005 }, 00:19:09.005 "claimed": false, 00:19:09.005 "zoned": false, 00:19:09.005 "supported_io_types": { 00:19:09.005 "read": true, 00:19:09.005 "write": true, 00:19:09.005 "unmap": true, 00:19:09.005 "flush": true, 00:19:09.005 "reset": true, 00:19:09.005 "nvme_admin": false, 00:19:09.005 "nvme_io": false, 00:19:09.005 "nvme_io_md": false, 00:19:09.005 "write_zeroes": true, 00:19:09.005 "zcopy": true, 00:19:09.005 "get_zone_info": false, 00:19:09.005 "zone_management": false, 00:19:09.005 "zone_append": false, 00:19:09.005 "compare": false, 00:19:09.005 "compare_and_write": false, 00:19:09.005 "abort": true, 00:19:09.005 "seek_hole": false, 00:19:09.005 "seek_data": false, 00:19:09.005 "copy": true, 00:19:09.005 "nvme_iov_md": false 00:19:09.005 }, 00:19:09.005 "memory_domains": [ 00:19:09.005 { 00:19:09.005 "dma_device_id": "system", 00:19:09.005 "dma_device_type": 1 00:19:09.005 }, 00:19:09.005 { 00:19:09.005 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:09.005 "dma_device_type": 2 00:19:09.005 } 00:19:09.005 ], 00:19:09.005 "driver_specific": {} 00:19:09.005 } 00:19:09.005 ] 00:19:09.005 22:47:53 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:09.005 22:47:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:09.005 22:47:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:09.005 22:47:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:09.262 BaseBdev4 00:19:09.262 22:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:19:09.262 22:47:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:09.262 22:47:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:09.262 22:47:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:09.262 22:47:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:09.262 22:47:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:09.262 22:47:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:09.519 22:47:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:09.776 [ 00:19:09.776 { 00:19:09.776 "name": "BaseBdev4", 00:19:09.776 "aliases": [ 00:19:09.776 "29bc0733-e00f-4ee5-bd0b-ee324a4aff2d" 00:19:09.776 ], 00:19:09.776 "product_name": "Malloc disk", 00:19:09.776 "block_size": 512, 00:19:09.776 "num_blocks": 65536, 00:19:09.776 "uuid": "29bc0733-e00f-4ee5-bd0b-ee324a4aff2d", 
00:19:09.776 "assigned_rate_limits": { 00:19:09.776 "rw_ios_per_sec": 0, 00:19:09.776 "rw_mbytes_per_sec": 0, 00:19:09.776 "r_mbytes_per_sec": 0, 00:19:09.776 "w_mbytes_per_sec": 0 00:19:09.776 }, 00:19:09.776 "claimed": false, 00:19:09.776 "zoned": false, 00:19:09.776 "supported_io_types": { 00:19:09.776 "read": true, 00:19:09.776 "write": true, 00:19:09.776 "unmap": true, 00:19:09.776 "flush": true, 00:19:09.776 "reset": true, 00:19:09.776 "nvme_admin": false, 00:19:09.776 "nvme_io": false, 00:19:09.776 "nvme_io_md": false, 00:19:09.776 "write_zeroes": true, 00:19:09.776 "zcopy": true, 00:19:09.776 "get_zone_info": false, 00:19:09.776 "zone_management": false, 00:19:09.776 "zone_append": false, 00:19:09.776 "compare": false, 00:19:09.776 "compare_and_write": false, 00:19:09.776 "abort": true, 00:19:09.776 "seek_hole": false, 00:19:09.776 "seek_data": false, 00:19:09.776 "copy": true, 00:19:09.776 "nvme_iov_md": false 00:19:09.776 }, 00:19:09.776 "memory_domains": [ 00:19:09.776 { 00:19:09.776 "dma_device_id": "system", 00:19:09.776 "dma_device_type": 1 00:19:09.776 }, 00:19:09.776 { 00:19:09.776 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:09.776 "dma_device_type": 2 00:19:09.776 } 00:19:09.776 ], 00:19:09.776 "driver_specific": {} 00:19:09.776 } 00:19:09.776 ] 00:19:09.776 22:47:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:09.776 22:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:09.776 22:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:09.776 22:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:10.034 [2024-07-15 22:47:54.774793] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev1 00:19:10.034 [2024-07-15 22:47:54.774831] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:10.034 [2024-07-15 22:47:54.774849] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:10.034 [2024-07-15 22:47:54.776170] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:10.034 [2024-07-15 22:47:54.776211] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:10.034 22:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:10.034 22:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:10.034 22:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:10.034 22:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:10.034 22:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:10.034 22:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:10.034 22:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:10.034 22:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:10.034 22:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:10.034 22:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:10.034 22:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:10.034 22:47:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:10.292 22:47:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:10.292 "name": "Existed_Raid", 00:19:10.292 "uuid": "5c413018-05d7-4b71-8349-0a6e21968374", 00:19:10.292 "strip_size_kb": 64, 00:19:10.292 "state": "configuring", 00:19:10.292 "raid_level": "raid0", 00:19:10.292 "superblock": true, 00:19:10.292 "num_base_bdevs": 4, 00:19:10.292 "num_base_bdevs_discovered": 3, 00:19:10.292 "num_base_bdevs_operational": 4, 00:19:10.292 "base_bdevs_list": [ 00:19:10.292 { 00:19:10.292 "name": "BaseBdev1", 00:19:10.292 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:10.292 "is_configured": false, 00:19:10.292 "data_offset": 0, 00:19:10.292 "data_size": 0 00:19:10.292 }, 00:19:10.292 { 00:19:10.292 "name": "BaseBdev2", 00:19:10.292 "uuid": "8f61b86a-6132-4cba-999e-829d22d0ed68", 00:19:10.292 "is_configured": true, 00:19:10.292 "data_offset": 2048, 00:19:10.292 "data_size": 63488 00:19:10.292 }, 00:19:10.292 { 00:19:10.292 "name": "BaseBdev3", 00:19:10.292 "uuid": "e7ae74c5-66be-4eaa-a632-9c1295830def", 00:19:10.292 "is_configured": true, 00:19:10.292 "data_offset": 2048, 00:19:10.292 "data_size": 63488 00:19:10.292 }, 00:19:10.292 { 00:19:10.292 "name": "BaseBdev4", 00:19:10.292 "uuid": "29bc0733-e00f-4ee5-bd0b-ee324a4aff2d", 00:19:10.292 "is_configured": true, 00:19:10.292 "data_offset": 2048, 00:19:10.292 "data_size": 63488 00:19:10.292 } 00:19:10.292 ] 00:19:10.292 }' 00:19:10.292 22:47:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:10.292 22:47:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:11.226 22:47:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:11.226 [2024-07-15 22:47:56.122349] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:11.485 22:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:11.485 22:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:11.485 22:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:11.485 22:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:11.485 22:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:11.485 22:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:11.485 22:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:11.485 22:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:11.485 22:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:11.485 22:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:11.485 22:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:11.485 22:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:11.758 22:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:11.758 "name": "Existed_Raid", 00:19:11.758 "uuid": "5c413018-05d7-4b71-8349-0a6e21968374", 00:19:11.758 "strip_size_kb": 64, 00:19:11.758 "state": "configuring", 00:19:11.758 "raid_level": "raid0", 00:19:11.758 "superblock": true, 00:19:11.758 "num_base_bdevs": 4, 00:19:11.758 
"num_base_bdevs_discovered": 2, 00:19:11.758 "num_base_bdevs_operational": 4, 00:19:11.758 "base_bdevs_list": [ 00:19:11.758 { 00:19:11.758 "name": "BaseBdev1", 00:19:11.758 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:11.758 "is_configured": false, 00:19:11.758 "data_offset": 0, 00:19:11.758 "data_size": 0 00:19:11.758 }, 00:19:11.758 { 00:19:11.758 "name": null, 00:19:11.758 "uuid": "8f61b86a-6132-4cba-999e-829d22d0ed68", 00:19:11.758 "is_configured": false, 00:19:11.758 "data_offset": 2048, 00:19:11.758 "data_size": 63488 00:19:11.758 }, 00:19:11.758 { 00:19:11.758 "name": "BaseBdev3", 00:19:11.758 "uuid": "e7ae74c5-66be-4eaa-a632-9c1295830def", 00:19:11.758 "is_configured": true, 00:19:11.758 "data_offset": 2048, 00:19:11.758 "data_size": 63488 00:19:11.758 }, 00:19:11.758 { 00:19:11.758 "name": "BaseBdev4", 00:19:11.758 "uuid": "29bc0733-e00f-4ee5-bd0b-ee324a4aff2d", 00:19:11.758 "is_configured": true, 00:19:11.758 "data_offset": 2048, 00:19:11.758 "data_size": 63488 00:19:11.758 } 00:19:11.758 ] 00:19:11.758 }' 00:19:11.758 22:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:11.758 22:47:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:12.326 22:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:12.326 22:47:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:12.326 22:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:12.326 22:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:12.585 [2024-07-15 22:47:57.421257] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:12.585 BaseBdev1 00:19:12.585 22:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:12.585 22:47:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:12.585 22:47:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:12.585 22:47:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:12.585 22:47:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:12.585 22:47:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:12.585 22:47:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:12.843 22:47:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:13.102 [ 00:19:13.102 { 00:19:13.102 "name": "BaseBdev1", 00:19:13.102 "aliases": [ 00:19:13.102 "ba8e9f03-3825-441f-898b-f06be53a65c1" 00:19:13.102 ], 00:19:13.102 "product_name": "Malloc disk", 00:19:13.102 "block_size": 512, 00:19:13.102 "num_blocks": 65536, 00:19:13.102 "uuid": "ba8e9f03-3825-441f-898b-f06be53a65c1", 00:19:13.102 "assigned_rate_limits": { 00:19:13.102 "rw_ios_per_sec": 0, 00:19:13.102 "rw_mbytes_per_sec": 0, 00:19:13.102 "r_mbytes_per_sec": 0, 00:19:13.102 "w_mbytes_per_sec": 0 00:19:13.102 }, 00:19:13.102 "claimed": true, 00:19:13.102 "claim_type": "exclusive_write", 00:19:13.102 "zoned": false, 00:19:13.102 "supported_io_types": { 00:19:13.102 "read": true, 00:19:13.102 "write": true, 00:19:13.102 "unmap": true, 00:19:13.102 "flush": 
true, 00:19:13.102 "reset": true, 00:19:13.102 "nvme_admin": false, 00:19:13.102 "nvme_io": false, 00:19:13.102 "nvme_io_md": false, 00:19:13.102 "write_zeroes": true, 00:19:13.102 "zcopy": true, 00:19:13.102 "get_zone_info": false, 00:19:13.102 "zone_management": false, 00:19:13.102 "zone_append": false, 00:19:13.102 "compare": false, 00:19:13.102 "compare_and_write": false, 00:19:13.102 "abort": true, 00:19:13.102 "seek_hole": false, 00:19:13.102 "seek_data": false, 00:19:13.102 "copy": true, 00:19:13.103 "nvme_iov_md": false 00:19:13.103 }, 00:19:13.103 "memory_domains": [ 00:19:13.103 { 00:19:13.103 "dma_device_id": "system", 00:19:13.103 "dma_device_type": 1 00:19:13.103 }, 00:19:13.103 { 00:19:13.103 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:13.103 "dma_device_type": 2 00:19:13.103 } 00:19:13.103 ], 00:19:13.103 "driver_specific": {} 00:19:13.103 } 00:19:13.103 ] 00:19:13.103 22:47:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:13.103 22:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:13.103 22:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:13.103 22:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:13.103 22:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:13.103 22:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:13.103 22:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:13.103 22:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:13.103 22:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:13.103 22:47:57 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:13.103 22:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:13.103 22:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:13.103 22:47:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:13.362 22:47:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:13.362 "name": "Existed_Raid", 00:19:13.362 "uuid": "5c413018-05d7-4b71-8349-0a6e21968374", 00:19:13.362 "strip_size_kb": 64, 00:19:13.362 "state": "configuring", 00:19:13.362 "raid_level": "raid0", 00:19:13.362 "superblock": true, 00:19:13.362 "num_base_bdevs": 4, 00:19:13.362 "num_base_bdevs_discovered": 3, 00:19:13.362 "num_base_bdevs_operational": 4, 00:19:13.362 "base_bdevs_list": [ 00:19:13.362 { 00:19:13.362 "name": "BaseBdev1", 00:19:13.362 "uuid": "ba8e9f03-3825-441f-898b-f06be53a65c1", 00:19:13.362 "is_configured": true, 00:19:13.362 "data_offset": 2048, 00:19:13.362 "data_size": 63488 00:19:13.362 }, 00:19:13.362 { 00:19:13.362 "name": null, 00:19:13.362 "uuid": "8f61b86a-6132-4cba-999e-829d22d0ed68", 00:19:13.362 "is_configured": false, 00:19:13.362 "data_offset": 2048, 00:19:13.362 "data_size": 63488 00:19:13.362 }, 00:19:13.362 { 00:19:13.362 "name": "BaseBdev3", 00:19:13.362 "uuid": "e7ae74c5-66be-4eaa-a632-9c1295830def", 00:19:13.362 "is_configured": true, 00:19:13.362 "data_offset": 2048, 00:19:13.362 "data_size": 63488 00:19:13.362 }, 00:19:13.362 { 00:19:13.362 "name": "BaseBdev4", 00:19:13.362 "uuid": "29bc0733-e00f-4ee5-bd0b-ee324a4aff2d", 00:19:13.362 "is_configured": true, 00:19:13.362 "data_offset": 2048, 00:19:13.362 "data_size": 63488 00:19:13.362 } 00:19:13.362 ] 00:19:13.362 }' 00:19:13.362 
22:47:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:13.362 22:47:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:13.928 22:47:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:13.928 22:47:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:14.187 22:47:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:14.187 22:47:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:14.446 [2024-07-15 22:47:59.185986] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:14.446 22:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:14.446 22:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:14.446 22:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:14.446 22:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:14.446 22:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:14.446 22:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:14.446 22:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:14.446 22:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:14.446 22:47:59 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:14.446 22:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:14.446 22:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:14.446 22:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:14.704 22:47:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:14.704 "name": "Existed_Raid", 00:19:14.704 "uuid": "5c413018-05d7-4b71-8349-0a6e21968374", 00:19:14.704 "strip_size_kb": 64, 00:19:14.704 "state": "configuring", 00:19:14.704 "raid_level": "raid0", 00:19:14.704 "superblock": true, 00:19:14.704 "num_base_bdevs": 4, 00:19:14.704 "num_base_bdevs_discovered": 2, 00:19:14.704 "num_base_bdevs_operational": 4, 00:19:14.704 "base_bdevs_list": [ 00:19:14.704 { 00:19:14.704 "name": "BaseBdev1", 00:19:14.704 "uuid": "ba8e9f03-3825-441f-898b-f06be53a65c1", 00:19:14.704 "is_configured": true, 00:19:14.704 "data_offset": 2048, 00:19:14.704 "data_size": 63488 00:19:14.704 }, 00:19:14.704 { 00:19:14.704 "name": null, 00:19:14.704 "uuid": "8f61b86a-6132-4cba-999e-829d22d0ed68", 00:19:14.704 "is_configured": false, 00:19:14.704 "data_offset": 2048, 00:19:14.704 "data_size": 63488 00:19:14.704 }, 00:19:14.704 { 00:19:14.704 "name": null, 00:19:14.704 "uuid": "e7ae74c5-66be-4eaa-a632-9c1295830def", 00:19:14.704 "is_configured": false, 00:19:14.704 "data_offset": 2048, 00:19:14.704 "data_size": 63488 00:19:14.704 }, 00:19:14.704 { 00:19:14.704 "name": "BaseBdev4", 00:19:14.704 "uuid": "29bc0733-e00f-4ee5-bd0b-ee324a4aff2d", 00:19:14.704 "is_configured": true, 00:19:14.704 "data_offset": 2048, 00:19:14.704 "data_size": 63488 00:19:14.704 } 00:19:14.704 ] 00:19:14.704 }' 00:19:14.704 22:47:59 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:14.704 22:47:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:15.270 22:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.270 22:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:15.528 22:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:15.528 22:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:15.787 [2024-07-15 22:48:00.493479] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:15.787 22:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:15.787 22:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:15.787 22:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:15.787 22:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:15.787 22:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:15.787 22:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:15.787 22:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:15.787 22:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:15.787 22:48:00 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:15.787 22:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:15.787 22:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.787 22:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:16.045 22:48:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:16.045 "name": "Existed_Raid", 00:19:16.045 "uuid": "5c413018-05d7-4b71-8349-0a6e21968374", 00:19:16.045 "strip_size_kb": 64, 00:19:16.045 "state": "configuring", 00:19:16.045 "raid_level": "raid0", 00:19:16.045 "superblock": true, 00:19:16.045 "num_base_bdevs": 4, 00:19:16.045 "num_base_bdevs_discovered": 3, 00:19:16.045 "num_base_bdevs_operational": 4, 00:19:16.045 "base_bdevs_list": [ 00:19:16.045 { 00:19:16.045 "name": "BaseBdev1", 00:19:16.045 "uuid": "ba8e9f03-3825-441f-898b-f06be53a65c1", 00:19:16.045 "is_configured": true, 00:19:16.045 "data_offset": 2048, 00:19:16.045 "data_size": 63488 00:19:16.045 }, 00:19:16.045 { 00:19:16.045 "name": null, 00:19:16.045 "uuid": "8f61b86a-6132-4cba-999e-829d22d0ed68", 00:19:16.045 "is_configured": false, 00:19:16.045 "data_offset": 2048, 00:19:16.045 "data_size": 63488 00:19:16.045 }, 00:19:16.045 { 00:19:16.045 "name": "BaseBdev3", 00:19:16.045 "uuid": "e7ae74c5-66be-4eaa-a632-9c1295830def", 00:19:16.045 "is_configured": true, 00:19:16.045 "data_offset": 2048, 00:19:16.045 "data_size": 63488 00:19:16.045 }, 00:19:16.045 { 00:19:16.045 "name": "BaseBdev4", 00:19:16.045 "uuid": "29bc0733-e00f-4ee5-bd0b-ee324a4aff2d", 00:19:16.045 "is_configured": true, 00:19:16.045 "data_offset": 2048, 00:19:16.045 "data_size": 63488 00:19:16.045 } 00:19:16.045 ] 00:19:16.045 }' 00:19:16.045 22:48:00 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:16.045 22:48:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:16.612 22:48:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:16.612 22:48:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:17.178 22:48:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:17.179 22:48:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:17.443 [2024-07-15 22:48:02.089754] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:17.443 22:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:17.443 22:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:17.443 22:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:17.443 22:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:17.443 22:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:17.443 22:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:17.443 22:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:17.443 22:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:17.443 22:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:19:17.443 22:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:17.443 22:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:17.443 22:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:17.703 22:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:17.703 "name": "Existed_Raid", 00:19:17.703 "uuid": "5c413018-05d7-4b71-8349-0a6e21968374", 00:19:17.703 "strip_size_kb": 64, 00:19:17.703 "state": "configuring", 00:19:17.703 "raid_level": "raid0", 00:19:17.703 "superblock": true, 00:19:17.703 "num_base_bdevs": 4, 00:19:17.704 "num_base_bdevs_discovered": 2, 00:19:17.704 "num_base_bdevs_operational": 4, 00:19:17.704 "base_bdevs_list": [ 00:19:17.704 { 00:19:17.704 "name": null, 00:19:17.704 "uuid": "ba8e9f03-3825-441f-898b-f06be53a65c1", 00:19:17.704 "is_configured": false, 00:19:17.704 "data_offset": 2048, 00:19:17.704 "data_size": 63488 00:19:17.704 }, 00:19:17.704 { 00:19:17.704 "name": null, 00:19:17.704 "uuid": "8f61b86a-6132-4cba-999e-829d22d0ed68", 00:19:17.704 "is_configured": false, 00:19:17.704 "data_offset": 2048, 00:19:17.704 "data_size": 63488 00:19:17.704 }, 00:19:17.704 { 00:19:17.704 "name": "BaseBdev3", 00:19:17.704 "uuid": "e7ae74c5-66be-4eaa-a632-9c1295830def", 00:19:17.704 "is_configured": true, 00:19:17.704 "data_offset": 2048, 00:19:17.704 "data_size": 63488 00:19:17.704 }, 00:19:17.704 { 00:19:17.704 "name": "BaseBdev4", 00:19:17.704 "uuid": "29bc0733-e00f-4ee5-bd0b-ee324a4aff2d", 00:19:17.704 "is_configured": true, 00:19:17.704 "data_offset": 2048, 00:19:17.704 "data_size": 63488 00:19:17.704 } 00:19:17.704 ] 00:19:17.704 }' 00:19:17.704 22:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:19:17.704 22:48:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:18.270 22:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.270 22:48:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:18.528 22:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:18.528 22:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:18.528 [2024-07-15 22:48:03.425736] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:18.787 22:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:18.787 22:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:18.787 22:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:18.787 22:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:18.787 22:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:18.787 22:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:18.787 22:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:18.787 22:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:18.787 22:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:19:18.787 22:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:18.787 22:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.787 22:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:18.787 22:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:18.787 "name": "Existed_Raid", 00:19:18.787 "uuid": "5c413018-05d7-4b71-8349-0a6e21968374", 00:19:18.787 "strip_size_kb": 64, 00:19:18.787 "state": "configuring", 00:19:18.787 "raid_level": "raid0", 00:19:18.787 "superblock": true, 00:19:18.787 "num_base_bdevs": 4, 00:19:18.787 "num_base_bdevs_discovered": 3, 00:19:18.787 "num_base_bdevs_operational": 4, 00:19:18.787 "base_bdevs_list": [ 00:19:18.787 { 00:19:18.787 "name": null, 00:19:18.787 "uuid": "ba8e9f03-3825-441f-898b-f06be53a65c1", 00:19:18.787 "is_configured": false, 00:19:18.787 "data_offset": 2048, 00:19:18.787 "data_size": 63488 00:19:18.787 }, 00:19:18.787 { 00:19:18.787 "name": "BaseBdev2", 00:19:18.787 "uuid": "8f61b86a-6132-4cba-999e-829d22d0ed68", 00:19:18.787 "is_configured": true, 00:19:18.787 "data_offset": 2048, 00:19:18.787 "data_size": 63488 00:19:18.787 }, 00:19:18.787 { 00:19:18.787 "name": "BaseBdev3", 00:19:18.787 "uuid": "e7ae74c5-66be-4eaa-a632-9c1295830def", 00:19:18.787 "is_configured": true, 00:19:18.787 "data_offset": 2048, 00:19:18.787 "data_size": 63488 00:19:18.787 }, 00:19:18.787 { 00:19:18.787 "name": "BaseBdev4", 00:19:18.787 "uuid": "29bc0733-e00f-4ee5-bd0b-ee324a4aff2d", 00:19:18.787 "is_configured": true, 00:19:18.787 "data_offset": 2048, 00:19:18.787 "data_size": 63488 00:19:18.787 } 00:19:18.787 ] 00:19:18.787 }' 00:19:18.787 22:48:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:19:19.045 22:48:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:19.979 22:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:19.979 22:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:19.979 22:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:19.979 22:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:19.979 22:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:20.238 22:48:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u ba8e9f03-3825-441f-898b-f06be53a65c1 00:19:20.496 [2024-07-15 22:48:05.229901] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:20.496 [2024-07-15 22:48:05.230078] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1993470 00:19:20.496 [2024-07-15 22:48:05.230091] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:20.496 [2024-07-15 22:48:05.230270] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1983c40 00:19:20.496 [2024-07-15 22:48:05.230387] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1993470 00:19:20.496 [2024-07-15 22:48:05.230397] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1993470 00:19:20.496 [2024-07-15 22:48:05.230487] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:20.496 NewBaseBdev 00:19:20.496 22:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:20.496 22:48:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:19:20.496 22:48:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:20.496 22:48:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:20.496 22:48:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:20.496 22:48:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:20.496 22:48:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:20.787 22:48:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:21.078 [ 00:19:21.078 { 00:19:21.078 "name": "NewBaseBdev", 00:19:21.078 "aliases": [ 00:19:21.078 "ba8e9f03-3825-441f-898b-f06be53a65c1" 00:19:21.078 ], 00:19:21.078 "product_name": "Malloc disk", 00:19:21.078 "block_size": 512, 00:19:21.078 "num_blocks": 65536, 00:19:21.078 "uuid": "ba8e9f03-3825-441f-898b-f06be53a65c1", 00:19:21.078 "assigned_rate_limits": { 00:19:21.078 "rw_ios_per_sec": 0, 00:19:21.078 "rw_mbytes_per_sec": 0, 00:19:21.078 "r_mbytes_per_sec": 0, 00:19:21.078 "w_mbytes_per_sec": 0 00:19:21.078 }, 00:19:21.078 "claimed": true, 00:19:21.078 "claim_type": "exclusive_write", 00:19:21.078 "zoned": false, 00:19:21.078 "supported_io_types": { 00:19:21.078 "read": true, 00:19:21.078 "write": true, 00:19:21.078 "unmap": true, 00:19:21.078 "flush": true, 
00:19:21.078 "reset": true, 00:19:21.078 "nvme_admin": false, 00:19:21.078 "nvme_io": false, 00:19:21.078 "nvme_io_md": false, 00:19:21.078 "write_zeroes": true, 00:19:21.078 "zcopy": true, 00:19:21.078 "get_zone_info": false, 00:19:21.078 "zone_management": false, 00:19:21.078 "zone_append": false, 00:19:21.078 "compare": false, 00:19:21.078 "compare_and_write": false, 00:19:21.078 "abort": true, 00:19:21.078 "seek_hole": false, 00:19:21.078 "seek_data": false, 00:19:21.078 "copy": true, 00:19:21.078 "nvme_iov_md": false 00:19:21.078 }, 00:19:21.078 "memory_domains": [ 00:19:21.078 { 00:19:21.078 "dma_device_id": "system", 00:19:21.078 "dma_device_type": 1 00:19:21.078 }, 00:19:21.078 { 00:19:21.078 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:21.078 "dma_device_type": 2 00:19:21.078 } 00:19:21.078 ], 00:19:21.078 "driver_specific": {} 00:19:21.078 } 00:19:21.078 ] 00:19:21.078 22:48:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:21.078 22:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:19:21.078 22:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:21.078 22:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:21.078 22:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:21.078 22:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:21.078 22:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:21.078 22:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:21.078 22:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:21.078 22:48:05 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:21.078 22:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:21.078 22:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:21.078 22:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:21.078 22:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:21.078 "name": "Existed_Raid", 00:19:21.078 "uuid": "5c413018-05d7-4b71-8349-0a6e21968374", 00:19:21.078 "strip_size_kb": 64, 00:19:21.078 "state": "online", 00:19:21.078 "raid_level": "raid0", 00:19:21.078 "superblock": true, 00:19:21.078 "num_base_bdevs": 4, 00:19:21.078 "num_base_bdevs_discovered": 4, 00:19:21.078 "num_base_bdevs_operational": 4, 00:19:21.078 "base_bdevs_list": [ 00:19:21.078 { 00:19:21.078 "name": "NewBaseBdev", 00:19:21.078 "uuid": "ba8e9f03-3825-441f-898b-f06be53a65c1", 00:19:21.078 "is_configured": true, 00:19:21.078 "data_offset": 2048, 00:19:21.078 "data_size": 63488 00:19:21.078 }, 00:19:21.078 { 00:19:21.078 "name": "BaseBdev2", 00:19:21.078 "uuid": "8f61b86a-6132-4cba-999e-829d22d0ed68", 00:19:21.078 "is_configured": true, 00:19:21.078 "data_offset": 2048, 00:19:21.079 "data_size": 63488 00:19:21.079 }, 00:19:21.079 { 00:19:21.079 "name": "BaseBdev3", 00:19:21.079 "uuid": "e7ae74c5-66be-4eaa-a632-9c1295830def", 00:19:21.079 "is_configured": true, 00:19:21.079 "data_offset": 2048, 00:19:21.079 "data_size": 63488 00:19:21.079 }, 00:19:21.079 { 00:19:21.079 "name": "BaseBdev4", 00:19:21.079 "uuid": "29bc0733-e00f-4ee5-bd0b-ee324a4aff2d", 00:19:21.079 "is_configured": true, 00:19:21.079 "data_offset": 2048, 00:19:21.079 "data_size": 63488 00:19:21.079 } 00:19:21.079 ] 00:19:21.079 }' 00:19:21.079 
22:48:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:21.079 22:48:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:22.012 22:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:22.012 22:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:22.012 22:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:22.012 22:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:22.012 22:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:22.012 22:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:22.012 22:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:22.012 22:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:22.012 [2024-07-15 22:48:06.826473] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:22.012 22:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:22.012 "name": "Existed_Raid", 00:19:22.012 "aliases": [ 00:19:22.012 "5c413018-05d7-4b71-8349-0a6e21968374" 00:19:22.012 ], 00:19:22.012 "product_name": "Raid Volume", 00:19:22.012 "block_size": 512, 00:19:22.012 "num_blocks": 253952, 00:19:22.012 "uuid": "5c413018-05d7-4b71-8349-0a6e21968374", 00:19:22.012 "assigned_rate_limits": { 00:19:22.012 "rw_ios_per_sec": 0, 00:19:22.012 "rw_mbytes_per_sec": 0, 00:19:22.012 "r_mbytes_per_sec": 0, 00:19:22.012 "w_mbytes_per_sec": 0 00:19:22.012 }, 00:19:22.012 "claimed": false, 00:19:22.012 "zoned": false, 00:19:22.012 
"supported_io_types": { 00:19:22.012 "read": true, 00:19:22.012 "write": true, 00:19:22.012 "unmap": true, 00:19:22.012 "flush": true, 00:19:22.012 "reset": true, 00:19:22.012 "nvme_admin": false, 00:19:22.012 "nvme_io": false, 00:19:22.012 "nvme_io_md": false, 00:19:22.012 "write_zeroes": true, 00:19:22.012 "zcopy": false, 00:19:22.012 "get_zone_info": false, 00:19:22.012 "zone_management": false, 00:19:22.012 "zone_append": false, 00:19:22.012 "compare": false, 00:19:22.012 "compare_and_write": false, 00:19:22.012 "abort": false, 00:19:22.012 "seek_hole": false, 00:19:22.012 "seek_data": false, 00:19:22.012 "copy": false, 00:19:22.012 "nvme_iov_md": false 00:19:22.012 }, 00:19:22.012 "memory_domains": [ 00:19:22.012 { 00:19:22.012 "dma_device_id": "system", 00:19:22.012 "dma_device_type": 1 00:19:22.012 }, 00:19:22.012 { 00:19:22.012 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:22.012 "dma_device_type": 2 00:19:22.012 }, 00:19:22.012 { 00:19:22.012 "dma_device_id": "system", 00:19:22.012 "dma_device_type": 1 00:19:22.012 }, 00:19:22.012 { 00:19:22.013 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:22.013 "dma_device_type": 2 00:19:22.013 }, 00:19:22.013 { 00:19:22.013 "dma_device_id": "system", 00:19:22.013 "dma_device_type": 1 00:19:22.013 }, 00:19:22.013 { 00:19:22.013 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:22.013 "dma_device_type": 2 00:19:22.013 }, 00:19:22.013 { 00:19:22.013 "dma_device_id": "system", 00:19:22.013 "dma_device_type": 1 00:19:22.013 }, 00:19:22.013 { 00:19:22.013 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:22.013 "dma_device_type": 2 00:19:22.013 } 00:19:22.013 ], 00:19:22.013 "driver_specific": { 00:19:22.013 "raid": { 00:19:22.013 "uuid": "5c413018-05d7-4b71-8349-0a6e21968374", 00:19:22.013 "strip_size_kb": 64, 00:19:22.013 "state": "online", 00:19:22.013 "raid_level": "raid0", 00:19:22.013 "superblock": true, 00:19:22.013 "num_base_bdevs": 4, 00:19:22.013 "num_base_bdevs_discovered": 4, 00:19:22.013 
"num_base_bdevs_operational": 4, 00:19:22.013 "base_bdevs_list": [ 00:19:22.013 { 00:19:22.013 "name": "NewBaseBdev", 00:19:22.013 "uuid": "ba8e9f03-3825-441f-898b-f06be53a65c1", 00:19:22.013 "is_configured": true, 00:19:22.013 "data_offset": 2048, 00:19:22.013 "data_size": 63488 00:19:22.013 }, 00:19:22.013 { 00:19:22.013 "name": "BaseBdev2", 00:19:22.013 "uuid": "8f61b86a-6132-4cba-999e-829d22d0ed68", 00:19:22.013 "is_configured": true, 00:19:22.013 "data_offset": 2048, 00:19:22.013 "data_size": 63488 00:19:22.013 }, 00:19:22.013 { 00:19:22.013 "name": "BaseBdev3", 00:19:22.013 "uuid": "e7ae74c5-66be-4eaa-a632-9c1295830def", 00:19:22.013 "is_configured": true, 00:19:22.013 "data_offset": 2048, 00:19:22.013 "data_size": 63488 00:19:22.013 }, 00:19:22.013 { 00:19:22.013 "name": "BaseBdev4", 00:19:22.013 "uuid": "29bc0733-e00f-4ee5-bd0b-ee324a4aff2d", 00:19:22.013 "is_configured": true, 00:19:22.013 "data_offset": 2048, 00:19:22.013 "data_size": 63488 00:19:22.013 } 00:19:22.013 ] 00:19:22.013 } 00:19:22.013 } 00:19:22.013 }' 00:19:22.013 22:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:22.013 22:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:22.013 BaseBdev2 00:19:22.013 BaseBdev3 00:19:22.013 BaseBdev4' 00:19:22.013 22:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:22.013 22:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:22.013 22:48:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:22.269 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:22.269 "name": "NewBaseBdev", 00:19:22.269 
"aliases": [ 00:19:22.269 "ba8e9f03-3825-441f-898b-f06be53a65c1" 00:19:22.269 ], 00:19:22.269 "product_name": "Malloc disk", 00:19:22.269 "block_size": 512, 00:19:22.269 "num_blocks": 65536, 00:19:22.269 "uuid": "ba8e9f03-3825-441f-898b-f06be53a65c1", 00:19:22.269 "assigned_rate_limits": { 00:19:22.269 "rw_ios_per_sec": 0, 00:19:22.269 "rw_mbytes_per_sec": 0, 00:19:22.269 "r_mbytes_per_sec": 0, 00:19:22.269 "w_mbytes_per_sec": 0 00:19:22.269 }, 00:19:22.269 "claimed": true, 00:19:22.269 "claim_type": "exclusive_write", 00:19:22.269 "zoned": false, 00:19:22.269 "supported_io_types": { 00:19:22.269 "read": true, 00:19:22.269 "write": true, 00:19:22.269 "unmap": true, 00:19:22.269 "flush": true, 00:19:22.269 "reset": true, 00:19:22.269 "nvme_admin": false, 00:19:22.269 "nvme_io": false, 00:19:22.269 "nvme_io_md": false, 00:19:22.269 "write_zeroes": true, 00:19:22.269 "zcopy": true, 00:19:22.269 "get_zone_info": false, 00:19:22.269 "zone_management": false, 00:19:22.269 "zone_append": false, 00:19:22.269 "compare": false, 00:19:22.269 "compare_and_write": false, 00:19:22.269 "abort": true, 00:19:22.269 "seek_hole": false, 00:19:22.270 "seek_data": false, 00:19:22.270 "copy": true, 00:19:22.270 "nvme_iov_md": false 00:19:22.270 }, 00:19:22.270 "memory_domains": [ 00:19:22.270 { 00:19:22.270 "dma_device_id": "system", 00:19:22.270 "dma_device_type": 1 00:19:22.270 }, 00:19:22.270 { 00:19:22.270 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:22.270 "dma_device_type": 2 00:19:22.270 } 00:19:22.270 ], 00:19:22.270 "driver_specific": {} 00:19:22.270 }' 00:19:22.270 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:22.527 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:22.527 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:22.527 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:19:22.527 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:22.527 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:22.527 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:22.527 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:22.527 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:22.527 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:22.784 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:22.784 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:22.784 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:22.784 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:22.784 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:23.041 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:23.041 "name": "BaseBdev2", 00:19:23.041 "aliases": [ 00:19:23.041 "8f61b86a-6132-4cba-999e-829d22d0ed68" 00:19:23.041 ], 00:19:23.041 "product_name": "Malloc disk", 00:19:23.041 "block_size": 512, 00:19:23.041 "num_blocks": 65536, 00:19:23.041 "uuid": "8f61b86a-6132-4cba-999e-829d22d0ed68", 00:19:23.041 "assigned_rate_limits": { 00:19:23.041 "rw_ios_per_sec": 0, 00:19:23.041 "rw_mbytes_per_sec": 0, 00:19:23.041 "r_mbytes_per_sec": 0, 00:19:23.041 "w_mbytes_per_sec": 0 00:19:23.041 }, 00:19:23.041 "claimed": true, 00:19:23.041 "claim_type": "exclusive_write", 00:19:23.041 "zoned": false, 00:19:23.041 
"supported_io_types": { 00:19:23.041 "read": true, 00:19:23.041 "write": true, 00:19:23.041 "unmap": true, 00:19:23.041 "flush": true, 00:19:23.041 "reset": true, 00:19:23.041 "nvme_admin": false, 00:19:23.041 "nvme_io": false, 00:19:23.041 "nvme_io_md": false, 00:19:23.041 "write_zeroes": true, 00:19:23.041 "zcopy": true, 00:19:23.041 "get_zone_info": false, 00:19:23.041 "zone_management": false, 00:19:23.041 "zone_append": false, 00:19:23.041 "compare": false, 00:19:23.041 "compare_and_write": false, 00:19:23.041 "abort": true, 00:19:23.041 "seek_hole": false, 00:19:23.041 "seek_data": false, 00:19:23.041 "copy": true, 00:19:23.041 "nvme_iov_md": false 00:19:23.041 }, 00:19:23.041 "memory_domains": [ 00:19:23.041 { 00:19:23.041 "dma_device_id": "system", 00:19:23.041 "dma_device_type": 1 00:19:23.041 }, 00:19:23.042 { 00:19:23.042 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:23.042 "dma_device_type": 2 00:19:23.042 } 00:19:23.042 ], 00:19:23.042 "driver_specific": {} 00:19:23.042 }' 00:19:23.042 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:23.042 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:23.042 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:23.042 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:23.042 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:23.042 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:23.042 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:23.042 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:23.300 22:48:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:23.300 22:48:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:23.300 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:23.300 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:23.300 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:23.300 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:23.300 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:23.559 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:23.559 "name": "BaseBdev3", 00:19:23.559 "aliases": [ 00:19:23.559 "e7ae74c5-66be-4eaa-a632-9c1295830def" 00:19:23.559 ], 00:19:23.559 "product_name": "Malloc disk", 00:19:23.559 "block_size": 512, 00:19:23.559 "num_blocks": 65536, 00:19:23.559 "uuid": "e7ae74c5-66be-4eaa-a632-9c1295830def", 00:19:23.559 "assigned_rate_limits": { 00:19:23.559 "rw_ios_per_sec": 0, 00:19:23.559 "rw_mbytes_per_sec": 0, 00:19:23.559 "r_mbytes_per_sec": 0, 00:19:23.559 "w_mbytes_per_sec": 0 00:19:23.559 }, 00:19:23.559 "claimed": true, 00:19:23.559 "claim_type": "exclusive_write", 00:19:23.559 "zoned": false, 00:19:23.559 "supported_io_types": { 00:19:23.559 "read": true, 00:19:23.559 "write": true, 00:19:23.559 "unmap": true, 00:19:23.559 "flush": true, 00:19:23.559 "reset": true, 00:19:23.559 "nvme_admin": false, 00:19:23.559 "nvme_io": false, 00:19:23.559 "nvme_io_md": false, 00:19:23.559 "write_zeroes": true, 00:19:23.559 "zcopy": true, 00:19:23.559 "get_zone_info": false, 00:19:23.559 "zone_management": false, 00:19:23.559 "zone_append": false, 00:19:23.559 "compare": false, 00:19:23.559 "compare_and_write": false, 00:19:23.559 "abort": true, 00:19:23.559 
"seek_hole": false, 00:19:23.559 "seek_data": false, 00:19:23.559 "copy": true, 00:19:23.559 "nvme_iov_md": false 00:19:23.559 }, 00:19:23.559 "memory_domains": [ 00:19:23.559 { 00:19:23.559 "dma_device_id": "system", 00:19:23.559 "dma_device_type": 1 00:19:23.559 }, 00:19:23.559 { 00:19:23.559 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:23.559 "dma_device_type": 2 00:19:23.559 } 00:19:23.559 ], 00:19:23.559 "driver_specific": {} 00:19:23.559 }' 00:19:23.559 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:23.559 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:23.559 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:23.559 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:23.559 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:23.559 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:23.559 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:23.559 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:23.818 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:23.818 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:23.818 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:23.818 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:23.818 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:23.818 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:23.818 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:24.077 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:24.077 "name": "BaseBdev4", 00:19:24.077 "aliases": [ 00:19:24.077 "29bc0733-e00f-4ee5-bd0b-ee324a4aff2d" 00:19:24.077 ], 00:19:24.077 "product_name": "Malloc disk", 00:19:24.077 "block_size": 512, 00:19:24.077 "num_blocks": 65536, 00:19:24.077 "uuid": "29bc0733-e00f-4ee5-bd0b-ee324a4aff2d", 00:19:24.077 "assigned_rate_limits": { 00:19:24.077 "rw_ios_per_sec": 0, 00:19:24.077 "rw_mbytes_per_sec": 0, 00:19:24.077 "r_mbytes_per_sec": 0, 00:19:24.077 "w_mbytes_per_sec": 0 00:19:24.077 }, 00:19:24.077 "claimed": true, 00:19:24.077 "claim_type": "exclusive_write", 00:19:24.077 "zoned": false, 00:19:24.077 "supported_io_types": { 00:19:24.077 "read": true, 00:19:24.077 "write": true, 00:19:24.077 "unmap": true, 00:19:24.077 "flush": true, 00:19:24.077 "reset": true, 00:19:24.077 "nvme_admin": false, 00:19:24.077 "nvme_io": false, 00:19:24.077 "nvme_io_md": false, 00:19:24.077 "write_zeroes": true, 00:19:24.077 "zcopy": true, 00:19:24.077 "get_zone_info": false, 00:19:24.077 "zone_management": false, 00:19:24.077 "zone_append": false, 00:19:24.077 "compare": false, 00:19:24.077 "compare_and_write": false, 00:19:24.077 "abort": true, 00:19:24.077 "seek_hole": false, 00:19:24.077 "seek_data": false, 00:19:24.077 "copy": true, 00:19:24.077 "nvme_iov_md": false 00:19:24.077 }, 00:19:24.077 "memory_domains": [ 00:19:24.077 { 00:19:24.077 "dma_device_id": "system", 00:19:24.077 "dma_device_type": 1 00:19:24.077 }, 00:19:24.077 { 00:19:24.077 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:24.077 "dma_device_type": 2 00:19:24.077 } 00:19:24.077 ], 00:19:24.077 "driver_specific": {} 00:19:24.077 }' 00:19:24.077 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:24.077 
22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:24.077 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:24.077 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:24.077 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:24.077 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:24.077 22:48:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:24.336 22:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:24.336 22:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:24.336 22:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:24.336 22:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:24.336 22:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:24.336 22:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:24.595 [2024-07-15 22:48:09.292711] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:24.595 [2024-07-15 22:48:09.292740] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:24.595 [2024-07-15 22:48:09.292794] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:24.595 [2024-07-15 22:48:09.292859] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:24.595 [2024-07-15 22:48:09.292871] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 
0x1993470 name Existed_Raid, state offline 00:19:24.595 22:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2765813 00:19:24.595 22:48:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2765813 ']' 00:19:24.595 22:48:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2765813 00:19:24.595 22:48:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:19:24.595 22:48:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:24.595 22:48:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2765813 00:19:24.595 22:48:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:24.595 22:48:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:24.595 22:48:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2765813' 00:19:24.595 killing process with pid 2765813 00:19:24.595 22:48:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2765813 00:19:24.595 [2024-07-15 22:48:09.364505] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:24.595 22:48:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2765813 00:19:24.595 [2024-07-15 22:48:09.402395] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:24.854 22:48:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:19:24.854 00:19:24.854 real 0m33.768s 00:19:24.854 user 1m1.973s 00:19:24.854 sys 0m6.038s 00:19:24.854 22:48:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:24.854 22:48:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 
-- # set +x 00:19:24.854 ************************************ 00:19:24.854 END TEST raid_state_function_test_sb 00:19:24.854 ************************************ 00:19:24.854 22:48:09 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:24.854 22:48:09 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:19:24.854 22:48:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:19:24.854 22:48:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:24.854 22:48:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:24.854 ************************************ 00:19:24.854 START TEST raid_superblock_test 00:19:24.854 ************************************ 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 4 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2771371 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2771371 /var/tmp/spdk-raid.sock 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2771371 ']' 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:24.854 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:24.854 22:48:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:25.112 [2024-07-15 22:48:09.784920] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:19:25.112 [2024-07-15 22:48:09.784999] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2771371 ] 00:19:25.112 [2024-07-15 22:48:09.916600] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:25.371 [2024-07-15 22:48:10.029185] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:25.371 [2024-07-15 22:48:10.089170] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:25.371 [2024-07-15 22:48:10.089219] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:25.938 22:48:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:25.938 22:48:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:19:25.938 22:48:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:19:25.938 22:48:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:25.938 22:48:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:19:25.938 22:48:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:19:25.938 22:48:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:19:25.938 22:48:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:25.938 22:48:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:25.938 22:48:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:25.938 22:48:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:19:26.196 malloc1 00:19:26.196 22:48:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:26.455 [2024-07-15 22:48:11.199880] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:26.455 [2024-07-15 22:48:11.199939] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:26.455 [2024-07-15 22:48:11.199959] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x153a570 00:19:26.455 [2024-07-15 22:48:11.199971] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:26.455 [2024-07-15 22:48:11.201696] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:26.455 [2024-07-15 22:48:11.201728] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:26.455 pt1 00:19:26.455 22:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:26.455 22:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:26.455 22:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:19:26.455 22:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:19:26.455 22:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:19:26.455 22:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:26.455 22:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:26.455 22:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:26.455 22:48:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:19:26.713 malloc2 00:19:26.713 22:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:26.972 [2024-07-15 22:48:11.633784] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:26.972 [2024-07-15 22:48:11.633833] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:26.972 [2024-07-15 22:48:11.633850] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x153b970 00:19:26.972 [2024-07-15 22:48:11.633863] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:26.972 [2024-07-15 22:48:11.635347] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:26.972 [2024-07-15 22:48:11.635377] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:26.972 pt2 00:19:26.972 22:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:26.972 22:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:26.972 22:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:19:26.972 22:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:19:26.972 22:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:19:26.972 22:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:26.972 22:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:26.972 22:48:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:26.972 22:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:19:26.972 malloc3 00:19:26.972 22:48:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:27.231 [2024-07-15 22:48:12.075787] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:27.231 [2024-07-15 22:48:12.075834] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:27.231 [2024-07-15 22:48:12.075851] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16d2340 00:19:27.231 [2024-07-15 22:48:12.075863] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:27.231 [2024-07-15 22:48:12.077278] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:27.231 [2024-07-15 22:48:12.077307] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:27.231 pt3 00:19:27.231 22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:27.231 22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:27.231 22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:19:27.231 22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:19:27.231 22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:19:27.231 22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:27.231 
22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:27.231 22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:27.231 22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:19:27.489 malloc4 00:19:27.489 22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:27.748 [2024-07-15 22:48:12.577782] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:27.748 [2024-07-15 22:48:12.577836] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:27.748 [2024-07-15 22:48:12.577855] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16d4c60 00:19:27.748 [2024-07-15 22:48:12.577869] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:27.748 [2024-07-15 22:48:12.579278] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:27.748 [2024-07-15 22:48:12.579307] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:27.748 pt4 00:19:27.748 22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:27.748 22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:27.748 22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:19:28.007 [2024-07-15 22:48:12.826480] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:19:28.007 [2024-07-15 22:48:12.827648] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:28.007 [2024-07-15 22:48:12.827701] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:28.007 [2024-07-15 22:48:12.827744] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:28.007 [2024-07-15 22:48:12.827910] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1532530 00:19:28.007 [2024-07-15 22:48:12.827921] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:28.007 [2024-07-15 22:48:12.828109] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1530770 00:19:28.007 [2024-07-15 22:48:12.828250] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1532530 00:19:28.007 [2024-07-15 22:48:12.828261] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1532530 00:19:28.007 [2024-07-15 22:48:12.828350] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:28.007 22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:28.007 22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:28.007 22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:28.007 22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:28.007 22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:28.007 22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:28.007 22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:28.007 22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:19:28.007 22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:28.007 22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:28.007 22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:28.007 22:48:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:28.266 22:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:28.266 "name": "raid_bdev1", 00:19:28.266 "uuid": "a6c9583f-5590-4506-a18f-068aec84af12", 00:19:28.266 "strip_size_kb": 64, 00:19:28.266 "state": "online", 00:19:28.266 "raid_level": "raid0", 00:19:28.266 "superblock": true, 00:19:28.266 "num_base_bdevs": 4, 00:19:28.266 "num_base_bdevs_discovered": 4, 00:19:28.266 "num_base_bdevs_operational": 4, 00:19:28.266 "base_bdevs_list": [ 00:19:28.266 { 00:19:28.266 "name": "pt1", 00:19:28.266 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:28.266 "is_configured": true, 00:19:28.266 "data_offset": 2048, 00:19:28.266 "data_size": 63488 00:19:28.266 }, 00:19:28.266 { 00:19:28.266 "name": "pt2", 00:19:28.266 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:28.266 "is_configured": true, 00:19:28.266 "data_offset": 2048, 00:19:28.266 "data_size": 63488 00:19:28.266 }, 00:19:28.266 { 00:19:28.266 "name": "pt3", 00:19:28.266 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:28.266 "is_configured": true, 00:19:28.266 "data_offset": 2048, 00:19:28.266 "data_size": 63488 00:19:28.266 }, 00:19:28.266 { 00:19:28.266 "name": "pt4", 00:19:28.266 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:28.266 "is_configured": true, 00:19:28.266 "data_offset": 2048, 00:19:28.266 "data_size": 63488 00:19:28.266 } 00:19:28.266 ] 00:19:28.266 }' 00:19:28.266 22:48:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:28.266 22:48:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:28.840 22:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:19:28.840 22:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:28.840 22:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:28.840 22:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:28.840 22:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:28.840 22:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:28.840 22:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:28.840 22:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:29.098 [2024-07-15 22:48:13.833442] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:29.098 22:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:29.098 "name": "raid_bdev1", 00:19:29.098 "aliases": [ 00:19:29.098 "a6c9583f-5590-4506-a18f-068aec84af12" 00:19:29.098 ], 00:19:29.098 "product_name": "Raid Volume", 00:19:29.098 "block_size": 512, 00:19:29.098 "num_blocks": 253952, 00:19:29.098 "uuid": "a6c9583f-5590-4506-a18f-068aec84af12", 00:19:29.098 "assigned_rate_limits": { 00:19:29.098 "rw_ios_per_sec": 0, 00:19:29.098 "rw_mbytes_per_sec": 0, 00:19:29.098 "r_mbytes_per_sec": 0, 00:19:29.098 "w_mbytes_per_sec": 0 00:19:29.098 }, 00:19:29.098 "claimed": false, 00:19:29.098 "zoned": false, 00:19:29.098 "supported_io_types": { 00:19:29.098 "read": true, 00:19:29.098 "write": true, 00:19:29.098 
"unmap": true, 00:19:29.098 "flush": true, 00:19:29.098 "reset": true, 00:19:29.098 "nvme_admin": false, 00:19:29.098 "nvme_io": false, 00:19:29.098 "nvme_io_md": false, 00:19:29.098 "write_zeroes": true, 00:19:29.098 "zcopy": false, 00:19:29.098 "get_zone_info": false, 00:19:29.098 "zone_management": false, 00:19:29.098 "zone_append": false, 00:19:29.098 "compare": false, 00:19:29.098 "compare_and_write": false, 00:19:29.098 "abort": false, 00:19:29.098 "seek_hole": false, 00:19:29.098 "seek_data": false, 00:19:29.098 "copy": false, 00:19:29.098 "nvme_iov_md": false 00:19:29.098 }, 00:19:29.098 "memory_domains": [ 00:19:29.098 { 00:19:29.098 "dma_device_id": "system", 00:19:29.098 "dma_device_type": 1 00:19:29.098 }, 00:19:29.098 { 00:19:29.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.098 "dma_device_type": 2 00:19:29.098 }, 00:19:29.098 { 00:19:29.098 "dma_device_id": "system", 00:19:29.098 "dma_device_type": 1 00:19:29.098 }, 00:19:29.098 { 00:19:29.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.098 "dma_device_type": 2 00:19:29.098 }, 00:19:29.098 { 00:19:29.098 "dma_device_id": "system", 00:19:29.098 "dma_device_type": 1 00:19:29.098 }, 00:19:29.098 { 00:19:29.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.098 "dma_device_type": 2 00:19:29.098 }, 00:19:29.098 { 00:19:29.098 "dma_device_id": "system", 00:19:29.098 "dma_device_type": 1 00:19:29.098 }, 00:19:29.098 { 00:19:29.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.098 "dma_device_type": 2 00:19:29.098 } 00:19:29.098 ], 00:19:29.098 "driver_specific": { 00:19:29.098 "raid": { 00:19:29.098 "uuid": "a6c9583f-5590-4506-a18f-068aec84af12", 00:19:29.098 "strip_size_kb": 64, 00:19:29.098 "state": "online", 00:19:29.098 "raid_level": "raid0", 00:19:29.098 "superblock": true, 00:19:29.098 "num_base_bdevs": 4, 00:19:29.098 "num_base_bdevs_discovered": 4, 00:19:29.098 "num_base_bdevs_operational": 4, 00:19:29.098 "base_bdevs_list": [ 00:19:29.098 { 00:19:29.098 "name": "pt1", 
00:19:29.098 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:29.098 "is_configured": true, 00:19:29.098 "data_offset": 2048, 00:19:29.098 "data_size": 63488 00:19:29.098 }, 00:19:29.098 { 00:19:29.098 "name": "pt2", 00:19:29.098 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:29.098 "is_configured": true, 00:19:29.098 "data_offset": 2048, 00:19:29.098 "data_size": 63488 00:19:29.098 }, 00:19:29.098 { 00:19:29.098 "name": "pt3", 00:19:29.098 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:29.098 "is_configured": true, 00:19:29.098 "data_offset": 2048, 00:19:29.098 "data_size": 63488 00:19:29.098 }, 00:19:29.098 { 00:19:29.098 "name": "pt4", 00:19:29.098 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:29.098 "is_configured": true, 00:19:29.098 "data_offset": 2048, 00:19:29.098 "data_size": 63488 00:19:29.098 } 00:19:29.098 ] 00:19:29.098 } 00:19:29.098 } 00:19:29.098 }' 00:19:29.098 22:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:29.098 22:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:29.098 pt2 00:19:29.098 pt3 00:19:29.098 pt4' 00:19:29.098 22:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:29.098 22:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:29.098 22:48:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:29.665 22:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:29.665 "name": "pt1", 00:19:29.665 "aliases": [ 00:19:29.665 "00000000-0000-0000-0000-000000000001" 00:19:29.665 ], 00:19:29.665 "product_name": "passthru", 00:19:29.665 "block_size": 512, 00:19:29.665 "num_blocks": 65536, 00:19:29.665 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:19:29.665 "assigned_rate_limits": { 00:19:29.665 "rw_ios_per_sec": 0, 00:19:29.665 "rw_mbytes_per_sec": 0, 00:19:29.665 "r_mbytes_per_sec": 0, 00:19:29.666 "w_mbytes_per_sec": 0 00:19:29.666 }, 00:19:29.666 "claimed": true, 00:19:29.666 "claim_type": "exclusive_write", 00:19:29.666 "zoned": false, 00:19:29.666 "supported_io_types": { 00:19:29.666 "read": true, 00:19:29.666 "write": true, 00:19:29.666 "unmap": true, 00:19:29.666 "flush": true, 00:19:29.666 "reset": true, 00:19:29.666 "nvme_admin": false, 00:19:29.666 "nvme_io": false, 00:19:29.666 "nvme_io_md": false, 00:19:29.666 "write_zeroes": true, 00:19:29.666 "zcopy": true, 00:19:29.666 "get_zone_info": false, 00:19:29.666 "zone_management": false, 00:19:29.666 "zone_append": false, 00:19:29.666 "compare": false, 00:19:29.666 "compare_and_write": false, 00:19:29.666 "abort": true, 00:19:29.666 "seek_hole": false, 00:19:29.666 "seek_data": false, 00:19:29.666 "copy": true, 00:19:29.666 "nvme_iov_md": false 00:19:29.666 }, 00:19:29.666 "memory_domains": [ 00:19:29.666 { 00:19:29.666 "dma_device_id": "system", 00:19:29.666 "dma_device_type": 1 00:19:29.666 }, 00:19:29.666 { 00:19:29.666 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.666 "dma_device_type": 2 00:19:29.666 } 00:19:29.666 ], 00:19:29.666 "driver_specific": { 00:19:29.666 "passthru": { 00:19:29.666 "name": "pt1", 00:19:29.666 "base_bdev_name": "malloc1" 00:19:29.666 } 00:19:29.666 } 00:19:29.666 }' 00:19:29.666 22:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:29.666 22:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:29.666 22:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:29.666 22:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:29.925 22:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:29.925 22:48:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:29.925 22:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:29.925 22:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:29.925 22:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:29.925 22:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:29.925 22:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:30.184 22:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:30.184 22:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:30.184 22:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:30.184 22:48:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:30.444 22:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:30.444 "name": "pt2", 00:19:30.444 "aliases": [ 00:19:30.444 "00000000-0000-0000-0000-000000000002" 00:19:30.444 ], 00:19:30.444 "product_name": "passthru", 00:19:30.444 "block_size": 512, 00:19:30.444 "num_blocks": 65536, 00:19:30.444 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:30.444 "assigned_rate_limits": { 00:19:30.444 "rw_ios_per_sec": 0, 00:19:30.444 "rw_mbytes_per_sec": 0, 00:19:30.444 "r_mbytes_per_sec": 0, 00:19:30.444 "w_mbytes_per_sec": 0 00:19:30.444 }, 00:19:30.444 "claimed": true, 00:19:30.444 "claim_type": "exclusive_write", 00:19:30.444 "zoned": false, 00:19:30.444 "supported_io_types": { 00:19:30.444 "read": true, 00:19:30.444 "write": true, 00:19:30.444 "unmap": true, 00:19:30.444 "flush": true, 00:19:30.444 "reset": true, 00:19:30.444 "nvme_admin": false, 00:19:30.444 
"nvme_io": false, 00:19:30.444 "nvme_io_md": false, 00:19:30.444 "write_zeroes": true, 00:19:30.444 "zcopy": true, 00:19:30.444 "get_zone_info": false, 00:19:30.444 "zone_management": false, 00:19:30.444 "zone_append": false, 00:19:30.444 "compare": false, 00:19:30.444 "compare_and_write": false, 00:19:30.444 "abort": true, 00:19:30.444 "seek_hole": false, 00:19:30.444 "seek_data": false, 00:19:30.444 "copy": true, 00:19:30.444 "nvme_iov_md": false 00:19:30.444 }, 00:19:30.444 "memory_domains": [ 00:19:30.444 { 00:19:30.444 "dma_device_id": "system", 00:19:30.444 "dma_device_type": 1 00:19:30.444 }, 00:19:30.444 { 00:19:30.444 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:30.444 "dma_device_type": 2 00:19:30.444 } 00:19:30.444 ], 00:19:30.444 "driver_specific": { 00:19:30.444 "passthru": { 00:19:30.444 "name": "pt2", 00:19:30.444 "base_bdev_name": "malloc2" 00:19:30.444 } 00:19:30.444 } 00:19:30.444 }' 00:19:30.444 22:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:30.444 22:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:30.444 22:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:30.444 22:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:30.444 22:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:30.444 22:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:30.444 22:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:30.703 22:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:30.703 22:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:30.703 22:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:30.703 22:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:19:30.703 22:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:30.703 22:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:30.703 22:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:30.703 22:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:30.962 22:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:30.962 "name": "pt3", 00:19:30.962 "aliases": [ 00:19:30.962 "00000000-0000-0000-0000-000000000003" 00:19:30.962 ], 00:19:30.962 "product_name": "passthru", 00:19:30.962 "block_size": 512, 00:19:30.962 "num_blocks": 65536, 00:19:30.962 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:30.962 "assigned_rate_limits": { 00:19:30.962 "rw_ios_per_sec": 0, 00:19:30.962 "rw_mbytes_per_sec": 0, 00:19:30.962 "r_mbytes_per_sec": 0, 00:19:30.962 "w_mbytes_per_sec": 0 00:19:30.962 }, 00:19:30.962 "claimed": true, 00:19:30.962 "claim_type": "exclusive_write", 00:19:30.962 "zoned": false, 00:19:30.962 "supported_io_types": { 00:19:30.962 "read": true, 00:19:30.962 "write": true, 00:19:30.962 "unmap": true, 00:19:30.962 "flush": true, 00:19:30.962 "reset": true, 00:19:30.962 "nvme_admin": false, 00:19:30.962 "nvme_io": false, 00:19:30.962 "nvme_io_md": false, 00:19:30.962 "write_zeroes": true, 00:19:30.962 "zcopy": true, 00:19:30.962 "get_zone_info": false, 00:19:30.962 "zone_management": false, 00:19:30.962 "zone_append": false, 00:19:30.962 "compare": false, 00:19:30.962 "compare_and_write": false, 00:19:30.962 "abort": true, 00:19:30.962 "seek_hole": false, 00:19:30.962 "seek_data": false, 00:19:30.962 "copy": true, 00:19:30.962 "nvme_iov_md": false 00:19:30.962 }, 00:19:30.962 "memory_domains": [ 00:19:30.962 { 00:19:30.962 "dma_device_id": "system", 00:19:30.962 
"dma_device_type": 1 00:19:30.962 }, 00:19:30.962 { 00:19:30.962 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:30.962 "dma_device_type": 2 00:19:30.962 } 00:19:30.962 ], 00:19:30.962 "driver_specific": { 00:19:30.962 "passthru": { 00:19:30.962 "name": "pt3", 00:19:30.962 "base_bdev_name": "malloc3" 00:19:30.962 } 00:19:30.962 } 00:19:30.962 }' 00:19:30.962 22:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:30.962 22:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:30.962 22:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:30.962 22:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:31.220 22:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:31.220 22:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:31.220 22:48:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:31.220 22:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:31.220 22:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:31.220 22:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:31.477 22:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:31.477 22:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:31.477 22:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:31.477 22:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:31.477 22:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:32.043 22:48:16 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:32.043 "name": "pt4", 00:19:32.043 "aliases": [ 00:19:32.043 "00000000-0000-0000-0000-000000000004" 00:19:32.043 ], 00:19:32.043 "product_name": "passthru", 00:19:32.043 "block_size": 512, 00:19:32.043 "num_blocks": 65536, 00:19:32.043 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:32.043 "assigned_rate_limits": { 00:19:32.043 "rw_ios_per_sec": 0, 00:19:32.043 "rw_mbytes_per_sec": 0, 00:19:32.043 "r_mbytes_per_sec": 0, 00:19:32.043 "w_mbytes_per_sec": 0 00:19:32.043 }, 00:19:32.043 "claimed": true, 00:19:32.043 "claim_type": "exclusive_write", 00:19:32.043 "zoned": false, 00:19:32.043 "supported_io_types": { 00:19:32.043 "read": true, 00:19:32.043 "write": true, 00:19:32.043 "unmap": true, 00:19:32.043 "flush": true, 00:19:32.043 "reset": true, 00:19:32.043 "nvme_admin": false, 00:19:32.043 "nvme_io": false, 00:19:32.043 "nvme_io_md": false, 00:19:32.043 "write_zeroes": true, 00:19:32.043 "zcopy": true, 00:19:32.043 "get_zone_info": false, 00:19:32.043 "zone_management": false, 00:19:32.043 "zone_append": false, 00:19:32.043 "compare": false, 00:19:32.043 "compare_and_write": false, 00:19:32.043 "abort": true, 00:19:32.043 "seek_hole": false, 00:19:32.043 "seek_data": false, 00:19:32.043 "copy": true, 00:19:32.043 "nvme_iov_md": false 00:19:32.043 }, 00:19:32.043 "memory_domains": [ 00:19:32.043 { 00:19:32.043 "dma_device_id": "system", 00:19:32.043 "dma_device_type": 1 00:19:32.043 }, 00:19:32.043 { 00:19:32.043 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:32.043 "dma_device_type": 2 00:19:32.043 } 00:19:32.043 ], 00:19:32.043 "driver_specific": { 00:19:32.043 "passthru": { 00:19:32.043 "name": "pt4", 00:19:32.043 "base_bdev_name": "malloc4" 00:19:32.043 } 00:19:32.043 } 00:19:32.043 }' 00:19:32.043 22:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:32.043 22:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:32.043 22:48:16 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:32.043 22:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:32.043 22:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:32.043 22:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:32.043 22:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:32.300 22:48:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:32.300 22:48:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:32.300 22:48:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:32.300 22:48:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:32.300 22:48:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:32.300 22:48:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:32.558 22:48:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:19:32.558 [2024-07-15 22:48:17.435070] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:32.558 22:48:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=a6c9583f-5590-4506-a18f-068aec84af12 00:19:32.558 22:48:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z a6c9583f-5590-4506-a18f-068aec84af12 ']' 00:19:32.558 22:48:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:33.123 [2024-07-15 22:48:17.936070] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:33.123 
[2024-07-15 22:48:17.936100] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:33.123 [2024-07-15 22:48:17.936169] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:33.123 [2024-07-15 22:48:17.936235] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:33.123 [2024-07-15 22:48:17.936248] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1532530 name raid_bdev1, state offline 00:19:33.123 22:48:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:33.123 22:48:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:19:33.381 22:48:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:19:33.381 22:48:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:19:33.381 22:48:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:33.381 22:48:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:19:33.638 22:48:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:33.638 22:48:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:34.203 22:48:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:34.203 22:48:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:34.460 22:48:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:34.460 22:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:19:35.025 22:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:19:35.025 22:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:19:35.283 22:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:19:35.283 22:48:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:35.283 22:48:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:19:35.283 22:48:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:35.283 22:48:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:35.283 22:48:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:35.283 22:48:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:35.283 22:48:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:35.283 22:48:19 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:35.283 22:48:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:35.283 22:48:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:35.283 22:48:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:19:35.283 22:48:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:35.541 [2024-07-15 22:48:20.197970] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:19:35.541 [2024-07-15 22:48:20.199328] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:19:35.541 [2024-07-15 22:48:20.199372] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:19:35.541 [2024-07-15 22:48:20.199406] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:19:35.541 [2024-07-15 22:48:20.199452] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:19:35.541 [2024-07-15 22:48:20.199492] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:19:35.541 [2024-07-15 22:48:20.199515] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:19:35.541 [2024-07-15 22:48:20.199537] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:19:35.541 
[2024-07-15 22:48:20.199555] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:35.541 [2024-07-15 22:48:20.199566] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16ddff0 name raid_bdev1, state configuring 00:19:35.541 request: 00:19:35.541 { 00:19:35.541 "name": "raid_bdev1", 00:19:35.541 "raid_level": "raid0", 00:19:35.541 "base_bdevs": [ 00:19:35.541 "malloc1", 00:19:35.541 "malloc2", 00:19:35.541 "malloc3", 00:19:35.541 "malloc4" 00:19:35.541 ], 00:19:35.541 "strip_size_kb": 64, 00:19:35.541 "superblock": false, 00:19:35.541 "method": "bdev_raid_create", 00:19:35.541 "req_id": 1 00:19:35.541 } 00:19:35.541 Got JSON-RPC error response 00:19:35.541 response: 00:19:35.541 { 00:19:35.541 "code": -17, 00:19:35.541 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:19:35.541 } 00:19:35.541 22:48:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:19:35.541 22:48:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:19:35.541 22:48:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:19:35.541 22:48:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:19:35.541 22:48:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:35.541 22:48:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:19:36.108 22:48:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:19:36.108 22:48:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:19:36.108 22:48:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:19:36.108 [2024-07-15 22:48:21.012024] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:36.108 [2024-07-15 22:48:21.012079] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:36.108 [2024-07-15 22:48:21.012102] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x153a7a0 00:19:36.108 [2024-07-15 22:48:21.012115] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:36.108 [2024-07-15 22:48:21.013844] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:36.108 [2024-07-15 22:48:21.013877] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:36.108 [2024-07-15 22:48:21.013967] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:19:36.108 [2024-07-15 22:48:21.013998] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:36.108 pt1 00:19:36.367 22:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:19:36.367 22:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:36.367 22:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:36.367 22:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:36.367 22:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:36.367 22:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:36.367 22:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:36.367 22:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:36.367 22:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:19:36.367 22:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:36.367 22:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:36.367 22:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:36.626 22:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:36.627 "name": "raid_bdev1", 00:19:36.627 "uuid": "a6c9583f-5590-4506-a18f-068aec84af12", 00:19:36.627 "strip_size_kb": 64, 00:19:36.627 "state": "configuring", 00:19:36.627 "raid_level": "raid0", 00:19:36.627 "superblock": true, 00:19:36.627 "num_base_bdevs": 4, 00:19:36.627 "num_base_bdevs_discovered": 1, 00:19:36.627 "num_base_bdevs_operational": 4, 00:19:36.627 "base_bdevs_list": [ 00:19:36.627 { 00:19:36.627 "name": "pt1", 00:19:36.627 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:36.627 "is_configured": true, 00:19:36.627 "data_offset": 2048, 00:19:36.627 "data_size": 63488 00:19:36.627 }, 00:19:36.627 { 00:19:36.627 "name": null, 00:19:36.627 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:36.627 "is_configured": false, 00:19:36.627 "data_offset": 2048, 00:19:36.627 "data_size": 63488 00:19:36.627 }, 00:19:36.627 { 00:19:36.627 "name": null, 00:19:36.627 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:36.627 "is_configured": false, 00:19:36.627 "data_offset": 2048, 00:19:36.627 "data_size": 63488 00:19:36.627 }, 00:19:36.627 { 00:19:36.627 "name": null, 00:19:36.627 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:36.627 "is_configured": false, 00:19:36.627 "data_offset": 2048, 00:19:36.627 "data_size": 63488 00:19:36.627 } 00:19:36.627 ] 00:19:36.627 }' 00:19:36.627 22:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:36.627 22:48:21 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:37.195 22:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:19:37.195 22:48:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:37.195 [2024-07-15 22:48:22.062804] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:37.195 [2024-07-15 22:48:22.062859] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:37.195 [2024-07-15 22:48:22.062880] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16d3940 00:19:37.195 [2024-07-15 22:48:22.062892] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:37.195 [2024-07-15 22:48:22.063264] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:37.195 [2024-07-15 22:48:22.063285] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:37.195 [2024-07-15 22:48:22.063353] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:37.195 [2024-07-15 22:48:22.063372] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:37.195 pt2 00:19:37.195 22:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:37.476 [2024-07-15 22:48:22.311475] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:19:37.476 22:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:19:37.476 22:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:37.476 22:48:22 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:37.476 22:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:37.476 22:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:37.476 22:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:37.476 22:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:37.476 22:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:37.476 22:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:37.476 22:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:37.476 22:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:37.477 22:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:37.797 22:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:37.797 "name": "raid_bdev1", 00:19:37.797 "uuid": "a6c9583f-5590-4506-a18f-068aec84af12", 00:19:37.797 "strip_size_kb": 64, 00:19:37.797 "state": "configuring", 00:19:37.797 "raid_level": "raid0", 00:19:37.797 "superblock": true, 00:19:37.797 "num_base_bdevs": 4, 00:19:37.797 "num_base_bdevs_discovered": 1, 00:19:37.797 "num_base_bdevs_operational": 4, 00:19:37.797 "base_bdevs_list": [ 00:19:37.797 { 00:19:37.797 "name": "pt1", 00:19:37.797 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:37.797 "is_configured": true, 00:19:37.798 "data_offset": 2048, 00:19:37.798 "data_size": 63488 00:19:37.798 }, 00:19:37.798 { 00:19:37.798 "name": null, 00:19:37.798 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:37.798 
"is_configured": false, 00:19:37.798 "data_offset": 2048, 00:19:37.798 "data_size": 63488 00:19:37.798 }, 00:19:37.798 { 00:19:37.798 "name": null, 00:19:37.798 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:37.798 "is_configured": false, 00:19:37.798 "data_offset": 2048, 00:19:37.798 "data_size": 63488 00:19:37.798 }, 00:19:37.798 { 00:19:37.798 "name": null, 00:19:37.798 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:37.798 "is_configured": false, 00:19:37.798 "data_offset": 2048, 00:19:37.798 "data_size": 63488 00:19:37.798 } 00:19:37.798 ] 00:19:37.798 }' 00:19:37.798 22:48:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:37.798 22:48:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:38.364 22:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:19:38.364 22:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:38.364 22:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:38.622 [2024-07-15 22:48:23.402372] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:38.622 [2024-07-15 22:48:23.402430] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:38.622 [2024-07-15 22:48:23.402449] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1531060 00:19:38.622 [2024-07-15 22:48:23.402461] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:38.622 [2024-07-15 22:48:23.402816] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:38.622 [2024-07-15 22:48:23.402835] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:38.622 [2024-07-15 22:48:23.402902] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:38.622 [2024-07-15 22:48:23.402922] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:38.622 pt2 00:19:38.622 22:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:38.622 22:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:38.622 22:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:38.880 [2024-07-15 22:48:23.651033] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:38.880 [2024-07-15 22:48:23.651069] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:38.880 [2024-07-15 22:48:23.651088] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15338d0 00:19:38.880 [2024-07-15 22:48:23.651105] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:38.880 [2024-07-15 22:48:23.651403] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:38.880 [2024-07-15 22:48:23.651421] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:38.880 [2024-07-15 22:48:23.651471] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:19:38.880 [2024-07-15 22:48:23.651487] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:38.880 pt3 00:19:38.880 22:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:38.880 22:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:38.880 22:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:39.138 [2024-07-15 22:48:23.899687] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:39.138 [2024-07-15 22:48:23.899726] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:39.138 [2024-07-15 22:48:23.899741] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1534b80 00:19:39.138 [2024-07-15 22:48:23.899753] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:39.138 [2024-07-15 22:48:23.900031] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:39.138 [2024-07-15 22:48:23.900049] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:39.138 [2024-07-15 22:48:23.900096] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:19:39.138 [2024-07-15 22:48:23.900112] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:39.138 [2024-07-15 22:48:23.900227] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1531780 00:19:39.138 [2024-07-15 22:48:23.900238] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:39.138 [2024-07-15 22:48:23.900405] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1536d70 00:19:39.138 [2024-07-15 22:48:23.900531] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1531780 00:19:39.138 [2024-07-15 22:48:23.900540] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1531780 00:19:39.138 [2024-07-15 22:48:23.900635] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:39.138 pt4 00:19:39.138 22:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 
00:19:39.138 22:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:39.139 22:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:39.139 22:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:39.139 22:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:39.139 22:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:39.139 22:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:39.139 22:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:39.139 22:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:39.139 22:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:39.139 22:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:39.139 22:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:39.139 22:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:39.139 22:48:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:39.397 22:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:39.397 "name": "raid_bdev1", 00:19:39.397 "uuid": "a6c9583f-5590-4506-a18f-068aec84af12", 00:19:39.397 "strip_size_kb": 64, 00:19:39.397 "state": "online", 00:19:39.397 "raid_level": "raid0", 00:19:39.397 "superblock": true, 00:19:39.397 "num_base_bdevs": 4, 00:19:39.397 "num_base_bdevs_discovered": 4, 00:19:39.397 "num_base_bdevs_operational": 4, 
00:19:39.397 "base_bdevs_list": [ 00:19:39.397 { 00:19:39.397 "name": "pt1", 00:19:39.397 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:39.397 "is_configured": true, 00:19:39.397 "data_offset": 2048, 00:19:39.397 "data_size": 63488 00:19:39.397 }, 00:19:39.397 { 00:19:39.397 "name": "pt2", 00:19:39.397 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:39.397 "is_configured": true, 00:19:39.397 "data_offset": 2048, 00:19:39.397 "data_size": 63488 00:19:39.397 }, 00:19:39.397 { 00:19:39.397 "name": "pt3", 00:19:39.397 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:39.397 "is_configured": true, 00:19:39.397 "data_offset": 2048, 00:19:39.397 "data_size": 63488 00:19:39.397 }, 00:19:39.397 { 00:19:39.397 "name": "pt4", 00:19:39.397 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:39.397 "is_configured": true, 00:19:39.397 "data_offset": 2048, 00:19:39.397 "data_size": 63488 00:19:39.397 } 00:19:39.397 ] 00:19:39.397 }' 00:19:39.397 22:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:39.397 22:48:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:39.964 22:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:19:39.964 22:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:39.964 22:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:39.964 22:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:39.964 22:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:39.964 22:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:39.964 22:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 
00:19:39.964 22:48:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:40.233 [2024-07-15 22:48:25.007001] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:40.233 22:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:40.233 "name": "raid_bdev1", 00:19:40.233 "aliases": [ 00:19:40.233 "a6c9583f-5590-4506-a18f-068aec84af12" 00:19:40.233 ], 00:19:40.233 "product_name": "Raid Volume", 00:19:40.233 "block_size": 512, 00:19:40.233 "num_blocks": 253952, 00:19:40.233 "uuid": "a6c9583f-5590-4506-a18f-068aec84af12", 00:19:40.233 "assigned_rate_limits": { 00:19:40.233 "rw_ios_per_sec": 0, 00:19:40.233 "rw_mbytes_per_sec": 0, 00:19:40.233 "r_mbytes_per_sec": 0, 00:19:40.233 "w_mbytes_per_sec": 0 00:19:40.233 }, 00:19:40.233 "claimed": false, 00:19:40.233 "zoned": false, 00:19:40.233 "supported_io_types": { 00:19:40.233 "read": true, 00:19:40.233 "write": true, 00:19:40.233 "unmap": true, 00:19:40.233 "flush": true, 00:19:40.233 "reset": true, 00:19:40.233 "nvme_admin": false, 00:19:40.233 "nvme_io": false, 00:19:40.233 "nvme_io_md": false, 00:19:40.233 "write_zeroes": true, 00:19:40.233 "zcopy": false, 00:19:40.233 "get_zone_info": false, 00:19:40.233 "zone_management": false, 00:19:40.233 "zone_append": false, 00:19:40.233 "compare": false, 00:19:40.233 "compare_and_write": false, 00:19:40.233 "abort": false, 00:19:40.233 "seek_hole": false, 00:19:40.233 "seek_data": false, 00:19:40.233 "copy": false, 00:19:40.233 "nvme_iov_md": false 00:19:40.233 }, 00:19:40.233 "memory_domains": [ 00:19:40.233 { 00:19:40.233 "dma_device_id": "system", 00:19:40.233 "dma_device_type": 1 00:19:40.233 }, 00:19:40.233 { 00:19:40.233 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:40.233 "dma_device_type": 2 00:19:40.233 }, 00:19:40.233 { 00:19:40.233 "dma_device_id": "system", 00:19:40.233 "dma_device_type": 1 00:19:40.233 }, 00:19:40.233 { 00:19:40.233 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:19:40.233 "dma_device_type": 2 00:19:40.233 }, 00:19:40.233 { 00:19:40.233 "dma_device_id": "system", 00:19:40.233 "dma_device_type": 1 00:19:40.233 }, 00:19:40.233 { 00:19:40.233 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:40.233 "dma_device_type": 2 00:19:40.233 }, 00:19:40.233 { 00:19:40.233 "dma_device_id": "system", 00:19:40.233 "dma_device_type": 1 00:19:40.233 }, 00:19:40.233 { 00:19:40.233 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:40.233 "dma_device_type": 2 00:19:40.233 } 00:19:40.233 ], 00:19:40.233 "driver_specific": { 00:19:40.233 "raid": { 00:19:40.233 "uuid": "a6c9583f-5590-4506-a18f-068aec84af12", 00:19:40.233 "strip_size_kb": 64, 00:19:40.233 "state": "online", 00:19:40.233 "raid_level": "raid0", 00:19:40.233 "superblock": true, 00:19:40.233 "num_base_bdevs": 4, 00:19:40.233 "num_base_bdevs_discovered": 4, 00:19:40.233 "num_base_bdevs_operational": 4, 00:19:40.233 "base_bdevs_list": [ 00:19:40.233 { 00:19:40.233 "name": "pt1", 00:19:40.233 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:40.233 "is_configured": true, 00:19:40.233 "data_offset": 2048, 00:19:40.233 "data_size": 63488 00:19:40.233 }, 00:19:40.233 { 00:19:40.233 "name": "pt2", 00:19:40.233 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:40.233 "is_configured": true, 00:19:40.233 "data_offset": 2048, 00:19:40.233 "data_size": 63488 00:19:40.233 }, 00:19:40.233 { 00:19:40.233 "name": "pt3", 00:19:40.233 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:40.233 "is_configured": true, 00:19:40.233 "data_offset": 2048, 00:19:40.233 "data_size": 63488 00:19:40.233 }, 00:19:40.233 { 00:19:40.233 "name": "pt4", 00:19:40.233 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:40.233 "is_configured": true, 00:19:40.233 "data_offset": 2048, 00:19:40.233 "data_size": 63488 00:19:40.233 } 00:19:40.233 ] 00:19:40.233 } 00:19:40.233 } 00:19:40.233 }' 00:19:40.233 22:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:40.233 22:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:40.233 pt2 00:19:40.233 pt3 00:19:40.233 pt4' 00:19:40.233 22:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:40.233 22:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:40.233 22:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:40.492 22:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:40.492 "name": "pt1", 00:19:40.492 "aliases": [ 00:19:40.492 "00000000-0000-0000-0000-000000000001" 00:19:40.492 ], 00:19:40.492 "product_name": "passthru", 00:19:40.492 "block_size": 512, 00:19:40.492 "num_blocks": 65536, 00:19:40.492 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:40.492 "assigned_rate_limits": { 00:19:40.492 "rw_ios_per_sec": 0, 00:19:40.492 "rw_mbytes_per_sec": 0, 00:19:40.492 "r_mbytes_per_sec": 0, 00:19:40.492 "w_mbytes_per_sec": 0 00:19:40.492 }, 00:19:40.492 "claimed": true, 00:19:40.492 "claim_type": "exclusive_write", 00:19:40.492 "zoned": false, 00:19:40.492 "supported_io_types": { 00:19:40.492 "read": true, 00:19:40.492 "write": true, 00:19:40.492 "unmap": true, 00:19:40.492 "flush": true, 00:19:40.492 "reset": true, 00:19:40.492 "nvme_admin": false, 00:19:40.492 "nvme_io": false, 00:19:40.492 "nvme_io_md": false, 00:19:40.492 "write_zeroes": true, 00:19:40.492 "zcopy": true, 00:19:40.492 "get_zone_info": false, 00:19:40.492 "zone_management": false, 00:19:40.492 "zone_append": false, 00:19:40.492 "compare": false, 00:19:40.492 "compare_and_write": false, 00:19:40.492 "abort": true, 00:19:40.492 "seek_hole": false, 00:19:40.492 "seek_data": false, 00:19:40.492 "copy": true, 00:19:40.492 "nvme_iov_md": 
false 00:19:40.492 }, 00:19:40.492 "memory_domains": [ 00:19:40.492 { 00:19:40.492 "dma_device_id": "system", 00:19:40.492 "dma_device_type": 1 00:19:40.492 }, 00:19:40.492 { 00:19:40.492 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:40.492 "dma_device_type": 2 00:19:40.492 } 00:19:40.492 ], 00:19:40.492 "driver_specific": { 00:19:40.492 "passthru": { 00:19:40.492 "name": "pt1", 00:19:40.492 "base_bdev_name": "malloc1" 00:19:40.492 } 00:19:40.492 } 00:19:40.492 }' 00:19:40.492 22:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:40.492 22:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:40.751 22:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:40.751 22:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:40.751 22:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:40.751 22:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:40.751 22:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:40.751 22:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:40.751 22:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:40.751 22:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:40.751 22:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:41.010 22:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:41.010 22:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:41.010 22:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:41.010 22:48:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:41.010 22:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:41.010 "name": "pt2", 00:19:41.010 "aliases": [ 00:19:41.010 "00000000-0000-0000-0000-000000000002" 00:19:41.010 ], 00:19:41.010 "product_name": "passthru", 00:19:41.010 "block_size": 512, 00:19:41.010 "num_blocks": 65536, 00:19:41.010 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:41.010 "assigned_rate_limits": { 00:19:41.010 "rw_ios_per_sec": 0, 00:19:41.010 "rw_mbytes_per_sec": 0, 00:19:41.010 "r_mbytes_per_sec": 0, 00:19:41.010 "w_mbytes_per_sec": 0 00:19:41.010 }, 00:19:41.010 "claimed": true, 00:19:41.010 "claim_type": "exclusive_write", 00:19:41.010 "zoned": false, 00:19:41.010 "supported_io_types": { 00:19:41.010 "read": true, 00:19:41.010 "write": true, 00:19:41.010 "unmap": true, 00:19:41.010 "flush": true, 00:19:41.010 "reset": true, 00:19:41.010 "nvme_admin": false, 00:19:41.010 "nvme_io": false, 00:19:41.010 "nvme_io_md": false, 00:19:41.010 "write_zeroes": true, 00:19:41.010 "zcopy": true, 00:19:41.010 "get_zone_info": false, 00:19:41.010 "zone_management": false, 00:19:41.010 "zone_append": false, 00:19:41.010 "compare": false, 00:19:41.010 "compare_and_write": false, 00:19:41.010 "abort": true, 00:19:41.010 "seek_hole": false, 00:19:41.010 "seek_data": false, 00:19:41.010 "copy": true, 00:19:41.010 "nvme_iov_md": false 00:19:41.010 }, 00:19:41.010 "memory_domains": [ 00:19:41.010 { 00:19:41.010 "dma_device_id": "system", 00:19:41.010 "dma_device_type": 1 00:19:41.010 }, 00:19:41.010 { 00:19:41.010 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:41.010 "dma_device_type": 2 00:19:41.010 } 00:19:41.010 ], 00:19:41.010 "driver_specific": { 00:19:41.010 "passthru": { 00:19:41.010 "name": "pt2", 00:19:41.010 "base_bdev_name": "malloc2" 00:19:41.010 } 00:19:41.010 } 00:19:41.010 }' 00:19:41.010 22:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:19:41.269 22:48:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:41.269 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:41.269 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:41.269 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:41.269 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:41.269 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:41.269 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:41.528 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:41.528 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:41.528 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:41.528 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:41.528 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:41.528 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:41.528 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:41.787 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:41.787 "name": "pt3", 00:19:41.787 "aliases": [ 00:19:41.787 "00000000-0000-0000-0000-000000000003" 00:19:41.787 ], 00:19:41.787 "product_name": "passthru", 00:19:41.787 "block_size": 512, 00:19:41.787 "num_blocks": 65536, 00:19:41.787 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:41.787 "assigned_rate_limits": { 00:19:41.787 "rw_ios_per_sec": 0, 00:19:41.787 "rw_mbytes_per_sec": 0, 
00:19:41.787 "r_mbytes_per_sec": 0, 00:19:41.787 "w_mbytes_per_sec": 0 00:19:41.787 }, 00:19:41.787 "claimed": true, 00:19:41.787 "claim_type": "exclusive_write", 00:19:41.787 "zoned": false, 00:19:41.787 "supported_io_types": { 00:19:41.787 "read": true, 00:19:41.787 "write": true, 00:19:41.787 "unmap": true, 00:19:41.787 "flush": true, 00:19:41.787 "reset": true, 00:19:41.787 "nvme_admin": false, 00:19:41.787 "nvme_io": false, 00:19:41.787 "nvme_io_md": false, 00:19:41.787 "write_zeroes": true, 00:19:41.787 "zcopy": true, 00:19:41.787 "get_zone_info": false, 00:19:41.787 "zone_management": false, 00:19:41.787 "zone_append": false, 00:19:41.787 "compare": false, 00:19:41.787 "compare_and_write": false, 00:19:41.787 "abort": true, 00:19:41.787 "seek_hole": false, 00:19:41.787 "seek_data": false, 00:19:41.787 "copy": true, 00:19:41.787 "nvme_iov_md": false 00:19:41.787 }, 00:19:41.787 "memory_domains": [ 00:19:41.787 { 00:19:41.787 "dma_device_id": "system", 00:19:41.787 "dma_device_type": 1 00:19:41.787 }, 00:19:41.787 { 00:19:41.787 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:41.787 "dma_device_type": 2 00:19:41.787 } 00:19:41.787 ], 00:19:41.787 "driver_specific": { 00:19:41.787 "passthru": { 00:19:41.787 "name": "pt3", 00:19:41.787 "base_bdev_name": "malloc3" 00:19:41.787 } 00:19:41.787 } 00:19:41.787 }' 00:19:41.787 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:41.787 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:41.787 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:41.787 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:41.787 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:41.787 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:41.787 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 
-- # jq .md_interleave 00:19:42.047 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:42.047 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:42.047 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:42.047 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:42.047 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:42.047 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:42.047 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:42.047 22:48:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:42.306 22:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:42.306 "name": "pt4", 00:19:42.306 "aliases": [ 00:19:42.306 "00000000-0000-0000-0000-000000000004" 00:19:42.306 ], 00:19:42.306 "product_name": "passthru", 00:19:42.306 "block_size": 512, 00:19:42.306 "num_blocks": 65536, 00:19:42.306 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:42.306 "assigned_rate_limits": { 00:19:42.306 "rw_ios_per_sec": 0, 00:19:42.306 "rw_mbytes_per_sec": 0, 00:19:42.306 "r_mbytes_per_sec": 0, 00:19:42.306 "w_mbytes_per_sec": 0 00:19:42.306 }, 00:19:42.306 "claimed": true, 00:19:42.306 "claim_type": "exclusive_write", 00:19:42.306 "zoned": false, 00:19:42.306 "supported_io_types": { 00:19:42.306 "read": true, 00:19:42.306 "write": true, 00:19:42.306 "unmap": true, 00:19:42.306 "flush": true, 00:19:42.306 "reset": true, 00:19:42.306 "nvme_admin": false, 00:19:42.306 "nvme_io": false, 00:19:42.306 "nvme_io_md": false, 00:19:42.306 "write_zeroes": true, 00:19:42.306 "zcopy": true, 00:19:42.306 "get_zone_info": false, 00:19:42.306 
"zone_management": false, 00:19:42.306 "zone_append": false, 00:19:42.306 "compare": false, 00:19:42.306 "compare_and_write": false, 00:19:42.306 "abort": true, 00:19:42.306 "seek_hole": false, 00:19:42.306 "seek_data": false, 00:19:42.306 "copy": true, 00:19:42.306 "nvme_iov_md": false 00:19:42.306 }, 00:19:42.306 "memory_domains": [ 00:19:42.306 { 00:19:42.306 "dma_device_id": "system", 00:19:42.306 "dma_device_type": 1 00:19:42.306 }, 00:19:42.306 { 00:19:42.306 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.306 "dma_device_type": 2 00:19:42.306 } 00:19:42.306 ], 00:19:42.306 "driver_specific": { 00:19:42.306 "passthru": { 00:19:42.306 "name": "pt4", 00:19:42.306 "base_bdev_name": "malloc4" 00:19:42.306 } 00:19:42.306 } 00:19:42.306 }' 00:19:42.306 22:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:42.306 22:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:42.306 22:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:42.306 22:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:42.565 22:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:42.565 22:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:42.565 22:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:42.565 22:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:42.565 22:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:42.565 22:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:42.565 22:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:42.566 22:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:42.566 22:48:27 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:42.566 22:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:19:42.825 [2024-07-15 22:48:27.666048] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:42.825 22:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' a6c9583f-5590-4506-a18f-068aec84af12 '!=' a6c9583f-5590-4506-a18f-068aec84af12 ']' 00:19:42.825 22:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:19:42.825 22:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:42.825 22:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:42.825 22:48:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2771371 00:19:42.825 22:48:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2771371 ']' 00:19:42.825 22:48:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2771371 00:19:42.825 22:48:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:19:42.825 22:48:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:42.825 22:48:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2771371 00:19:42.825 22:48:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:42.825 22:48:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:42.825 22:48:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2771371' 00:19:42.825 killing process with pid 2771371 00:19:42.825 22:48:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2771371 
00:19:42.825 [2024-07-15 22:48:27.734154] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:42.825 [2024-07-15 22:48:27.734219] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:42.825 [2024-07-15 22:48:27.734284] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:42.825 [2024-07-15 22:48:27.734296] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1531780 name raid_bdev1, state offline 00:19:43.084 22:48:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2771371 00:19:43.084 [2024-07-15 22:48:27.777658] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:43.343 22:48:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:19:43.343 00:19:43.343 real 0m18.293s 00:19:43.343 user 0m33.151s 00:19:43.343 sys 0m3.141s 00:19:43.343 22:48:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:43.343 22:48:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:43.343 ************************************ 00:19:43.343 END TEST raid_superblock_test 00:19:43.343 ************************************ 00:19:43.343 22:48:28 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:43.343 22:48:28 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:19:43.343 22:48:28 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:43.343 22:48:28 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:43.343 22:48:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:43.343 ************************************ 00:19:43.343 START TEST raid_read_error_test 00:19:43.343 ************************************ 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 read 00:19:43.343 
22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:19:43.343 22:48:28 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.yNmj4GZfTY 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2773993 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2773993 /var/tmp/spdk-raid.sock 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2773993 ']' 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen 
on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:43.343 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:43.343 22:48:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:43.343 [2024-07-15 22:48:28.166129] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:19:43.343 [2024-07-15 22:48:28.166198] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2773993 ] 00:19:43.602 [2024-07-15 22:48:28.286017] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:43.602 [2024-07-15 22:48:28.394836] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:43.602 [2024-07-15 22:48:28.462679] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:43.602 [2024-07-15 22:48:28.462713] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:44.538 22:48:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:44.539 22:48:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:19:44.539 22:48:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:44.539 22:48:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:44.539 BaseBdev1_malloc 00:19:44.539 22:48:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 
00:19:44.796 true 00:19:44.796 22:48:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:45.054 [2024-07-15 22:48:29.853278] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:45.054 [2024-07-15 22:48:29.853324] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:45.054 [2024-07-15 22:48:29.853349] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12260d0 00:19:45.054 [2024-07-15 22:48:29.853361] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:45.054 [2024-07-15 22:48:29.855280] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:45.054 [2024-07-15 22:48:29.855312] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:45.054 BaseBdev1 00:19:45.054 22:48:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:45.054 22:48:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:45.311 BaseBdev2_malloc 00:19:45.311 22:48:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:45.568 true 00:19:45.568 22:48:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:45.825 [2024-07-15 22:48:30.656072] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:45.825 [2024-07-15 22:48:30.656120] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:45.825 [2024-07-15 22:48:30.656143] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x122a910 00:19:45.825 [2024-07-15 22:48:30.656156] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:45.825 [2024-07-15 22:48:30.657762] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:45.825 [2024-07-15 22:48:30.657790] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:45.825 BaseBdev2 00:19:45.825 22:48:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:45.825 22:48:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:46.082 BaseBdev3_malloc 00:19:46.082 22:48:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:46.339 true 00:19:46.340 22:48:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:46.597 [2024-07-15 22:48:31.395863] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:46.597 [2024-07-15 22:48:31.395905] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:46.597 [2024-07-15 22:48:31.395934] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x122cbd0 00:19:46.597 [2024-07-15 22:48:31.395947] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:46.597 [2024-07-15 22:48:31.397486] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:19:46.597 [2024-07-15 22:48:31.397516] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:46.597 BaseBdev3 00:19:46.597 22:48:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:46.597 22:48:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:46.854 BaseBdev4_malloc 00:19:46.854 22:48:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:47.112 true 00:19:47.112 22:48:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:47.369 [2024-07-15 22:48:32.130499] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:47.369 [2024-07-15 22:48:32.130543] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:47.369 [2024-07-15 22:48:32.130568] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x122daa0 00:19:47.369 [2024-07-15 22:48:32.130581] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:47.369 [2024-07-15 22:48:32.132183] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:47.369 [2024-07-15 22:48:32.132214] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:47.369 BaseBdev4 00:19:47.369 22:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 
BaseBdev4' -n raid_bdev1 -s 00:19:47.627 [2024-07-15 22:48:32.363170] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:47.627 [2024-07-15 22:48:32.364531] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:47.627 [2024-07-15 22:48:32.364601] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:47.627 [2024-07-15 22:48:32.364663] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:47.627 [2024-07-15 22:48:32.364902] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1227c20 00:19:47.627 [2024-07-15 22:48:32.364914] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:47.627 [2024-07-15 22:48:32.365129] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x107c260 00:19:47.627 [2024-07-15 22:48:32.365287] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1227c20 00:19:47.627 [2024-07-15 22:48:32.365297] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1227c20 00:19:47.627 [2024-07-15 22:48:32.365408] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:47.627 22:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:47.627 22:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:47.627 22:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:47.627 22:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:47.628 22:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:47.628 22:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:47.628 22:48:32 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:47.628 22:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:47.628 22:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:47.628 22:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:47.628 22:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:47.628 22:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:47.886 22:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:47.886 "name": "raid_bdev1", 00:19:47.886 "uuid": "3839f58e-ceaf-4b04-a50b-7c7daea23bc3", 00:19:47.886 "strip_size_kb": 64, 00:19:47.886 "state": "online", 00:19:47.886 "raid_level": "raid0", 00:19:47.886 "superblock": true, 00:19:47.886 "num_base_bdevs": 4, 00:19:47.886 "num_base_bdevs_discovered": 4, 00:19:47.886 "num_base_bdevs_operational": 4, 00:19:47.886 "base_bdevs_list": [ 00:19:47.886 { 00:19:47.886 "name": "BaseBdev1", 00:19:47.886 "uuid": "d10103be-8378-5735-b095-5267a74fbf58", 00:19:47.886 "is_configured": true, 00:19:47.886 "data_offset": 2048, 00:19:47.886 "data_size": 63488 00:19:47.886 }, 00:19:47.886 { 00:19:47.886 "name": "BaseBdev2", 00:19:47.886 "uuid": "fc09c090-0f00-59eb-bec5-1070e25bf96c", 00:19:47.886 "is_configured": true, 00:19:47.886 "data_offset": 2048, 00:19:47.886 "data_size": 63488 00:19:47.886 }, 00:19:47.886 { 00:19:47.886 "name": "BaseBdev3", 00:19:47.886 "uuid": "63958593-1f9a-5457-870f-196bbabbf14a", 00:19:47.886 "is_configured": true, 00:19:47.886 "data_offset": 2048, 00:19:47.886 "data_size": 63488 00:19:47.886 }, 00:19:47.886 { 00:19:47.886 "name": "BaseBdev4", 00:19:47.886 "uuid": 
"97c60756-b466-555c-9e13-a9bc03e61360", 00:19:47.886 "is_configured": true, 00:19:47.886 "data_offset": 2048, 00:19:47.886 "data_size": 63488 00:19:47.886 } 00:19:47.886 ] 00:19:47.886 }' 00:19:47.886 22:48:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:47.886 22:48:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:48.455 22:48:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:48.455 22:48:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:48.455 [2024-07-15 22:48:33.309955] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1219fc0 00:19:49.391 22:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:19:49.650 22:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:49.650 22:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:19:49.650 22:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:19:49.650 22:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:49.650 22:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:49.650 22:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:49.650 22:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:49.650 22:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:49.650 22:48:34 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:49.650 22:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:49.650 22:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:49.650 22:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:49.650 22:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:49.650 22:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:49.650 22:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:49.909 22:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:49.909 "name": "raid_bdev1", 00:19:49.909 "uuid": "3839f58e-ceaf-4b04-a50b-7c7daea23bc3", 00:19:49.909 "strip_size_kb": 64, 00:19:49.909 "state": "online", 00:19:49.909 "raid_level": "raid0", 00:19:49.909 "superblock": true, 00:19:49.909 "num_base_bdevs": 4, 00:19:49.909 "num_base_bdevs_discovered": 4, 00:19:49.909 "num_base_bdevs_operational": 4, 00:19:49.909 "base_bdevs_list": [ 00:19:49.909 { 00:19:49.909 "name": "BaseBdev1", 00:19:49.909 "uuid": "d10103be-8378-5735-b095-5267a74fbf58", 00:19:49.909 "is_configured": true, 00:19:49.909 "data_offset": 2048, 00:19:49.909 "data_size": 63488 00:19:49.909 }, 00:19:49.909 { 00:19:49.909 "name": "BaseBdev2", 00:19:49.909 "uuid": "fc09c090-0f00-59eb-bec5-1070e25bf96c", 00:19:49.909 "is_configured": true, 00:19:49.909 "data_offset": 2048, 00:19:49.909 "data_size": 63488 00:19:49.909 }, 00:19:49.909 { 00:19:49.909 "name": "BaseBdev3", 00:19:49.909 "uuid": "63958593-1f9a-5457-870f-196bbabbf14a", 00:19:49.909 "is_configured": true, 00:19:49.909 "data_offset": 2048, 00:19:49.909 "data_size": 63488 00:19:49.909 }, 00:19:49.909 { 
00:19:49.909 "name": "BaseBdev4", 00:19:49.909 "uuid": "97c60756-b466-555c-9e13-a9bc03e61360", 00:19:49.910 "is_configured": true, 00:19:49.910 "data_offset": 2048, 00:19:49.910 "data_size": 63488 00:19:49.910 } 00:19:49.910 ] 00:19:49.910 }' 00:19:49.910 22:48:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:49.910 22:48:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:50.477 22:48:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:50.736 [2024-07-15 22:48:35.544373] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:50.736 [2024-07-15 22:48:35.544411] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:50.736 [2024-07-15 22:48:35.547606] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:50.736 [2024-07-15 22:48:35.547645] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:50.736 [2024-07-15 22:48:35.547686] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:50.736 [2024-07-15 22:48:35.547698] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1227c20 name raid_bdev1, state offline 00:19:50.736 0 00:19:50.736 22:48:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2773993 00:19:50.736 22:48:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2773993 ']' 00:19:50.736 22:48:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2773993 00:19:50.736 22:48:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:19:50.736 22:48:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:50.736 22:48:35 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2773993 00:19:50.736 22:48:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:50.736 22:48:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:50.736 22:48:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2773993' 00:19:50.736 killing process with pid 2773993 00:19:50.736 22:48:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2773993 00:19:50.736 [2024-07-15 22:48:35.615717] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:50.736 22:48:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2773993 00:19:50.995 [2024-07-15 22:48:35.652527] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:50.995 22:48:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.yNmj4GZfTY 00:19:50.995 22:48:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:50.995 22:48:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:50.995 22:48:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:19:50.995 22:48:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:19:50.995 22:48:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:50.995 22:48:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:50.995 22:48:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:19:50.995 00:19:50.995 real 0m7.805s 00:19:50.995 user 0m12.537s 00:19:50.995 sys 0m1.355s 00:19:50.995 22:48:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:50.995 22:48:35 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:19:50.995 ************************************ 00:19:50.995 END TEST raid_read_error_test 00:19:50.995 ************************************ 00:19:51.254 22:48:35 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:51.254 22:48:35 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:19:51.254 22:48:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:51.254 22:48:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:51.254 22:48:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:51.254 ************************************ 00:19:51.254 START TEST raid_write_error_test 00:19:51.254 ************************************ 00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 write 00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3
00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4
00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4')
00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs
00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1
00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size
00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg
00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log
00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s
00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']'
00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64
00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64'
00:19:51.254 22:48:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest
00:19:51.254 22:48:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.N40NQoQbor
00:19:51.254 22:48:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2775122
00:19:51.254 22:48:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2775122 /var/tmp/spdk-raid.sock
00:19:51.254 22:48:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid
00:19:51.254 22:48:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2775122 ']'
00:19:51.254 22:48:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:19:51.254 22:48:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100
00:19:51.254 22:48:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
00:19:51.254 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:19:51.254 22:48:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable
00:19:51.254 22:48:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:19:51.254 [2024-07-15 22:48:36.066978] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
00:19:51.254 [2024-07-15 22:48:36.067054] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2775122 ]
00:19:51.513 [2024-07-15 22:48:36.198165] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:19:51.513 [2024-07-15 22:48:36.302888] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:19:51.513 [2024-07-15 22:48:36.360639] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:19:51.513 [2024-07-15 22:48:36.360669] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:19:52.449 22:48:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:19:52.450 22:48:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0
00:19:52.450 22:48:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:19:52.450 22:48:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
00:19:52.450 BaseBdev1_malloc
00:19:52.450 22:48:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
00:19:52.709 true
00:19:52.709 22:48:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
00:19:52.968 [2024-07-15 22:48:37.733914] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc
00:19:52.968 [2024-07-15 22:48:37.733969] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:19:52.968 [2024-07-15 22:48:37.733990] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23850d0
00:19:52.968 [2024-07-15 22:48:37.734003] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:19:52.968 [2024-07-15 22:48:37.735745] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:19:52.968 [2024-07-15 22:48:37.735778] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1
00:19:52.968 BaseBdev1
00:19:52.968 22:48:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:19:52.968 22:48:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc
00:19:53.226 BaseBdev2_malloc
00:19:53.226 22:48:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc
00:19:53.485 true
00:19:53.485 22:48:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2
00:19:53.744 [2024-07-15 22:48:38.484572] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc
00:19:53.744 [2024-07-15 22:48:38.484619] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:19:53.744 [2024-07-15 22:48:38.484639] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2389910
00:19:53.744 [2024-07-15 22:48:38.484652] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:19:53.744 [2024-07-15 22:48:38.486098] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:19:53.744 [2024-07-15 22:48:38.486129] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2
00:19:53.744 BaseBdev2
00:19:53.744 22:48:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:19:53.744 22:48:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc
00:19:54.056 BaseBdev3_malloc
00:19:54.056 22:48:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc
00:19:54.315 true
00:19:54.315 22:48:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3
00:19:54.574 [2024-07-15 22:48:39.231199] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc
00:19:54.574 [2024-07-15 22:48:39.231250] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:19:54.574 [2024-07-15 22:48:39.231270] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x238bbd0
00:19:54.574 [2024-07-15 22:48:39.231284] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:19:54.574 [2024-07-15 22:48:39.232732] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:19:54.574 [2024-07-15 22:48:39.232762] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3
00:19:54.574 BaseBdev3
00:19:54.574 22:48:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:19:54.574 22:48:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc
00:19:54.832 BaseBdev4_malloc
00:19:54.832 22:48:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc
00:19:54.832 true
00:19:55.092 22:48:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4
00:19:55.092 [2024-07-15 22:48:39.982051] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc
00:19:55.092 [2024-07-15 22:48:39.982102] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:19:55.092 [2024-07-15 22:48:39.982123] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x238caa0
00:19:55.092 [2024-07-15 22:48:39.982136] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:19:55.092 [2024-07-15 22:48:39.983597] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:19:55.092 [2024-07-15 22:48:39.983627] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4
00:19:55.092 BaseBdev4
00:19:55.350 22:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s
00:19:55.350 [2024-07-15 22:48:40.222731] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:19:55.351 [2024-07-15 22:48:40.224029] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:19:55.351 [2024-07-15 22:48:40.224100] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:19:55.351 [2024-07-15 22:48:40.224162] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed
00:19:55.351 [2024-07-15 22:48:40.224397] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2386c20
00:19:55.351 [2024-07-15 22:48:40.224409] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512
00:19:55.351 [2024-07-15 22:48:40.224610] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21db260
00:19:55.351 [2024-07-15 22:48:40.224761] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2386c20
00:19:55.351 [2024-07-15 22:48:40.224771] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2386c20
00:19:55.351 [2024-07-15 22:48:40.224875] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:19:55.351 22:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4
00:19:55.351 22:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:19:55.351 22:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:19:55.351 22:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:19:55.351 22:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:19:55.351 22:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:19:55.351 22:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:19:55.351 22:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:19:55.351 22:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:19:55.351 22:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:19:55.351 22:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:55.351 22:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:19:55.609 22:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:19:55.609 "name": "raid_bdev1",
00:19:55.609 "uuid": "95e74a16-1c0f-4245-bbc5-6a85cb7b475b",
00:19:55.609 "strip_size_kb": 64,
00:19:55.609 "state": "online",
00:19:55.609 "raid_level": "raid0",
00:19:55.609 "superblock": true,
00:19:55.609 "num_base_bdevs": 4,
00:19:55.609 "num_base_bdevs_discovered": 4,
00:19:55.609 "num_base_bdevs_operational": 4,
00:19:55.609 "base_bdevs_list": [
00:19:55.609 {
00:19:55.609 "name": "BaseBdev1",
00:19:55.609 "uuid": "c9299925-0f6a-5dcd-884b-572a90f1051f",
00:19:55.609 "is_configured": true,
00:19:55.609 "data_offset": 2048,
00:19:55.609 "data_size": 63488
00:19:55.609 },
00:19:55.609 {
00:19:55.609 "name": "BaseBdev2",
00:19:55.609 "uuid": "7b38e358-de56-5686-afd6-f580a3fe7589",
00:19:55.609 "is_configured": true,
00:19:55.609 "data_offset": 2048,
00:19:55.609 "data_size": 63488
00:19:55.609 },
00:19:55.609 {
00:19:55.609 "name": "BaseBdev3",
00:19:55.609 "uuid": "5f808226-69c3-58ef-8cb1-f407441c63ad",
00:19:55.609 "is_configured": true,
00:19:55.609 "data_offset": 2048,
00:19:55.609 "data_size": 63488
00:19:55.609 },
00:19:55.609 {
00:19:55.610 "name": "BaseBdev4",
00:19:55.610 "uuid": "e245a669-da87-50f7-bf2a-611df755c422",
00:19:55.610 "is_configured": true,
00:19:55.610 "data_offset": 2048,
00:19:55.610 "data_size": 63488
00:19:55.610 }
00:19:55.610 ]
00:19:55.610 }'
00:19:55.610 22:48:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:19:55.610 22:48:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:19:56.548 22:48:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1
00:19:56.548 22:48:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
00:19:56.548 [2024-07-15 22:48:41.181565] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2378fc0
00:19:57.484 22:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure
00:19:57.484 22:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs
00:19:57.484 22:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]]
00:19:57.485 22:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4
00:19:57.485 22:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4
00:19:57.485 22:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:19:57.485 22:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:19:57.485 22:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:19:57.485 22:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:19:57.485 22:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:19:57.485 22:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:19:57.485 22:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:19:57.485 22:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:19:57.485 22:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:19:57.485 22:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:57.485 22:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:19:57.742 22:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:19:57.742 "name": "raid_bdev1",
00:19:57.742 "uuid": "95e74a16-1c0f-4245-bbc5-6a85cb7b475b",
00:19:57.742 "strip_size_kb": 64,
00:19:57.742 "state": "online",
00:19:57.742 "raid_level": "raid0",
00:19:57.742 "superblock": true,
00:19:57.742 "num_base_bdevs": 4,
00:19:57.742 "num_base_bdevs_discovered": 4,
00:19:57.742 "num_base_bdevs_operational": 4,
00:19:57.742 "base_bdevs_list": [
00:19:57.742 {
00:19:57.742 "name": "BaseBdev1",
00:19:57.742 "uuid": "c9299925-0f6a-5dcd-884b-572a90f1051f",
00:19:57.742 "is_configured": true,
00:19:57.742 "data_offset": 2048,
00:19:57.742 "data_size": 63488
00:19:57.742 },
00:19:57.742 {
00:19:57.742 "name": "BaseBdev2",
00:19:57.742 "uuid": "7b38e358-de56-5686-afd6-f580a3fe7589",
00:19:57.742 "is_configured": true,
00:19:57.742 "data_offset": 2048,
00:19:57.742 "data_size": 63488
00:19:57.742 },
00:19:57.742 {
00:19:57.742 "name": "BaseBdev3",
00:19:57.742 "uuid": "5f808226-69c3-58ef-8cb1-f407441c63ad",
00:19:57.742 "is_configured": true,
00:19:57.742 "data_offset": 2048,
00:19:57.742 "data_size": 63488
00:19:57.742 },
00:19:57.742 {
00:19:57.742 "name": "BaseBdev4",
00:19:57.742 "uuid": "e245a669-da87-50f7-bf2a-611df755c422",
00:19:57.742 "is_configured": true,
00:19:57.742 "data_offset": 2048,
00:19:57.742 "data_size": 63488
00:19:57.742 }
00:19:57.742 ]
00:19:57.742 }'
00:19:57.742 22:48:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:19:57.742 22:48:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:19:58.310 22:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:19:58.569 [2024-07-15 22:48:43.302303] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:19:58.569 [2024-07-15 22:48:43.302345] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:19:58.569 [2024-07-15 22:48:43.305532] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:19:58.569 [2024-07-15 22:48:43.305574] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:19:58.569 [2024-07-15 22:48:43.305615] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:19:58.569 [2024-07-15 22:48:43.305627] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2386c20 name raid_bdev1, state offline
00:19:58.569 0
00:19:58.569 22:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2775122
00:19:58.569 22:48:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2775122 ']'
00:19:58.569 22:48:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2775122
00:19:58.569 22:48:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname
00:19:58.569 22:48:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:19:58.569 22:48:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2775122
00:19:58.569 22:48:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:19:58.569 22:48:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:19:58.569 22:48:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2775122'
00:19:58.569 killing process with pid 2775122
00:19:58.569 22:48:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2775122
00:19:58.569 [2024-07-15 22:48:43.387336] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:19:58.569 22:48:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2775122
00:19:58.569 [2024-07-15 22:48:43.419748] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:19:58.827 22:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.N40NQoQbor
00:19:58.827 22:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1
00:19:58.827 22:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}'
00:19:58.827 22:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47
00:19:58.827 22:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0
00:19:58.827 22:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in
00:19:58.827 22:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1
00:19:58.827 22:48:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]]
00:19:58.827
00:19:58.827 real 0m7.675s
00:19:58.827 user 0m12.220s
00:19:58.827 sys 0m1.382s
00:19:58.827 22:48:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable
00:19:58.827 22:48:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:19:58.827 ************************************
00:19:58.827 END TEST raid_write_error_test
00:19:58.827 ************************************
00:19:58.827 22:48:43 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:19:58.827 22:48:43 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1
00:19:58.827 22:48:43 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false
00:19:58.827 22:48:43 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:19:58.827 22:48:43 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:19:58.827 22:48:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:19:59.086 ************************************
00:19:59.086 START TEST raid_state_function_test
00:19:59.086 ************************************
00:19:59.086 22:48:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 false
00:19:59.086 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat
00:19:59.086 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4
00:19:59.086 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false
00:19:59.086 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev
00:19:59.086 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 ))
00:19:59.086 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:19:59.086 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1
00:19:59.086 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:19:59.086 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:19:59.086 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2
00:19:59.086 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:19:59.086 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:19:59.086 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3
00:19:59.086 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:19:59.086 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:19:59.086 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4
00:19:59.086 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:19:59.086 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:19:59.086 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4')
00:19:59.086 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs
00:19:59.087 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid
00:19:59.087 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size
00:19:59.087 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg
00:19:59.087 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg
00:19:59.087 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']'
00:19:59.087 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64
00:19:59.087 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64'
00:19:59.087 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']'
00:19:59.087 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg=
00:19:59.087 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2776263
00:19:59.087 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2776263'
00:19:59.087 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid
00:19:59.087 Process raid pid: 2776263
00:19:59.087 22:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2776263 /var/tmp/spdk-raid.sock
00:19:59.087 22:48:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2776263 ']'
00:19:59.087 22:48:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:19:59.087 22:48:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100
00:19:59.087 22:48:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
00:19:59.087 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:19:59.087 22:48:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable
00:19:59.087 22:48:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:19:59.087 [2024-07-15 22:48:43.806369] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
00:19:59.087 [2024-07-15 22:48:43.806426] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:19:59.087 [2024-07-15 22:48:43.921254] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:19:59.346 [2024-07-15 22:48:44.031057] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:19:59.346 [2024-07-15 22:48:44.094984] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:19:59.346 [2024-07-15 22:48:44.095018] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:19:59.912 22:48:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:19:59.912 22:48:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0
00:19:59.912 22:48:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
00:20:00.171 [2024-07-15 22:48:44.848231] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:20:00.171 [2024-07-15 22:48:44.848281] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:20:00.171 [2024-07-15 22:48:44.848292] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:20:00.171 [2024-07-15 22:48:44.848303] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:20:00.171 [2024-07-15 22:48:44.848312] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3
00:20:00.171 [2024-07-15 22:48:44.848324] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now
00:20:00.171 [2024-07-15 22:48:44.848332] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4
00:20:00.171 [2024-07-15 22:48:44.848343] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now
00:20:00.171 22:48:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:20:00.171 22:48:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:20:00.171 22:48:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:20:00.171 22:48:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:20:00.171 22:48:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:20:00.171 22:48:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:20:00.171 22:48:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:20:00.171 22:48:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:20:00.171 22:48:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:20:00.171 22:48:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:20:00.171 22:48:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:20:00.171 22:48:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:20:00.430 22:48:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:20:00.430 "name": "Existed_Raid",
00:20:00.430 "uuid": "00000000-0000-0000-0000-000000000000",
00:20:00.430 "strip_size_kb": 64,
00:20:00.430 "state": "configuring",
00:20:00.430 "raid_level": "concat",
00:20:00.430 "superblock": false,
00:20:00.430 "num_base_bdevs": 4,
00:20:00.430 "num_base_bdevs_discovered": 0,
00:20:00.430 "num_base_bdevs_operational": 4,
00:20:00.430 "base_bdevs_list": [
00:20:00.430 {
00:20:00.430 "name": "BaseBdev1",
00:20:00.430 "uuid": "00000000-0000-0000-0000-000000000000",
00:20:00.430 "is_configured": false,
00:20:00.430 "data_offset": 0,
00:20:00.430 "data_size": 0
00:20:00.430 },
00:20:00.430 {
00:20:00.430 "name": "BaseBdev2",
00:20:00.430 "uuid": "00000000-0000-0000-0000-000000000000",
00:20:00.430 "is_configured": false,
00:20:00.430 "data_offset": 0,
00:20:00.430 "data_size": 0
00:20:00.430 },
00:20:00.430 {
00:20:00.430 "name": "BaseBdev3",
00:20:00.430 "uuid": "00000000-0000-0000-0000-000000000000",
00:20:00.430 "is_configured": false,
00:20:00.430 "data_offset": 0,
00:20:00.430 "data_size": 0
00:20:00.430 },
00:20:00.430 {
00:20:00.430 "name": "BaseBdev4",
00:20:00.430 "uuid": "00000000-0000-0000-0000-000000000000",
00:20:00.430 "is_configured": false,
00:20:00.430 "data_offset": 0,
00:20:00.430 "data_size": 0
00:20:00.430 }
00:20:00.430 ]
00:20:00.430 }'
00:20:00.430 22:48:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:20:00.430 22:48:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:20:00.998 22:48:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:20:00.998 [2024-07-15 22:48:45.790604] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:20:00.998 [2024-07-15 22:48:45.790639] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22d4aa0 name Existed_Raid, state configuring
00:20:00.998 22:48:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
00:20:01.257 [2024-07-15 22:48:45.963089] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:20:01.257 [2024-07-15 22:48:45.963121] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:20:01.257 [2024-07-15 22:48:45.963131] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:20:01.257 [2024-07-15 22:48:45.963143] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:20:01.257 [2024-07-15 22:48:45.963151] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3
00:20:01.257 [2024-07-15 22:48:45.963163] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now
00:20:01.257 [2024-07-15 22:48:45.963172] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4
00:20:01.257 [2024-07-15 22:48:45.963183] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now
00:20:01.257 22:48:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
00:20:01.515 [2024-07-15 22:48:46.221785] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:20:01.515 BaseBdev1
00:20:01.515 22:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1
00:20:01.515 22:48:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1
00:20:01.515 22:48:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:20:01.515 22:48:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:20:01.515 22:48:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:20:01.515 22:48:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:20:01.515 22:48:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:20:01.774 22:48:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:20:01.774 [
00:20:01.774 {
00:20:01.774 "name": "BaseBdev1",
00:20:01.774 "aliases": [
00:20:01.774 "230e04f3-27b7-4b54-8a4a-258e3b3872ec"
00:20:01.774 ],
00:20:01.774 "product_name": "Malloc disk",
00:20:01.774 "block_size": 512,
00:20:01.774 "num_blocks": 65536,
00:20:01.774 "uuid": "230e04f3-27b7-4b54-8a4a-258e3b3872ec",
00:20:01.774 "assigned_rate_limits": {
00:20:01.774 "rw_ios_per_sec": 0,
00:20:01.774 "rw_mbytes_per_sec": 0,
00:20:01.774 "r_mbytes_per_sec": 0,
00:20:01.774 "w_mbytes_per_sec": 0
00:20:01.774 },
00:20:01.774 "claimed": true,
00:20:01.774 "claim_type": "exclusive_write",
00:20:01.774 "zoned": false,
00:20:01.774 "supported_io_types": {
00:20:01.774 "read": true,
00:20:01.774 "write": true,
00:20:01.774 "unmap": true,
00:20:01.774 "flush": true,
00:20:01.774 "reset": true,
00:20:01.774 "nvme_admin": false,
00:20:01.774 "nvme_io": false,
00:20:01.774 "nvme_io_md": false,
00:20:01.774 "write_zeroes": true,
00:20:01.774 "zcopy": true,
00:20:01.774 "get_zone_info": false,
00:20:01.774 "zone_management": false,
00:20:01.774 "zone_append": false,
00:20:01.774 "compare": false,
00:20:01.774 "compare_and_write": false,
00:20:01.774 "abort": true,
00:20:01.774 "seek_hole": false,
00:20:01.774 "seek_data": false,
00:20:01.774 "copy": true,
00:20:01.774 "nvme_iov_md": false
00:20:01.774 },
00:20:01.774 "memory_domains": [
00:20:01.774 {
00:20:01.774 "dma_device_id": "system",
00:20:01.774 "dma_device_type": 1
00:20:01.774 },
00:20:01.774 {
00:20:01.774 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:20:01.774 "dma_device_type": 2
00:20:01.774 }
00:20:01.774 ],
00:20:01.774 "driver_specific": {}
00:20:01.774 }
00:20:01.774 ]
00:20:01.774 22:48:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:20:01.774 22:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:20:01.774 22:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:20:01.774 22:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:20:01.774 22:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:20:01.774 22:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:20:01.774 22:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:20:01.774 22:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:20:01.774 22:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:20:01.774 22:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:20:01.774 22:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:20:01.774 22:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:20:01.774 22:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:20:02.033 22:48:46
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:02.033 "name": "Existed_Raid", 00:20:02.033 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:02.033 "strip_size_kb": 64, 00:20:02.033 "state": "configuring", 00:20:02.033 "raid_level": "concat", 00:20:02.033 "superblock": false, 00:20:02.033 "num_base_bdevs": 4, 00:20:02.033 "num_base_bdevs_discovered": 1, 00:20:02.033 "num_base_bdevs_operational": 4, 00:20:02.033 "base_bdevs_list": [ 00:20:02.033 { 00:20:02.033 "name": "BaseBdev1", 00:20:02.033 "uuid": "230e04f3-27b7-4b54-8a4a-258e3b3872ec", 00:20:02.033 "is_configured": true, 00:20:02.033 "data_offset": 0, 00:20:02.033 "data_size": 65536 00:20:02.033 }, 00:20:02.033 { 00:20:02.033 "name": "BaseBdev2", 00:20:02.033 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:02.033 "is_configured": false, 00:20:02.033 "data_offset": 0, 00:20:02.033 "data_size": 0 00:20:02.033 }, 00:20:02.033 { 00:20:02.033 "name": "BaseBdev3", 00:20:02.033 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:02.034 "is_configured": false, 00:20:02.034 "data_offset": 0, 00:20:02.034 "data_size": 0 00:20:02.034 }, 00:20:02.034 { 00:20:02.034 "name": "BaseBdev4", 00:20:02.034 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:02.034 "is_configured": false, 00:20:02.034 "data_offset": 0, 00:20:02.034 "data_size": 0 00:20:02.034 } 00:20:02.034 ] 00:20:02.034 }' 00:20:02.034 22:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:02.034 22:48:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:02.601 22:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:02.860 [2024-07-15 22:48:47.697694] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:02.860 [2024-07-15 22:48:47.697745] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22d4310 name Existed_Raid, state configuring 00:20:02.860 22:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:03.120 [2024-07-15 22:48:47.942399] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:03.120 [2024-07-15 22:48:47.943917] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:03.120 [2024-07-15 22:48:47.943964] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:03.120 [2024-07-15 22:48:47.943975] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:03.120 [2024-07-15 22:48:47.943986] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:03.120 [2024-07-15 22:48:47.943995] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:03.120 [2024-07-15 22:48:47.944007] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:03.120 22:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:03.120 22:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:03.120 22:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:03.120 22:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:03.120 22:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:03.120 22:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 
00:20:03.120 22:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:03.120 22:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:03.120 22:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:03.120 22:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:03.120 22:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:03.120 22:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:03.120 22:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:03.120 22:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:03.379 22:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:03.379 "name": "Existed_Raid", 00:20:03.379 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:03.379 "strip_size_kb": 64, 00:20:03.379 "state": "configuring", 00:20:03.379 "raid_level": "concat", 00:20:03.379 "superblock": false, 00:20:03.379 "num_base_bdevs": 4, 00:20:03.379 "num_base_bdevs_discovered": 1, 00:20:03.379 "num_base_bdevs_operational": 4, 00:20:03.379 "base_bdevs_list": [ 00:20:03.379 { 00:20:03.379 "name": "BaseBdev1", 00:20:03.379 "uuid": "230e04f3-27b7-4b54-8a4a-258e3b3872ec", 00:20:03.379 "is_configured": true, 00:20:03.379 "data_offset": 0, 00:20:03.379 "data_size": 65536 00:20:03.379 }, 00:20:03.379 { 00:20:03.379 "name": "BaseBdev2", 00:20:03.379 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:03.379 "is_configured": false, 00:20:03.379 "data_offset": 0, 00:20:03.379 "data_size": 0 00:20:03.379 }, 00:20:03.379 { 00:20:03.379 "name": "BaseBdev3", 
00:20:03.379 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:03.379 "is_configured": false, 00:20:03.379 "data_offset": 0, 00:20:03.379 "data_size": 0 00:20:03.379 }, 00:20:03.379 { 00:20:03.379 "name": "BaseBdev4", 00:20:03.379 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:03.379 "is_configured": false, 00:20:03.379 "data_offset": 0, 00:20:03.379 "data_size": 0 00:20:03.379 } 00:20:03.379 ] 00:20:03.379 }' 00:20:03.379 22:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:03.379 22:48:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:03.944 22:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:04.509 [2024-07-15 22:48:49.293362] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:04.509 BaseBdev2 00:20:04.509 22:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:04.509 22:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:04.509 22:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:04.509 22:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:04.509 22:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:04.509 22:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:04.509 22:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:05.076 22:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:05.644 [ 00:20:05.644 { 00:20:05.644 "name": "BaseBdev2", 00:20:05.644 "aliases": [ 00:20:05.644 "e6a1c4d3-7621-490e-a9e6-950a6ed07b7d" 00:20:05.644 ], 00:20:05.644 "product_name": "Malloc disk", 00:20:05.644 "block_size": 512, 00:20:05.644 "num_blocks": 65536, 00:20:05.644 "uuid": "e6a1c4d3-7621-490e-a9e6-950a6ed07b7d", 00:20:05.644 "assigned_rate_limits": { 00:20:05.644 "rw_ios_per_sec": 0, 00:20:05.644 "rw_mbytes_per_sec": 0, 00:20:05.644 "r_mbytes_per_sec": 0, 00:20:05.644 "w_mbytes_per_sec": 0 00:20:05.644 }, 00:20:05.644 "claimed": true, 00:20:05.644 "claim_type": "exclusive_write", 00:20:05.644 "zoned": false, 00:20:05.644 "supported_io_types": { 00:20:05.644 "read": true, 00:20:05.644 "write": true, 00:20:05.644 "unmap": true, 00:20:05.644 "flush": true, 00:20:05.644 "reset": true, 00:20:05.644 "nvme_admin": false, 00:20:05.644 "nvme_io": false, 00:20:05.644 "nvme_io_md": false, 00:20:05.644 "write_zeroes": true, 00:20:05.644 "zcopy": true, 00:20:05.644 "get_zone_info": false, 00:20:05.644 "zone_management": false, 00:20:05.644 "zone_append": false, 00:20:05.644 "compare": false, 00:20:05.644 "compare_and_write": false, 00:20:05.644 "abort": true, 00:20:05.644 "seek_hole": false, 00:20:05.644 "seek_data": false, 00:20:05.644 "copy": true, 00:20:05.644 "nvme_iov_md": false 00:20:05.644 }, 00:20:05.644 "memory_domains": [ 00:20:05.644 { 00:20:05.644 "dma_device_id": "system", 00:20:05.644 "dma_device_type": 1 00:20:05.644 }, 00:20:05.644 { 00:20:05.644 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:05.644 "dma_device_type": 2 00:20:05.644 } 00:20:05.644 ], 00:20:05.644 "driver_specific": {} 00:20:05.644 } 00:20:05.644 ] 00:20:05.644 22:48:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:05.644 22:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:20:05.644 22:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:05.644 22:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:05.644 22:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:05.644 22:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:05.644 22:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:05.644 22:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:05.644 22:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:05.644 22:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:05.644 22:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:05.644 22:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:05.644 22:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:05.644 22:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.644 22:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:06.211 22:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:06.211 "name": "Existed_Raid", 00:20:06.211 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:06.211 "strip_size_kb": 64, 00:20:06.211 "state": "configuring", 00:20:06.211 "raid_level": "concat", 00:20:06.211 "superblock": false, 00:20:06.211 "num_base_bdevs": 4, 00:20:06.211 
"num_base_bdevs_discovered": 2, 00:20:06.211 "num_base_bdevs_operational": 4, 00:20:06.211 "base_bdevs_list": [ 00:20:06.211 { 00:20:06.211 "name": "BaseBdev1", 00:20:06.211 "uuid": "230e04f3-27b7-4b54-8a4a-258e3b3872ec", 00:20:06.211 "is_configured": true, 00:20:06.211 "data_offset": 0, 00:20:06.211 "data_size": 65536 00:20:06.211 }, 00:20:06.211 { 00:20:06.211 "name": "BaseBdev2", 00:20:06.211 "uuid": "e6a1c4d3-7621-490e-a9e6-950a6ed07b7d", 00:20:06.211 "is_configured": true, 00:20:06.211 "data_offset": 0, 00:20:06.211 "data_size": 65536 00:20:06.211 }, 00:20:06.211 { 00:20:06.211 "name": "BaseBdev3", 00:20:06.211 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:06.211 "is_configured": false, 00:20:06.211 "data_offset": 0, 00:20:06.211 "data_size": 0 00:20:06.211 }, 00:20:06.211 { 00:20:06.211 "name": "BaseBdev4", 00:20:06.211 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:06.211 "is_configured": false, 00:20:06.211 "data_offset": 0, 00:20:06.211 "data_size": 0 00:20:06.211 } 00:20:06.211 ] 00:20:06.211 }' 00:20:06.211 22:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:06.211 22:48:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:06.784 22:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:07.042 [2024-07-15 22:48:51.704639] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:07.042 BaseBdev3 00:20:07.042 22:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:07.042 22:48:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:07.042 22:48:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:07.042 22:48:51 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:07.042 22:48:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:07.042 22:48:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:07.042 22:48:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:07.299 22:48:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:07.299 [ 00:20:07.299 { 00:20:07.299 "name": "BaseBdev3", 00:20:07.299 "aliases": [ 00:20:07.299 "068f8034-fec7-48d1-8ddd-cb645ed84b10" 00:20:07.299 ], 00:20:07.299 "product_name": "Malloc disk", 00:20:07.299 "block_size": 512, 00:20:07.299 "num_blocks": 65536, 00:20:07.299 "uuid": "068f8034-fec7-48d1-8ddd-cb645ed84b10", 00:20:07.299 "assigned_rate_limits": { 00:20:07.299 "rw_ios_per_sec": 0, 00:20:07.299 "rw_mbytes_per_sec": 0, 00:20:07.299 "r_mbytes_per_sec": 0, 00:20:07.299 "w_mbytes_per_sec": 0 00:20:07.299 }, 00:20:07.299 "claimed": true, 00:20:07.299 "claim_type": "exclusive_write", 00:20:07.299 "zoned": false, 00:20:07.299 "supported_io_types": { 00:20:07.299 "read": true, 00:20:07.299 "write": true, 00:20:07.299 "unmap": true, 00:20:07.299 "flush": true, 00:20:07.299 "reset": true, 00:20:07.299 "nvme_admin": false, 00:20:07.299 "nvme_io": false, 00:20:07.299 "nvme_io_md": false, 00:20:07.299 "write_zeroes": true, 00:20:07.299 "zcopy": true, 00:20:07.299 "get_zone_info": false, 00:20:07.299 "zone_management": false, 00:20:07.299 "zone_append": false, 00:20:07.299 "compare": false, 00:20:07.299 "compare_and_write": false, 00:20:07.299 "abort": true, 00:20:07.299 "seek_hole": false, 00:20:07.299 "seek_data": false, 00:20:07.299 "copy": 
true, 00:20:07.299 "nvme_iov_md": false 00:20:07.299 }, 00:20:07.299 "memory_domains": [ 00:20:07.299 { 00:20:07.299 "dma_device_id": "system", 00:20:07.299 "dma_device_type": 1 00:20:07.299 }, 00:20:07.299 { 00:20:07.299 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:07.299 "dma_device_type": 2 00:20:07.299 } 00:20:07.299 ], 00:20:07.299 "driver_specific": {} 00:20:07.299 } 00:20:07.299 ] 00:20:07.299 22:48:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:07.299 22:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:07.299 22:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:07.299 22:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:07.557 22:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:07.557 22:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:07.557 22:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:07.557 22:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:07.557 22:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:07.557 22:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:07.557 22:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:07.557 22:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:07.557 22:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:07.557 22:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:07.557 22:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:07.557 22:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:07.557 "name": "Existed_Raid", 00:20:07.557 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:07.557 "strip_size_kb": 64, 00:20:07.557 "state": "configuring", 00:20:07.557 "raid_level": "concat", 00:20:07.557 "superblock": false, 00:20:07.557 "num_base_bdevs": 4, 00:20:07.557 "num_base_bdevs_discovered": 3, 00:20:07.557 "num_base_bdevs_operational": 4, 00:20:07.557 "base_bdevs_list": [ 00:20:07.557 { 00:20:07.557 "name": "BaseBdev1", 00:20:07.557 "uuid": "230e04f3-27b7-4b54-8a4a-258e3b3872ec", 00:20:07.557 "is_configured": true, 00:20:07.557 "data_offset": 0, 00:20:07.557 "data_size": 65536 00:20:07.557 }, 00:20:07.557 { 00:20:07.557 "name": "BaseBdev2", 00:20:07.557 "uuid": "e6a1c4d3-7621-490e-a9e6-950a6ed07b7d", 00:20:07.557 "is_configured": true, 00:20:07.557 "data_offset": 0, 00:20:07.557 "data_size": 65536 00:20:07.557 }, 00:20:07.557 { 00:20:07.557 "name": "BaseBdev3", 00:20:07.557 "uuid": "068f8034-fec7-48d1-8ddd-cb645ed84b10", 00:20:07.557 "is_configured": true, 00:20:07.557 "data_offset": 0, 00:20:07.557 "data_size": 65536 00:20:07.557 }, 00:20:07.557 { 00:20:07.557 "name": "BaseBdev4", 00:20:07.557 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:07.557 "is_configured": false, 00:20:07.557 "data_offset": 0, 00:20:07.557 "data_size": 0 00:20:07.557 } 00:20:07.557 ] 00:20:07.557 }' 00:20:07.557 22:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:07.557 22:48:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:08.491 22:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:08.491 [2024-07-15 22:48:53.268247] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:08.491 [2024-07-15 22:48:53.268288] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22d5350 00:20:08.491 [2024-07-15 22:48:53.268297] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:20:08.491 [2024-07-15 22:48:53.268557] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22d5020 00:20:08.491 [2024-07-15 22:48:53.268680] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22d5350 00:20:08.491 [2024-07-15 22:48:53.268690] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x22d5350 00:20:08.491 [2024-07-15 22:48:53.268856] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:08.491 BaseBdev4 00:20:08.491 22:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:08.491 22:48:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:08.491 22:48:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:08.491 22:48:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:08.491 22:48:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:08.491 22:48:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:08.491 22:48:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:09.064 22:48:53 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:09.629 [ 00:20:09.629 { 00:20:09.629 "name": "BaseBdev4", 00:20:09.629 "aliases": [ 00:20:09.629 "b382562b-15b4-410e-8513-1396af1a32a1" 00:20:09.629 ], 00:20:09.629 "product_name": "Malloc disk", 00:20:09.629 "block_size": 512, 00:20:09.629 "num_blocks": 65536, 00:20:09.629 "uuid": "b382562b-15b4-410e-8513-1396af1a32a1", 00:20:09.629 "assigned_rate_limits": { 00:20:09.629 "rw_ios_per_sec": 0, 00:20:09.629 "rw_mbytes_per_sec": 0, 00:20:09.629 "r_mbytes_per_sec": 0, 00:20:09.629 "w_mbytes_per_sec": 0 00:20:09.629 }, 00:20:09.629 "claimed": true, 00:20:09.629 "claim_type": "exclusive_write", 00:20:09.629 "zoned": false, 00:20:09.629 "supported_io_types": { 00:20:09.629 "read": true, 00:20:09.629 "write": true, 00:20:09.629 "unmap": true, 00:20:09.629 "flush": true, 00:20:09.629 "reset": true, 00:20:09.629 "nvme_admin": false, 00:20:09.629 "nvme_io": false, 00:20:09.629 "nvme_io_md": false, 00:20:09.629 "write_zeroes": true, 00:20:09.629 "zcopy": true, 00:20:09.629 "get_zone_info": false, 00:20:09.629 "zone_management": false, 00:20:09.629 "zone_append": false, 00:20:09.629 "compare": false, 00:20:09.629 "compare_and_write": false, 00:20:09.629 "abort": true, 00:20:09.629 "seek_hole": false, 00:20:09.629 "seek_data": false, 00:20:09.629 "copy": true, 00:20:09.629 "nvme_iov_md": false 00:20:09.629 }, 00:20:09.629 "memory_domains": [ 00:20:09.629 { 00:20:09.629 "dma_device_id": "system", 00:20:09.629 "dma_device_type": 1 00:20:09.629 }, 00:20:09.629 { 00:20:09.629 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:09.629 "dma_device_type": 2 00:20:09.629 } 00:20:09.629 ], 00:20:09.629 "driver_specific": {} 00:20:09.629 } 00:20:09.629 ] 00:20:09.629 22:48:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:09.629 22:48:54 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:09.629 22:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:09.629 22:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:09.629 22:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:09.629 22:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:09.629 22:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:09.629 22:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:09.629 22:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:09.629 22:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:09.629 22:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:09.629 22:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:09.629 22:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:09.629 22:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:09.629 22:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:09.888 22:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:09.888 "name": "Existed_Raid", 00:20:09.888 "uuid": "5654adb5-01ab-4cee-a055-c52ee02cd113", 00:20:09.888 "strip_size_kb": 64, 00:20:09.888 "state": "online", 00:20:09.888 "raid_level": "concat", 00:20:09.888 "superblock": false, 00:20:09.888 
"num_base_bdevs": 4, 00:20:09.888 "num_base_bdevs_discovered": 4, 00:20:09.888 "num_base_bdevs_operational": 4, 00:20:09.888 "base_bdevs_list": [ 00:20:09.888 { 00:20:09.888 "name": "BaseBdev1", 00:20:09.888 "uuid": "230e04f3-27b7-4b54-8a4a-258e3b3872ec", 00:20:09.888 "is_configured": true, 00:20:09.888 "data_offset": 0, 00:20:09.888 "data_size": 65536 00:20:09.888 }, 00:20:09.888 { 00:20:09.888 "name": "BaseBdev2", 00:20:09.888 "uuid": "e6a1c4d3-7621-490e-a9e6-950a6ed07b7d", 00:20:09.888 "is_configured": true, 00:20:09.888 "data_offset": 0, 00:20:09.888 "data_size": 65536 00:20:09.888 }, 00:20:09.888 { 00:20:09.888 "name": "BaseBdev3", 00:20:09.888 "uuid": "068f8034-fec7-48d1-8ddd-cb645ed84b10", 00:20:09.888 "is_configured": true, 00:20:09.888 "data_offset": 0, 00:20:09.888 "data_size": 65536 00:20:09.888 }, 00:20:09.888 { 00:20:09.888 "name": "BaseBdev4", 00:20:09.888 "uuid": "b382562b-15b4-410e-8513-1396af1a32a1", 00:20:09.888 "is_configured": true, 00:20:09.888 "data_offset": 0, 00:20:09.888 "data_size": 65536 00:20:09.888 } 00:20:09.888 ] 00:20:09.888 }' 00:20:09.888 22:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:09.888 22:48:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:10.455 22:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:20:10.455 22:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:10.455 22:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:10.455 22:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:10.455 22:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:10.455 22:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:10.455 22:48:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:10.455 22:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:10.714 [2024-07-15 22:48:55.382215] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:10.714 22:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:10.714 "name": "Existed_Raid", 00:20:10.714 "aliases": [ 00:20:10.714 "5654adb5-01ab-4cee-a055-c52ee02cd113" 00:20:10.714 ], 00:20:10.714 "product_name": "Raid Volume", 00:20:10.714 "block_size": 512, 00:20:10.714 "num_blocks": 262144, 00:20:10.714 "uuid": "5654adb5-01ab-4cee-a055-c52ee02cd113", 00:20:10.714 "assigned_rate_limits": { 00:20:10.714 "rw_ios_per_sec": 0, 00:20:10.714 "rw_mbytes_per_sec": 0, 00:20:10.714 "r_mbytes_per_sec": 0, 00:20:10.714 "w_mbytes_per_sec": 0 00:20:10.714 }, 00:20:10.714 "claimed": false, 00:20:10.714 "zoned": false, 00:20:10.714 "supported_io_types": { 00:20:10.714 "read": true, 00:20:10.714 "write": true, 00:20:10.714 "unmap": true, 00:20:10.714 "flush": true, 00:20:10.714 "reset": true, 00:20:10.714 "nvme_admin": false, 00:20:10.714 "nvme_io": false, 00:20:10.714 "nvme_io_md": false, 00:20:10.714 "write_zeroes": true, 00:20:10.714 "zcopy": false, 00:20:10.714 "get_zone_info": false, 00:20:10.714 "zone_management": false, 00:20:10.714 "zone_append": false, 00:20:10.714 "compare": false, 00:20:10.714 "compare_and_write": false, 00:20:10.714 "abort": false, 00:20:10.714 "seek_hole": false, 00:20:10.714 "seek_data": false, 00:20:10.714 "copy": false, 00:20:10.714 "nvme_iov_md": false 00:20:10.714 }, 00:20:10.714 "memory_domains": [ 00:20:10.714 { 00:20:10.714 "dma_device_id": "system", 00:20:10.714 "dma_device_type": 1 00:20:10.714 }, 00:20:10.714 { 00:20:10.714 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:10.714 
"dma_device_type": 2 00:20:10.714 }, 00:20:10.714 { 00:20:10.714 "dma_device_id": "system", 00:20:10.714 "dma_device_type": 1 00:20:10.714 }, 00:20:10.714 { 00:20:10.714 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:10.714 "dma_device_type": 2 00:20:10.714 }, 00:20:10.714 { 00:20:10.714 "dma_device_id": "system", 00:20:10.714 "dma_device_type": 1 00:20:10.714 }, 00:20:10.714 { 00:20:10.714 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:10.714 "dma_device_type": 2 00:20:10.714 }, 00:20:10.714 { 00:20:10.714 "dma_device_id": "system", 00:20:10.714 "dma_device_type": 1 00:20:10.714 }, 00:20:10.714 { 00:20:10.714 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:10.714 "dma_device_type": 2 00:20:10.714 } 00:20:10.714 ], 00:20:10.714 "driver_specific": { 00:20:10.714 "raid": { 00:20:10.714 "uuid": "5654adb5-01ab-4cee-a055-c52ee02cd113", 00:20:10.714 "strip_size_kb": 64, 00:20:10.714 "state": "online", 00:20:10.714 "raid_level": "concat", 00:20:10.714 "superblock": false, 00:20:10.714 "num_base_bdevs": 4, 00:20:10.714 "num_base_bdevs_discovered": 4, 00:20:10.714 "num_base_bdevs_operational": 4, 00:20:10.714 "base_bdevs_list": [ 00:20:10.714 { 00:20:10.714 "name": "BaseBdev1", 00:20:10.714 "uuid": "230e04f3-27b7-4b54-8a4a-258e3b3872ec", 00:20:10.714 "is_configured": true, 00:20:10.714 "data_offset": 0, 00:20:10.714 "data_size": 65536 00:20:10.714 }, 00:20:10.714 { 00:20:10.714 "name": "BaseBdev2", 00:20:10.714 "uuid": "e6a1c4d3-7621-490e-a9e6-950a6ed07b7d", 00:20:10.714 "is_configured": true, 00:20:10.714 "data_offset": 0, 00:20:10.714 "data_size": 65536 00:20:10.714 }, 00:20:10.714 { 00:20:10.714 "name": "BaseBdev3", 00:20:10.714 "uuid": "068f8034-fec7-48d1-8ddd-cb645ed84b10", 00:20:10.714 "is_configured": true, 00:20:10.714 "data_offset": 0, 00:20:10.714 "data_size": 65536 00:20:10.714 }, 00:20:10.714 { 00:20:10.714 "name": "BaseBdev4", 00:20:10.714 "uuid": "b382562b-15b4-410e-8513-1396af1a32a1", 00:20:10.714 "is_configured": true, 00:20:10.714 "data_offset": 0, 
00:20:10.714 "data_size": 65536 00:20:10.714 } 00:20:10.714 ] 00:20:10.714 } 00:20:10.714 } 00:20:10.714 }' 00:20:10.714 22:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:10.714 22:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:10.714 BaseBdev2 00:20:10.714 BaseBdev3 00:20:10.714 BaseBdev4' 00:20:10.714 22:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:10.714 22:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:10.714 22:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:10.973 22:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:10.973 "name": "BaseBdev1", 00:20:10.973 "aliases": [ 00:20:10.973 "230e04f3-27b7-4b54-8a4a-258e3b3872ec" 00:20:10.973 ], 00:20:10.973 "product_name": "Malloc disk", 00:20:10.973 "block_size": 512, 00:20:10.973 "num_blocks": 65536, 00:20:10.973 "uuid": "230e04f3-27b7-4b54-8a4a-258e3b3872ec", 00:20:10.973 "assigned_rate_limits": { 00:20:10.973 "rw_ios_per_sec": 0, 00:20:10.973 "rw_mbytes_per_sec": 0, 00:20:10.973 "r_mbytes_per_sec": 0, 00:20:10.973 "w_mbytes_per_sec": 0 00:20:10.973 }, 00:20:10.973 "claimed": true, 00:20:10.973 "claim_type": "exclusive_write", 00:20:10.973 "zoned": false, 00:20:10.973 "supported_io_types": { 00:20:10.973 "read": true, 00:20:10.973 "write": true, 00:20:10.973 "unmap": true, 00:20:10.973 "flush": true, 00:20:10.973 "reset": true, 00:20:10.973 "nvme_admin": false, 00:20:10.973 "nvme_io": false, 00:20:10.973 "nvme_io_md": false, 00:20:10.973 "write_zeroes": true, 00:20:10.973 "zcopy": true, 00:20:10.973 "get_zone_info": false, 00:20:10.973 "zone_management": 
false, 00:20:10.973 "zone_append": false, 00:20:10.973 "compare": false, 00:20:10.973 "compare_and_write": false, 00:20:10.973 "abort": true, 00:20:10.973 "seek_hole": false, 00:20:10.973 "seek_data": false, 00:20:10.973 "copy": true, 00:20:10.973 "nvme_iov_md": false 00:20:10.973 }, 00:20:10.973 "memory_domains": [ 00:20:10.973 { 00:20:10.973 "dma_device_id": "system", 00:20:10.973 "dma_device_type": 1 00:20:10.973 }, 00:20:10.973 { 00:20:10.973 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:10.973 "dma_device_type": 2 00:20:10.973 } 00:20:10.973 ], 00:20:10.973 "driver_specific": {} 00:20:10.973 }' 00:20:10.973 22:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:10.973 22:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:10.973 22:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:10.973 22:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:10.973 22:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:10.973 22:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:10.973 22:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:11.296 22:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:11.296 22:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:11.296 22:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:11.296 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:11.296 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:11.296 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:11.296 22:48:56 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:11.296 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:11.581 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:11.581 "name": "BaseBdev2", 00:20:11.581 "aliases": [ 00:20:11.581 "e6a1c4d3-7621-490e-a9e6-950a6ed07b7d" 00:20:11.581 ], 00:20:11.581 "product_name": "Malloc disk", 00:20:11.581 "block_size": 512, 00:20:11.581 "num_blocks": 65536, 00:20:11.581 "uuid": "e6a1c4d3-7621-490e-a9e6-950a6ed07b7d", 00:20:11.581 "assigned_rate_limits": { 00:20:11.581 "rw_ios_per_sec": 0, 00:20:11.581 "rw_mbytes_per_sec": 0, 00:20:11.581 "r_mbytes_per_sec": 0, 00:20:11.581 "w_mbytes_per_sec": 0 00:20:11.581 }, 00:20:11.581 "claimed": true, 00:20:11.581 "claim_type": "exclusive_write", 00:20:11.581 "zoned": false, 00:20:11.581 "supported_io_types": { 00:20:11.581 "read": true, 00:20:11.581 "write": true, 00:20:11.581 "unmap": true, 00:20:11.581 "flush": true, 00:20:11.581 "reset": true, 00:20:11.581 "nvme_admin": false, 00:20:11.581 "nvme_io": false, 00:20:11.581 "nvme_io_md": false, 00:20:11.581 "write_zeroes": true, 00:20:11.581 "zcopy": true, 00:20:11.581 "get_zone_info": false, 00:20:11.581 "zone_management": false, 00:20:11.581 "zone_append": false, 00:20:11.581 "compare": false, 00:20:11.581 "compare_and_write": false, 00:20:11.581 "abort": true, 00:20:11.581 "seek_hole": false, 00:20:11.581 "seek_data": false, 00:20:11.581 "copy": true, 00:20:11.581 "nvme_iov_md": false 00:20:11.581 }, 00:20:11.581 "memory_domains": [ 00:20:11.581 { 00:20:11.581 "dma_device_id": "system", 00:20:11.581 "dma_device_type": 1 00:20:11.581 }, 00:20:11.581 { 00:20:11.581 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:11.581 "dma_device_type": 2 00:20:11.581 } 00:20:11.581 ], 00:20:11.581 "driver_specific": {} 00:20:11.581 
}' 00:20:11.581 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:11.581 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:11.581 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:11.581 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:11.581 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:11.581 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:11.581 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:11.839 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:11.839 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:11.839 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:11.839 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:11.839 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:11.839 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:11.839 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:11.839 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:12.098 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:12.098 "name": "BaseBdev3", 00:20:12.098 "aliases": [ 00:20:12.098 "068f8034-fec7-48d1-8ddd-cb645ed84b10" 00:20:12.098 ], 00:20:12.098 "product_name": "Malloc disk", 00:20:12.098 "block_size": 512, 00:20:12.098 "num_blocks": 65536, 
00:20:12.098 "uuid": "068f8034-fec7-48d1-8ddd-cb645ed84b10", 00:20:12.098 "assigned_rate_limits": { 00:20:12.098 "rw_ios_per_sec": 0, 00:20:12.098 "rw_mbytes_per_sec": 0, 00:20:12.098 "r_mbytes_per_sec": 0, 00:20:12.098 "w_mbytes_per_sec": 0 00:20:12.098 }, 00:20:12.098 "claimed": true, 00:20:12.098 "claim_type": "exclusive_write", 00:20:12.098 "zoned": false, 00:20:12.098 "supported_io_types": { 00:20:12.098 "read": true, 00:20:12.098 "write": true, 00:20:12.098 "unmap": true, 00:20:12.098 "flush": true, 00:20:12.098 "reset": true, 00:20:12.098 "nvme_admin": false, 00:20:12.098 "nvme_io": false, 00:20:12.098 "nvme_io_md": false, 00:20:12.098 "write_zeroes": true, 00:20:12.098 "zcopy": true, 00:20:12.098 "get_zone_info": false, 00:20:12.098 "zone_management": false, 00:20:12.098 "zone_append": false, 00:20:12.098 "compare": false, 00:20:12.098 "compare_and_write": false, 00:20:12.098 "abort": true, 00:20:12.098 "seek_hole": false, 00:20:12.098 "seek_data": false, 00:20:12.098 "copy": true, 00:20:12.098 "nvme_iov_md": false 00:20:12.098 }, 00:20:12.098 "memory_domains": [ 00:20:12.098 { 00:20:12.098 "dma_device_id": "system", 00:20:12.098 "dma_device_type": 1 00:20:12.098 }, 00:20:12.098 { 00:20:12.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:12.098 "dma_device_type": 2 00:20:12.098 } 00:20:12.098 ], 00:20:12.098 "driver_specific": {} 00:20:12.098 }' 00:20:12.098 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:12.098 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:12.098 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:12.098 22:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:12.356 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:12.356 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:20:12.356 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:12.356 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:12.356 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:12.356 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:12.356 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:12.615 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:12.615 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:12.615 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:12.615 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:12.615 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:12.615 "name": "BaseBdev4", 00:20:12.615 "aliases": [ 00:20:12.615 "b382562b-15b4-410e-8513-1396af1a32a1" 00:20:12.615 ], 00:20:12.615 "product_name": "Malloc disk", 00:20:12.615 "block_size": 512, 00:20:12.615 "num_blocks": 65536, 00:20:12.615 "uuid": "b382562b-15b4-410e-8513-1396af1a32a1", 00:20:12.615 "assigned_rate_limits": { 00:20:12.615 "rw_ios_per_sec": 0, 00:20:12.615 "rw_mbytes_per_sec": 0, 00:20:12.615 "r_mbytes_per_sec": 0, 00:20:12.615 "w_mbytes_per_sec": 0 00:20:12.615 }, 00:20:12.615 "claimed": true, 00:20:12.615 "claim_type": "exclusive_write", 00:20:12.615 "zoned": false, 00:20:12.615 "supported_io_types": { 00:20:12.615 "read": true, 00:20:12.615 "write": true, 00:20:12.615 "unmap": true, 00:20:12.615 "flush": true, 00:20:12.615 "reset": true, 00:20:12.615 "nvme_admin": false, 00:20:12.615 "nvme_io": false, 00:20:12.615 
"nvme_io_md": false, 00:20:12.615 "write_zeroes": true, 00:20:12.615 "zcopy": true, 00:20:12.615 "get_zone_info": false, 00:20:12.615 "zone_management": false, 00:20:12.615 "zone_append": false, 00:20:12.615 "compare": false, 00:20:12.615 "compare_and_write": false, 00:20:12.615 "abort": true, 00:20:12.615 "seek_hole": false, 00:20:12.615 "seek_data": false, 00:20:12.615 "copy": true, 00:20:12.615 "nvme_iov_md": false 00:20:12.615 }, 00:20:12.615 "memory_domains": [ 00:20:12.615 { 00:20:12.615 "dma_device_id": "system", 00:20:12.615 "dma_device_type": 1 00:20:12.615 }, 00:20:12.615 { 00:20:12.615 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:12.615 "dma_device_type": 2 00:20:12.615 } 00:20:12.615 ], 00:20:12.615 "driver_specific": {} 00:20:12.615 }' 00:20:12.615 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:12.874 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:12.874 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:12.874 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:12.874 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:12.874 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:12.874 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:12.874 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:12.874 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:12.874 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:13.133 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:13.134 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 
00:20:13.134 22:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:13.393 [2024-07-15 22:48:58.109225] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:13.393 [2024-07-15 22:48:58.109259] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:13.393 [2024-07-15 22:48:58.109319] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:13.393 22:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:20:13.393 22:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:20:13.393 22:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:13.393 22:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:13.393 22:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:20:13.393 22:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:20:13.393 22:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:13.393 22:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:20:13.393 22:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:13.393 22:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:13.393 22:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:13.393 22:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:13.393 22:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # 
local num_base_bdevs 00:20:13.393 22:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:13.393 22:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:13.393 22:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:13.393 22:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:13.652 22:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:13.652 "name": "Existed_Raid", 00:20:13.652 "uuid": "5654adb5-01ab-4cee-a055-c52ee02cd113", 00:20:13.652 "strip_size_kb": 64, 00:20:13.652 "state": "offline", 00:20:13.652 "raid_level": "concat", 00:20:13.652 "superblock": false, 00:20:13.652 "num_base_bdevs": 4, 00:20:13.652 "num_base_bdevs_discovered": 3, 00:20:13.652 "num_base_bdevs_operational": 3, 00:20:13.652 "base_bdevs_list": [ 00:20:13.652 { 00:20:13.652 "name": null, 00:20:13.652 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:13.652 "is_configured": false, 00:20:13.652 "data_offset": 0, 00:20:13.652 "data_size": 65536 00:20:13.652 }, 00:20:13.652 { 00:20:13.652 "name": "BaseBdev2", 00:20:13.652 "uuid": "e6a1c4d3-7621-490e-a9e6-950a6ed07b7d", 00:20:13.652 "is_configured": true, 00:20:13.652 "data_offset": 0, 00:20:13.652 "data_size": 65536 00:20:13.652 }, 00:20:13.652 { 00:20:13.652 "name": "BaseBdev3", 00:20:13.652 "uuid": "068f8034-fec7-48d1-8ddd-cb645ed84b10", 00:20:13.652 "is_configured": true, 00:20:13.652 "data_offset": 0, 00:20:13.652 "data_size": 65536 00:20:13.652 }, 00:20:13.652 { 00:20:13.652 "name": "BaseBdev4", 00:20:13.652 "uuid": "b382562b-15b4-410e-8513-1396af1a32a1", 00:20:13.652 "is_configured": true, 00:20:13.652 "data_offset": 0, 00:20:13.652 "data_size": 65536 00:20:13.652 } 00:20:13.652 ] 00:20:13.652 }' 
00:20:13.652 22:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:13.652 22:48:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:14.220 22:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:14.220 22:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:14.220 22:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:14.220 22:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:14.479 22:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:14.479 22:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:14.479 22:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:14.738 [2024-07-15 22:48:59.466680] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:14.738 22:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:14.738 22:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:14.738 22:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:14.738 22:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:14.997 22:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:14.997 22:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:20:14.997 22:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:15.256 [2024-07-15 22:48:59.986789] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:15.256 22:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:15.256 22:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:15.256 22:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:15.256 22:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:15.515 22:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:15.515 22:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:15.515 22:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:15.775 [2024-07-15 22:49:00.496574] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:15.775 [2024-07-15 22:49:00.496617] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22d5350 name Existed_Raid, state offline 00:20:15.775 22:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:15.775 22:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:15.775 22:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:15.775 22:49:00 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:16.034 22:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:16.034 22:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:16.034 22:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:16.034 22:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:16.034 22:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:16.034 22:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:16.293 BaseBdev2 00:20:16.293 22:49:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:20:16.293 22:49:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:16.293 22:49:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:16.293 22:49:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:16.293 22:49:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:16.293 22:49:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:16.293 22:49:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:16.552 22:49:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:16.812 [ 00:20:16.812 { 00:20:16.812 "name": 
"BaseBdev2", 00:20:16.812 "aliases": [ 00:20:16.812 "7d260ed9-6b5c-483a-8071-a43342890142" 00:20:16.812 ], 00:20:16.812 "product_name": "Malloc disk", 00:20:16.812 "block_size": 512, 00:20:16.812 "num_blocks": 65536, 00:20:16.812 "uuid": "7d260ed9-6b5c-483a-8071-a43342890142", 00:20:16.812 "assigned_rate_limits": { 00:20:16.812 "rw_ios_per_sec": 0, 00:20:16.812 "rw_mbytes_per_sec": 0, 00:20:16.812 "r_mbytes_per_sec": 0, 00:20:16.812 "w_mbytes_per_sec": 0 00:20:16.812 }, 00:20:16.812 "claimed": false, 00:20:16.812 "zoned": false, 00:20:16.812 "supported_io_types": { 00:20:16.812 "read": true, 00:20:16.812 "write": true, 00:20:16.812 "unmap": true, 00:20:16.812 "flush": true, 00:20:16.812 "reset": true, 00:20:16.812 "nvme_admin": false, 00:20:16.812 "nvme_io": false, 00:20:16.812 "nvme_io_md": false, 00:20:16.812 "write_zeroes": true, 00:20:16.812 "zcopy": true, 00:20:16.812 "get_zone_info": false, 00:20:16.812 "zone_management": false, 00:20:16.812 "zone_append": false, 00:20:16.812 "compare": false, 00:20:16.812 "compare_and_write": false, 00:20:16.812 "abort": true, 00:20:16.812 "seek_hole": false, 00:20:16.812 "seek_data": false, 00:20:16.812 "copy": true, 00:20:16.812 "nvme_iov_md": false 00:20:16.812 }, 00:20:16.812 "memory_domains": [ 00:20:16.812 { 00:20:16.812 "dma_device_id": "system", 00:20:16.812 "dma_device_type": 1 00:20:16.812 }, 00:20:16.812 { 00:20:16.812 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:16.812 "dma_device_type": 2 00:20:16.812 } 00:20:16.812 ], 00:20:16.812 "driver_specific": {} 00:20:16.812 } 00:20:16.812 ] 00:20:16.812 22:49:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:16.812 22:49:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:16.812 22:49:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:16.812 22:49:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:17.071 BaseBdev3 00:20:17.071 22:49:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:17.071 22:49:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:17.071 22:49:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:17.071 22:49:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:17.071 22:49:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:17.071 22:49:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:17.071 22:49:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:17.330 22:49:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:17.590 [ 00:20:17.590 { 00:20:17.590 "name": "BaseBdev3", 00:20:17.590 "aliases": [ 00:20:17.590 "238fc6b8-dbd6-4e6a-8a7e-481d00b98236" 00:20:17.590 ], 00:20:17.590 "product_name": "Malloc disk", 00:20:17.590 "block_size": 512, 00:20:17.590 "num_blocks": 65536, 00:20:17.590 "uuid": "238fc6b8-dbd6-4e6a-8a7e-481d00b98236", 00:20:17.590 "assigned_rate_limits": { 00:20:17.590 "rw_ios_per_sec": 0, 00:20:17.590 "rw_mbytes_per_sec": 0, 00:20:17.590 "r_mbytes_per_sec": 0, 00:20:17.590 "w_mbytes_per_sec": 0 00:20:17.590 }, 00:20:17.590 "claimed": false, 00:20:17.590 "zoned": false, 00:20:17.590 "supported_io_types": { 00:20:17.590 "read": true, 00:20:17.590 "write": true, 00:20:17.590 "unmap": true, 00:20:17.590 "flush": true, 00:20:17.590 
"reset": true, 00:20:17.590 "nvme_admin": false, 00:20:17.590 "nvme_io": false, 00:20:17.590 "nvme_io_md": false, 00:20:17.590 "write_zeroes": true, 00:20:17.590 "zcopy": true, 00:20:17.590 "get_zone_info": false, 00:20:17.590 "zone_management": false, 00:20:17.590 "zone_append": false, 00:20:17.590 "compare": false, 00:20:17.590 "compare_and_write": false, 00:20:17.590 "abort": true, 00:20:17.590 "seek_hole": false, 00:20:17.590 "seek_data": false, 00:20:17.590 "copy": true, 00:20:17.590 "nvme_iov_md": false 00:20:17.590 }, 00:20:17.590 "memory_domains": [ 00:20:17.590 { 00:20:17.590 "dma_device_id": "system", 00:20:17.590 "dma_device_type": 1 00:20:17.590 }, 00:20:17.590 { 00:20:17.590 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:17.590 "dma_device_type": 2 00:20:17.590 } 00:20:17.590 ], 00:20:17.590 "driver_specific": {} 00:20:17.590 } 00:20:17.590 ] 00:20:17.590 22:49:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:17.590 22:49:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:17.590 22:49:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:17.590 22:49:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:17.590 BaseBdev4 00:20:17.849 22:49:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:20:17.849 22:49:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:17.849 22:49:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:17.849 22:49:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:17.849 22:49:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:17.849 22:49:02 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:17.849 22:49:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:17.849 22:49:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:18.108 [ 00:20:18.108 { 00:20:18.108 "name": "BaseBdev4", 00:20:18.108 "aliases": [ 00:20:18.108 "2ea6c798-8660-43e6-9c35-f3f7676f81ab" 00:20:18.108 ], 00:20:18.108 "product_name": "Malloc disk", 00:20:18.108 "block_size": 512, 00:20:18.108 "num_blocks": 65536, 00:20:18.108 "uuid": "2ea6c798-8660-43e6-9c35-f3f7676f81ab", 00:20:18.108 "assigned_rate_limits": { 00:20:18.109 "rw_ios_per_sec": 0, 00:20:18.109 "rw_mbytes_per_sec": 0, 00:20:18.109 "r_mbytes_per_sec": 0, 00:20:18.109 "w_mbytes_per_sec": 0 00:20:18.109 }, 00:20:18.109 "claimed": false, 00:20:18.109 "zoned": false, 00:20:18.109 "supported_io_types": { 00:20:18.109 "read": true, 00:20:18.109 "write": true, 00:20:18.109 "unmap": true, 00:20:18.109 "flush": true, 00:20:18.109 "reset": true, 00:20:18.109 "nvme_admin": false, 00:20:18.109 "nvme_io": false, 00:20:18.109 "nvme_io_md": false, 00:20:18.109 "write_zeroes": true, 00:20:18.109 "zcopy": true, 00:20:18.109 "get_zone_info": false, 00:20:18.109 "zone_management": false, 00:20:18.109 "zone_append": false, 00:20:18.109 "compare": false, 00:20:18.109 "compare_and_write": false, 00:20:18.109 "abort": true, 00:20:18.109 "seek_hole": false, 00:20:18.109 "seek_data": false, 00:20:18.109 "copy": true, 00:20:18.109 "nvme_iov_md": false 00:20:18.109 }, 00:20:18.109 "memory_domains": [ 00:20:18.109 { 00:20:18.109 "dma_device_id": "system", 00:20:18.109 "dma_device_type": 1 00:20:18.109 }, 00:20:18.109 { 00:20:18.109 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:20:18.109 "dma_device_type": 2 00:20:18.109 } 00:20:18.109 ], 00:20:18.109 "driver_specific": {} 00:20:18.109 } 00:20:18.109 ] 00:20:18.109 22:49:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:18.109 22:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:18.109 22:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:18.109 22:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:18.368 [2024-07-15 22:49:03.232984] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:18.368 [2024-07-15 22:49:03.233035] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:18.368 [2024-07-15 22:49:03.233058] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:18.368 [2024-07-15 22:49:03.234440] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:18.368 [2024-07-15 22:49:03.234485] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:18.368 22:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:18.368 22:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:18.368 22:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:18.368 22:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:18.368 22:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:18.368 
22:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:18.368 22:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:18.368 22:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:18.368 22:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:18.368 22:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:18.368 22:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:18.368 22:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:18.626 22:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:18.626 "name": "Existed_Raid", 00:20:18.626 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:18.626 "strip_size_kb": 64, 00:20:18.626 "state": "configuring", 00:20:18.626 "raid_level": "concat", 00:20:18.626 "superblock": false, 00:20:18.626 "num_base_bdevs": 4, 00:20:18.626 "num_base_bdevs_discovered": 3, 00:20:18.626 "num_base_bdevs_operational": 4, 00:20:18.626 "base_bdevs_list": [ 00:20:18.626 { 00:20:18.626 "name": "BaseBdev1", 00:20:18.626 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:18.626 "is_configured": false, 00:20:18.626 "data_offset": 0, 00:20:18.626 "data_size": 0 00:20:18.626 }, 00:20:18.626 { 00:20:18.626 "name": "BaseBdev2", 00:20:18.626 "uuid": "7d260ed9-6b5c-483a-8071-a43342890142", 00:20:18.626 "is_configured": true, 00:20:18.626 "data_offset": 0, 00:20:18.626 "data_size": 65536 00:20:18.626 }, 00:20:18.626 { 00:20:18.626 "name": "BaseBdev3", 00:20:18.626 "uuid": "238fc6b8-dbd6-4e6a-8a7e-481d00b98236", 00:20:18.626 "is_configured": true, 00:20:18.626 "data_offset": 
0, 00:20:18.626 "data_size": 65536 00:20:18.626 }, 00:20:18.626 { 00:20:18.626 "name": "BaseBdev4", 00:20:18.626 "uuid": "2ea6c798-8660-43e6-9c35-f3f7676f81ab", 00:20:18.626 "is_configured": true, 00:20:18.626 "data_offset": 0, 00:20:18.626 "data_size": 65536 00:20:18.626 } 00:20:18.626 ] 00:20:18.626 }' 00:20:18.626 22:49:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:18.626 22:49:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:19.192 22:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:19.451 [2024-07-15 22:49:04.323854] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:19.451 22:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:19.451 22:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:19.451 22:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:19.451 22:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:19.451 22:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:19.451 22:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:19.451 22:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:19.451 22:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:19.451 22:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:19.451 22:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:20:19.451 22:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:19.451 22:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:19.710 22:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:19.710 "name": "Existed_Raid", 00:20:19.710 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:19.710 "strip_size_kb": 64, 00:20:19.710 "state": "configuring", 00:20:19.710 "raid_level": "concat", 00:20:19.710 "superblock": false, 00:20:19.710 "num_base_bdevs": 4, 00:20:19.710 "num_base_bdevs_discovered": 2, 00:20:19.710 "num_base_bdevs_operational": 4, 00:20:19.710 "base_bdevs_list": [ 00:20:19.710 { 00:20:19.710 "name": "BaseBdev1", 00:20:19.710 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:19.710 "is_configured": false, 00:20:19.710 "data_offset": 0, 00:20:19.710 "data_size": 0 00:20:19.710 }, 00:20:19.710 { 00:20:19.710 "name": null, 00:20:19.710 "uuid": "7d260ed9-6b5c-483a-8071-a43342890142", 00:20:19.710 "is_configured": false, 00:20:19.710 "data_offset": 0, 00:20:19.711 "data_size": 65536 00:20:19.711 }, 00:20:19.711 { 00:20:19.711 "name": "BaseBdev3", 00:20:19.711 "uuid": "238fc6b8-dbd6-4e6a-8a7e-481d00b98236", 00:20:19.711 "is_configured": true, 00:20:19.711 "data_offset": 0, 00:20:19.711 "data_size": 65536 00:20:19.711 }, 00:20:19.711 { 00:20:19.711 "name": "BaseBdev4", 00:20:19.711 "uuid": "2ea6c798-8660-43e6-9c35-f3f7676f81ab", 00:20:19.711 "is_configured": true, 00:20:19.711 "data_offset": 0, 00:20:19.711 "data_size": 65536 00:20:19.711 } 00:20:19.711 ] 00:20:19.711 }' 00:20:19.711 22:49:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:19.711 22:49:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:20.279 22:49:05 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:20.279 22:49:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:20.538 22:49:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:20.538 22:49:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:20.798 [2024-07-15 22:49:05.648010] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:20.798 BaseBdev1 00:20:20.798 22:49:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:20.798 22:49:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:20.798 22:49:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:20.798 22:49:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:20.798 22:49:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:20.798 22:49:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:20.798 22:49:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:21.058 22:49:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:21.317 [ 00:20:21.317 { 00:20:21.317 "name": "BaseBdev1", 00:20:21.317 "aliases": [ 00:20:21.317 
"28126cf3-f9c2-4fb8-9038-ab330c6ad944" 00:20:21.317 ], 00:20:21.317 "product_name": "Malloc disk", 00:20:21.317 "block_size": 512, 00:20:21.317 "num_blocks": 65536, 00:20:21.317 "uuid": "28126cf3-f9c2-4fb8-9038-ab330c6ad944", 00:20:21.317 "assigned_rate_limits": { 00:20:21.317 "rw_ios_per_sec": 0, 00:20:21.317 "rw_mbytes_per_sec": 0, 00:20:21.317 "r_mbytes_per_sec": 0, 00:20:21.317 "w_mbytes_per_sec": 0 00:20:21.317 }, 00:20:21.317 "claimed": true, 00:20:21.317 "claim_type": "exclusive_write", 00:20:21.317 "zoned": false, 00:20:21.317 "supported_io_types": { 00:20:21.317 "read": true, 00:20:21.317 "write": true, 00:20:21.317 "unmap": true, 00:20:21.317 "flush": true, 00:20:21.317 "reset": true, 00:20:21.317 "nvme_admin": false, 00:20:21.317 "nvme_io": false, 00:20:21.317 "nvme_io_md": false, 00:20:21.317 "write_zeroes": true, 00:20:21.317 "zcopy": true, 00:20:21.317 "get_zone_info": false, 00:20:21.317 "zone_management": false, 00:20:21.317 "zone_append": false, 00:20:21.317 "compare": false, 00:20:21.317 "compare_and_write": false, 00:20:21.317 "abort": true, 00:20:21.317 "seek_hole": false, 00:20:21.317 "seek_data": false, 00:20:21.317 "copy": true, 00:20:21.317 "nvme_iov_md": false 00:20:21.317 }, 00:20:21.317 "memory_domains": [ 00:20:21.317 { 00:20:21.317 "dma_device_id": "system", 00:20:21.317 "dma_device_type": 1 00:20:21.317 }, 00:20:21.317 { 00:20:21.317 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:21.317 "dma_device_type": 2 00:20:21.317 } 00:20:21.317 ], 00:20:21.317 "driver_specific": {} 00:20:21.317 } 00:20:21.317 ] 00:20:21.317 22:49:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:21.317 22:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:21.317 22:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:21.317 22:49:06 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:21.317 22:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:21.317 22:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:21.317 22:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:21.317 22:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:21.317 22:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:21.318 22:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:21.318 22:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:21.318 22:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:21.318 22:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:21.577 22:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:21.577 "name": "Existed_Raid", 00:20:21.577 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:21.577 "strip_size_kb": 64, 00:20:21.577 "state": "configuring", 00:20:21.577 "raid_level": "concat", 00:20:21.577 "superblock": false, 00:20:21.577 "num_base_bdevs": 4, 00:20:21.577 "num_base_bdevs_discovered": 3, 00:20:21.577 "num_base_bdevs_operational": 4, 00:20:21.577 "base_bdevs_list": [ 00:20:21.577 { 00:20:21.577 "name": "BaseBdev1", 00:20:21.577 "uuid": "28126cf3-f9c2-4fb8-9038-ab330c6ad944", 00:20:21.577 "is_configured": true, 00:20:21.577 "data_offset": 0, 00:20:21.577 "data_size": 65536 00:20:21.577 }, 00:20:21.577 { 00:20:21.577 "name": null, 00:20:21.577 "uuid": "7d260ed9-6b5c-483a-8071-a43342890142", 
00:20:21.577 "is_configured": false, 00:20:21.577 "data_offset": 0, 00:20:21.577 "data_size": 65536 00:20:21.577 }, 00:20:21.577 { 00:20:21.577 "name": "BaseBdev3", 00:20:21.577 "uuid": "238fc6b8-dbd6-4e6a-8a7e-481d00b98236", 00:20:21.577 "is_configured": true, 00:20:21.577 "data_offset": 0, 00:20:21.577 "data_size": 65536 00:20:21.577 }, 00:20:21.577 { 00:20:21.577 "name": "BaseBdev4", 00:20:21.577 "uuid": "2ea6c798-8660-43e6-9c35-f3f7676f81ab", 00:20:21.577 "is_configured": true, 00:20:21.577 "data_offset": 0, 00:20:21.577 "data_size": 65536 00:20:21.577 } 00:20:21.577 ] 00:20:21.577 }' 00:20:21.577 22:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:21.577 22:49:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:22.144 22:49:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:22.144 22:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:22.403 22:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:22.403 22:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:22.662 [2024-07-15 22:49:07.476842] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:22.662 22:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:22.662 22:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:22.662 22:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:22.662 22:49:07 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:22.662 22:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:22.662 22:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:22.662 22:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:22.662 22:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:22.662 22:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:22.662 22:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:22.662 22:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:22.662 22:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:22.920 22:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:22.920 "name": "Existed_Raid", 00:20:22.920 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:22.920 "strip_size_kb": 64, 00:20:22.920 "state": "configuring", 00:20:22.920 "raid_level": "concat", 00:20:22.920 "superblock": false, 00:20:22.920 "num_base_bdevs": 4, 00:20:22.920 "num_base_bdevs_discovered": 2, 00:20:22.920 "num_base_bdevs_operational": 4, 00:20:22.920 "base_bdevs_list": [ 00:20:22.920 { 00:20:22.920 "name": "BaseBdev1", 00:20:22.920 "uuid": "28126cf3-f9c2-4fb8-9038-ab330c6ad944", 00:20:22.920 "is_configured": true, 00:20:22.920 "data_offset": 0, 00:20:22.920 "data_size": 65536 00:20:22.920 }, 00:20:22.920 { 00:20:22.920 "name": null, 00:20:22.920 "uuid": "7d260ed9-6b5c-483a-8071-a43342890142", 00:20:22.920 "is_configured": false, 00:20:22.920 "data_offset": 0, 00:20:22.920 
"data_size": 65536 00:20:22.920 }, 00:20:22.920 { 00:20:22.920 "name": null, 00:20:22.920 "uuid": "238fc6b8-dbd6-4e6a-8a7e-481d00b98236", 00:20:22.920 "is_configured": false, 00:20:22.920 "data_offset": 0, 00:20:22.920 "data_size": 65536 00:20:22.920 }, 00:20:22.920 { 00:20:22.920 "name": "BaseBdev4", 00:20:22.920 "uuid": "2ea6c798-8660-43e6-9c35-f3f7676f81ab", 00:20:22.920 "is_configured": true, 00:20:22.920 "data_offset": 0, 00:20:22.920 "data_size": 65536 00:20:22.920 } 00:20:22.920 ] 00:20:22.920 }' 00:20:22.920 22:49:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:22.920 22:49:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:23.487 22:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:23.487 22:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:23.746 22:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:23.746 22:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:24.005 [2024-07-15 22:49:08.832467] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:24.005 22:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:24.005 22:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:24.006 22:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:24.006 22:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:20:24.006 22:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:24.006 22:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:24.006 22:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:24.006 22:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:24.006 22:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:24.006 22:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:24.006 22:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:24.006 22:49:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:24.264 22:49:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:24.264 "name": "Existed_Raid", 00:20:24.264 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:24.264 "strip_size_kb": 64, 00:20:24.264 "state": "configuring", 00:20:24.264 "raid_level": "concat", 00:20:24.265 "superblock": false, 00:20:24.265 "num_base_bdevs": 4, 00:20:24.265 "num_base_bdevs_discovered": 3, 00:20:24.265 "num_base_bdevs_operational": 4, 00:20:24.265 "base_bdevs_list": [ 00:20:24.265 { 00:20:24.265 "name": "BaseBdev1", 00:20:24.265 "uuid": "28126cf3-f9c2-4fb8-9038-ab330c6ad944", 00:20:24.265 "is_configured": true, 00:20:24.265 "data_offset": 0, 00:20:24.265 "data_size": 65536 00:20:24.265 }, 00:20:24.265 { 00:20:24.265 "name": null, 00:20:24.265 "uuid": "7d260ed9-6b5c-483a-8071-a43342890142", 00:20:24.265 "is_configured": false, 00:20:24.265 "data_offset": 0, 00:20:24.265 "data_size": 65536 00:20:24.265 }, 00:20:24.265 { 00:20:24.265 "name": 
"BaseBdev3", 00:20:24.265 "uuid": "238fc6b8-dbd6-4e6a-8a7e-481d00b98236", 00:20:24.265 "is_configured": true, 00:20:24.265 "data_offset": 0, 00:20:24.265 "data_size": 65536 00:20:24.265 }, 00:20:24.265 { 00:20:24.265 "name": "BaseBdev4", 00:20:24.265 "uuid": "2ea6c798-8660-43e6-9c35-f3f7676f81ab", 00:20:24.265 "is_configured": true, 00:20:24.265 "data_offset": 0, 00:20:24.265 "data_size": 65536 00:20:24.265 } 00:20:24.265 ] 00:20:24.265 }' 00:20:24.265 22:49:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:24.265 22:49:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:24.832 22:49:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:24.832 22:49:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:25.090 22:49:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:20:25.090 22:49:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:25.349 [2024-07-15 22:49:10.172060] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:25.349 22:49:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:25.349 22:49:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:25.349 22:49:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:25.349 22:49:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:25.349 22:49:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:20:25.349 22:49:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:25.349 22:49:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:25.349 22:49:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:25.349 22:49:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:25.349 22:49:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:25.349 22:49:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.349 22:49:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:25.608 22:49:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:25.608 "name": "Existed_Raid", 00:20:25.608 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.608 "strip_size_kb": 64, 00:20:25.608 "state": "configuring", 00:20:25.608 "raid_level": "concat", 00:20:25.608 "superblock": false, 00:20:25.608 "num_base_bdevs": 4, 00:20:25.608 "num_base_bdevs_discovered": 2, 00:20:25.608 "num_base_bdevs_operational": 4, 00:20:25.608 "base_bdevs_list": [ 00:20:25.608 { 00:20:25.608 "name": null, 00:20:25.608 "uuid": "28126cf3-f9c2-4fb8-9038-ab330c6ad944", 00:20:25.608 "is_configured": false, 00:20:25.608 "data_offset": 0, 00:20:25.608 "data_size": 65536 00:20:25.608 }, 00:20:25.608 { 00:20:25.608 "name": null, 00:20:25.608 "uuid": "7d260ed9-6b5c-483a-8071-a43342890142", 00:20:25.608 "is_configured": false, 00:20:25.608 "data_offset": 0, 00:20:25.608 "data_size": 65536 00:20:25.608 }, 00:20:25.608 { 00:20:25.608 "name": "BaseBdev3", 00:20:25.608 "uuid": "238fc6b8-dbd6-4e6a-8a7e-481d00b98236", 00:20:25.608 "is_configured": true, 
00:20:25.608 "data_offset": 0, 00:20:25.608 "data_size": 65536 00:20:25.608 }, 00:20:25.608 { 00:20:25.608 "name": "BaseBdev4", 00:20:25.608 "uuid": "2ea6c798-8660-43e6-9c35-f3f7676f81ab", 00:20:25.608 "is_configured": true, 00:20:25.608 "data_offset": 0, 00:20:25.608 "data_size": 65536 00:20:25.608 } 00:20:25.608 ] 00:20:25.608 }' 00:20:25.608 22:49:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:25.608 22:49:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:26.176 22:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.176 22:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:26.435 22:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:20:26.435 22:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:20:26.694 [2024-07-15 22:49:11.522285] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:26.694 22:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:26.694 22:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:26.694 22:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:26.694 22:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:26.694 22:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:26.694 22:49:11 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:26.694 22:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:26.694 22:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:26.694 22:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:26.694 22:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:26.694 22:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.694 22:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:26.953 22:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:26.953 "name": "Existed_Raid", 00:20:26.953 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:26.953 "strip_size_kb": 64, 00:20:26.953 "state": "configuring", 00:20:26.953 "raid_level": "concat", 00:20:26.953 "superblock": false, 00:20:26.953 "num_base_bdevs": 4, 00:20:26.953 "num_base_bdevs_discovered": 3, 00:20:26.953 "num_base_bdevs_operational": 4, 00:20:26.953 "base_bdevs_list": [ 00:20:26.953 { 00:20:26.953 "name": null, 00:20:26.953 "uuid": "28126cf3-f9c2-4fb8-9038-ab330c6ad944", 00:20:26.953 "is_configured": false, 00:20:26.953 "data_offset": 0, 00:20:26.953 "data_size": 65536 00:20:26.954 }, 00:20:26.954 { 00:20:26.954 "name": "BaseBdev2", 00:20:26.954 "uuid": "7d260ed9-6b5c-483a-8071-a43342890142", 00:20:26.954 "is_configured": true, 00:20:26.954 "data_offset": 0, 00:20:26.954 "data_size": 65536 00:20:26.954 }, 00:20:26.954 { 00:20:26.954 "name": "BaseBdev3", 00:20:26.954 "uuid": "238fc6b8-dbd6-4e6a-8a7e-481d00b98236", 00:20:26.954 "is_configured": true, 00:20:26.954 "data_offset": 0, 00:20:26.954 "data_size": 65536 00:20:26.954 
}, 00:20:26.954 { 00:20:26.954 "name": "BaseBdev4", 00:20:26.954 "uuid": "2ea6c798-8660-43e6-9c35-f3f7676f81ab", 00:20:26.954 "is_configured": true, 00:20:26.954 "data_offset": 0, 00:20:26.954 "data_size": 65536 00:20:26.954 } 00:20:26.954 ] 00:20:26.954 }' 00:20:26.954 22:49:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:26.954 22:49:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:27.520 22:49:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.520 22:49:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:27.790 22:49:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:27.790 22:49:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.790 22:49:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:28.074 22:49:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 28126cf3-f9c2-4fb8-9038-ab330c6ad944 00:20:28.639 [2024-07-15 22:49:13.395886] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:20:28.639 [2024-07-15 22:49:13.395950] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22d9040 00:20:28.639 [2024-07-15 22:49:13.395965] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:20:28.639 [2024-07-15 22:49:13.396173] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22d4a70 00:20:28.639 
[2024-07-15 22:49:13.396300] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22d9040 00:20:28.639 [2024-07-15 22:49:13.396310] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x22d9040 00:20:28.639 [2024-07-15 22:49:13.396480] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:28.639 NewBaseBdev 00:20:28.639 22:49:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:20:28.639 22:49:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:20:28.639 22:49:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:28.639 22:49:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:28.639 22:49:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:28.639 22:49:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:28.639 22:49:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:28.896 22:49:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:20:29.461 [ 00:20:29.461 { 00:20:29.461 "name": "NewBaseBdev", 00:20:29.461 "aliases": [ 00:20:29.461 "28126cf3-f9c2-4fb8-9038-ab330c6ad944" 00:20:29.461 ], 00:20:29.461 "product_name": "Malloc disk", 00:20:29.461 "block_size": 512, 00:20:29.461 "num_blocks": 65536, 00:20:29.461 "uuid": "28126cf3-f9c2-4fb8-9038-ab330c6ad944", 00:20:29.461 "assigned_rate_limits": { 00:20:29.461 "rw_ios_per_sec": 0, 00:20:29.461 "rw_mbytes_per_sec": 0, 00:20:29.461 "r_mbytes_per_sec": 0, 00:20:29.461 
"w_mbytes_per_sec": 0 00:20:29.461 }, 00:20:29.461 "claimed": true, 00:20:29.461 "claim_type": "exclusive_write", 00:20:29.461 "zoned": false, 00:20:29.461 "supported_io_types": { 00:20:29.461 "read": true, 00:20:29.461 "write": true, 00:20:29.461 "unmap": true, 00:20:29.461 "flush": true, 00:20:29.461 "reset": true, 00:20:29.461 "nvme_admin": false, 00:20:29.461 "nvme_io": false, 00:20:29.461 "nvme_io_md": false, 00:20:29.461 "write_zeroes": true, 00:20:29.461 "zcopy": true, 00:20:29.461 "get_zone_info": false, 00:20:29.461 "zone_management": false, 00:20:29.461 "zone_append": false, 00:20:29.461 "compare": false, 00:20:29.461 "compare_and_write": false, 00:20:29.461 "abort": true, 00:20:29.461 "seek_hole": false, 00:20:29.461 "seek_data": false, 00:20:29.461 "copy": true, 00:20:29.461 "nvme_iov_md": false 00:20:29.461 }, 00:20:29.461 "memory_domains": [ 00:20:29.461 { 00:20:29.461 "dma_device_id": "system", 00:20:29.461 "dma_device_type": 1 00:20:29.461 }, 00:20:29.461 { 00:20:29.461 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:29.461 "dma_device_type": 2 00:20:29.461 } 00:20:29.461 ], 00:20:29.461 "driver_specific": {} 00:20:29.461 } 00:20:29.461 ] 00:20:29.461 22:49:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:29.461 22:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:29.461 22:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:29.461 22:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:29.461 22:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:29.461 22:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:29.461 22:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:20:29.461 22:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:29.461 22:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:29.461 22:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:29.461 22:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:29.461 22:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:29.461 22:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:29.719 22:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:29.719 "name": "Existed_Raid", 00:20:29.719 "uuid": "9407d9de-4429-4228-a33f-527b961f3e64", 00:20:29.719 "strip_size_kb": 64, 00:20:29.719 "state": "online", 00:20:29.719 "raid_level": "concat", 00:20:29.719 "superblock": false, 00:20:29.719 "num_base_bdevs": 4, 00:20:29.719 "num_base_bdevs_discovered": 4, 00:20:29.719 "num_base_bdevs_operational": 4, 00:20:29.719 "base_bdevs_list": [ 00:20:29.719 { 00:20:29.719 "name": "NewBaseBdev", 00:20:29.719 "uuid": "28126cf3-f9c2-4fb8-9038-ab330c6ad944", 00:20:29.719 "is_configured": true, 00:20:29.719 "data_offset": 0, 00:20:29.719 "data_size": 65536 00:20:29.719 }, 00:20:29.719 { 00:20:29.719 "name": "BaseBdev2", 00:20:29.719 "uuid": "7d260ed9-6b5c-483a-8071-a43342890142", 00:20:29.719 "is_configured": true, 00:20:29.719 "data_offset": 0, 00:20:29.719 "data_size": 65536 00:20:29.719 }, 00:20:29.719 { 00:20:29.719 "name": "BaseBdev3", 00:20:29.719 "uuid": "238fc6b8-dbd6-4e6a-8a7e-481d00b98236", 00:20:29.719 "is_configured": true, 00:20:29.719 "data_offset": 0, 00:20:29.719 "data_size": 65536 00:20:29.719 }, 00:20:29.719 { 00:20:29.719 "name": "BaseBdev4", 
00:20:29.719 "uuid": "2ea6c798-8660-43e6-9c35-f3f7676f81ab", 00:20:29.719 "is_configured": true, 00:20:29.719 "data_offset": 0, 00:20:29.719 "data_size": 65536 00:20:29.719 } 00:20:29.719 ] 00:20:29.719 }' 00:20:29.719 22:49:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:29.719 22:49:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:30.284 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:20:30.284 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:30.284 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:30.284 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:30.284 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:30.284 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:30.284 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:30.284 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:30.542 [2024-07-15 22:49:15.269214] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:30.542 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:30.542 "name": "Existed_Raid", 00:20:30.542 "aliases": [ 00:20:30.542 "9407d9de-4429-4228-a33f-527b961f3e64" 00:20:30.542 ], 00:20:30.542 "product_name": "Raid Volume", 00:20:30.542 "block_size": 512, 00:20:30.542 "num_blocks": 262144, 00:20:30.542 "uuid": "9407d9de-4429-4228-a33f-527b961f3e64", 00:20:30.542 "assigned_rate_limits": { 00:20:30.542 "rw_ios_per_sec": 0, 00:20:30.542 
"rw_mbytes_per_sec": 0, 00:20:30.542 "r_mbytes_per_sec": 0, 00:20:30.542 "w_mbytes_per_sec": 0 00:20:30.542 }, 00:20:30.542 "claimed": false, 00:20:30.542 "zoned": false, 00:20:30.542 "supported_io_types": { 00:20:30.542 "read": true, 00:20:30.542 "write": true, 00:20:30.542 "unmap": true, 00:20:30.542 "flush": true, 00:20:30.542 "reset": true, 00:20:30.542 "nvme_admin": false, 00:20:30.542 "nvme_io": false, 00:20:30.542 "nvme_io_md": false, 00:20:30.542 "write_zeroes": true, 00:20:30.542 "zcopy": false, 00:20:30.542 "get_zone_info": false, 00:20:30.542 "zone_management": false, 00:20:30.542 "zone_append": false, 00:20:30.542 "compare": false, 00:20:30.542 "compare_and_write": false, 00:20:30.542 "abort": false, 00:20:30.542 "seek_hole": false, 00:20:30.542 "seek_data": false, 00:20:30.542 "copy": false, 00:20:30.542 "nvme_iov_md": false 00:20:30.542 }, 00:20:30.542 "memory_domains": [ 00:20:30.542 { 00:20:30.542 "dma_device_id": "system", 00:20:30.542 "dma_device_type": 1 00:20:30.542 }, 00:20:30.542 { 00:20:30.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:30.542 "dma_device_type": 2 00:20:30.542 }, 00:20:30.542 { 00:20:30.542 "dma_device_id": "system", 00:20:30.542 "dma_device_type": 1 00:20:30.542 }, 00:20:30.542 { 00:20:30.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:30.542 "dma_device_type": 2 00:20:30.542 }, 00:20:30.542 { 00:20:30.542 "dma_device_id": "system", 00:20:30.542 "dma_device_type": 1 00:20:30.542 }, 00:20:30.542 { 00:20:30.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:30.542 "dma_device_type": 2 00:20:30.542 }, 00:20:30.542 { 00:20:30.542 "dma_device_id": "system", 00:20:30.542 "dma_device_type": 1 00:20:30.542 }, 00:20:30.542 { 00:20:30.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:30.542 "dma_device_type": 2 00:20:30.542 } 00:20:30.542 ], 00:20:30.542 "driver_specific": { 00:20:30.542 "raid": { 00:20:30.542 "uuid": "9407d9de-4429-4228-a33f-527b961f3e64", 00:20:30.542 "strip_size_kb": 64, 00:20:30.542 "state": "online", 
00:20:30.542 "raid_level": "concat", 00:20:30.542 "superblock": false, 00:20:30.542 "num_base_bdevs": 4, 00:20:30.542 "num_base_bdevs_discovered": 4, 00:20:30.542 "num_base_bdevs_operational": 4, 00:20:30.542 "base_bdevs_list": [ 00:20:30.542 { 00:20:30.542 "name": "NewBaseBdev", 00:20:30.542 "uuid": "28126cf3-f9c2-4fb8-9038-ab330c6ad944", 00:20:30.542 "is_configured": true, 00:20:30.542 "data_offset": 0, 00:20:30.542 "data_size": 65536 00:20:30.542 }, 00:20:30.542 { 00:20:30.542 "name": "BaseBdev2", 00:20:30.542 "uuid": "7d260ed9-6b5c-483a-8071-a43342890142", 00:20:30.542 "is_configured": true, 00:20:30.542 "data_offset": 0, 00:20:30.542 "data_size": 65536 00:20:30.542 }, 00:20:30.542 { 00:20:30.542 "name": "BaseBdev3", 00:20:30.542 "uuid": "238fc6b8-dbd6-4e6a-8a7e-481d00b98236", 00:20:30.542 "is_configured": true, 00:20:30.542 "data_offset": 0, 00:20:30.542 "data_size": 65536 00:20:30.542 }, 00:20:30.542 { 00:20:30.542 "name": "BaseBdev4", 00:20:30.542 "uuid": "2ea6c798-8660-43e6-9c35-f3f7676f81ab", 00:20:30.542 "is_configured": true, 00:20:30.542 "data_offset": 0, 00:20:30.542 "data_size": 65536 00:20:30.542 } 00:20:30.542 ] 00:20:30.542 } 00:20:30.542 } 00:20:30.542 }' 00:20:30.542 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:30.542 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:20:30.542 BaseBdev2 00:20:30.542 BaseBdev3 00:20:30.542 BaseBdev4' 00:20:30.542 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:30.542 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:20:30.542 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:30.802 22:49:15 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:30.802 "name": "NewBaseBdev", 00:20:30.802 "aliases": [ 00:20:30.802 "28126cf3-f9c2-4fb8-9038-ab330c6ad944" 00:20:30.802 ], 00:20:30.802 "product_name": "Malloc disk", 00:20:30.802 "block_size": 512, 00:20:30.802 "num_blocks": 65536, 00:20:30.802 "uuid": "28126cf3-f9c2-4fb8-9038-ab330c6ad944", 00:20:30.802 "assigned_rate_limits": { 00:20:30.802 "rw_ios_per_sec": 0, 00:20:30.802 "rw_mbytes_per_sec": 0, 00:20:30.802 "r_mbytes_per_sec": 0, 00:20:30.802 "w_mbytes_per_sec": 0 00:20:30.802 }, 00:20:30.802 "claimed": true, 00:20:30.802 "claim_type": "exclusive_write", 00:20:30.802 "zoned": false, 00:20:30.802 "supported_io_types": { 00:20:30.802 "read": true, 00:20:30.802 "write": true, 00:20:30.802 "unmap": true, 00:20:30.802 "flush": true, 00:20:30.802 "reset": true, 00:20:30.802 "nvme_admin": false, 00:20:30.802 "nvme_io": false, 00:20:30.802 "nvme_io_md": false, 00:20:30.802 "write_zeroes": true, 00:20:30.802 "zcopy": true, 00:20:30.802 "get_zone_info": false, 00:20:30.802 "zone_management": false, 00:20:30.802 "zone_append": false, 00:20:30.802 "compare": false, 00:20:30.802 "compare_and_write": false, 00:20:30.802 "abort": true, 00:20:30.802 "seek_hole": false, 00:20:30.802 "seek_data": false, 00:20:30.802 "copy": true, 00:20:30.802 "nvme_iov_md": false 00:20:30.802 }, 00:20:30.802 "memory_domains": [ 00:20:30.802 { 00:20:30.802 "dma_device_id": "system", 00:20:30.802 "dma_device_type": 1 00:20:30.802 }, 00:20:30.802 { 00:20:30.802 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:30.802 "dma_device_type": 2 00:20:30.802 } 00:20:30.802 ], 00:20:30.802 "driver_specific": {} 00:20:30.802 }' 00:20:30.802 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:30.802 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:30.802 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 
== 512 ]] 00:20:30.802 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:31.061 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:31.061 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:31.061 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:31.061 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:31.061 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:31.061 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:31.061 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:31.061 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:31.061 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:31.061 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:31.061 22:49:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:31.320 22:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:31.320 "name": "BaseBdev2", 00:20:31.320 "aliases": [ 00:20:31.320 "7d260ed9-6b5c-483a-8071-a43342890142" 00:20:31.320 ], 00:20:31.320 "product_name": "Malloc disk", 00:20:31.320 "block_size": 512, 00:20:31.320 "num_blocks": 65536, 00:20:31.320 "uuid": "7d260ed9-6b5c-483a-8071-a43342890142", 00:20:31.320 "assigned_rate_limits": { 00:20:31.320 "rw_ios_per_sec": 0, 00:20:31.320 "rw_mbytes_per_sec": 0, 00:20:31.320 "r_mbytes_per_sec": 0, 00:20:31.320 "w_mbytes_per_sec": 0 00:20:31.320 }, 00:20:31.320 "claimed": true, 00:20:31.320 
"claim_type": "exclusive_write", 00:20:31.320 "zoned": false, 00:20:31.320 "supported_io_types": { 00:20:31.320 "read": true, 00:20:31.320 "write": true, 00:20:31.320 "unmap": true, 00:20:31.320 "flush": true, 00:20:31.320 "reset": true, 00:20:31.320 "nvme_admin": false, 00:20:31.320 "nvme_io": false, 00:20:31.320 "nvme_io_md": false, 00:20:31.320 "write_zeroes": true, 00:20:31.320 "zcopy": true, 00:20:31.320 "get_zone_info": false, 00:20:31.320 "zone_management": false, 00:20:31.320 "zone_append": false, 00:20:31.320 "compare": false, 00:20:31.320 "compare_and_write": false, 00:20:31.320 "abort": true, 00:20:31.320 "seek_hole": false, 00:20:31.320 "seek_data": false, 00:20:31.320 "copy": true, 00:20:31.320 "nvme_iov_md": false 00:20:31.320 }, 00:20:31.320 "memory_domains": [ 00:20:31.320 { 00:20:31.320 "dma_device_id": "system", 00:20:31.320 "dma_device_type": 1 00:20:31.320 }, 00:20:31.320 { 00:20:31.320 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:31.320 "dma_device_type": 2 00:20:31.320 } 00:20:31.320 ], 00:20:31.320 "driver_specific": {} 00:20:31.320 }' 00:20:31.320 22:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:31.578 22:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:31.578 22:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:31.578 22:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:31.578 22:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:31.578 22:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:31.578 22:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:31.578 22:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:31.578 22:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == 
null ]] 00:20:31.578 22:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:31.836 22:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:31.836 22:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:31.836 22:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:31.836 22:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:31.836 22:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:32.095 22:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:32.095 "name": "BaseBdev3", 00:20:32.095 "aliases": [ 00:20:32.095 "238fc6b8-dbd6-4e6a-8a7e-481d00b98236" 00:20:32.095 ], 00:20:32.095 "product_name": "Malloc disk", 00:20:32.095 "block_size": 512, 00:20:32.095 "num_blocks": 65536, 00:20:32.095 "uuid": "238fc6b8-dbd6-4e6a-8a7e-481d00b98236", 00:20:32.095 "assigned_rate_limits": { 00:20:32.095 "rw_ios_per_sec": 0, 00:20:32.095 "rw_mbytes_per_sec": 0, 00:20:32.095 "r_mbytes_per_sec": 0, 00:20:32.095 "w_mbytes_per_sec": 0 00:20:32.095 }, 00:20:32.095 "claimed": true, 00:20:32.095 "claim_type": "exclusive_write", 00:20:32.095 "zoned": false, 00:20:32.095 "supported_io_types": { 00:20:32.095 "read": true, 00:20:32.095 "write": true, 00:20:32.095 "unmap": true, 00:20:32.095 "flush": true, 00:20:32.095 "reset": true, 00:20:32.095 "nvme_admin": false, 00:20:32.095 "nvme_io": false, 00:20:32.095 "nvme_io_md": false, 00:20:32.095 "write_zeroes": true, 00:20:32.095 "zcopy": true, 00:20:32.096 "get_zone_info": false, 00:20:32.096 "zone_management": false, 00:20:32.096 "zone_append": false, 00:20:32.096 "compare": false, 00:20:32.096 "compare_and_write": false, 00:20:32.096 "abort": true, 00:20:32.096 
"seek_hole": false, 00:20:32.096 "seek_data": false, 00:20:32.096 "copy": true, 00:20:32.096 "nvme_iov_md": false 00:20:32.096 }, 00:20:32.096 "memory_domains": [ 00:20:32.096 { 00:20:32.096 "dma_device_id": "system", 00:20:32.096 "dma_device_type": 1 00:20:32.096 }, 00:20:32.096 { 00:20:32.096 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:32.096 "dma_device_type": 2 00:20:32.096 } 00:20:32.096 ], 00:20:32.096 "driver_specific": {} 00:20:32.096 }' 00:20:32.096 22:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:32.096 22:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:32.096 22:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:32.096 22:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:32.096 22:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:32.096 22:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:32.096 22:49:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:32.355 22:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:32.355 22:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:32.355 22:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:32.355 22:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:32.355 22:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:32.355 22:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:32.355 22:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev4 00:20:32.355 22:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:32.613 22:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:32.613 "name": "BaseBdev4", 00:20:32.613 "aliases": [ 00:20:32.613 "2ea6c798-8660-43e6-9c35-f3f7676f81ab" 00:20:32.613 ], 00:20:32.613 "product_name": "Malloc disk", 00:20:32.613 "block_size": 512, 00:20:32.613 "num_blocks": 65536, 00:20:32.613 "uuid": "2ea6c798-8660-43e6-9c35-f3f7676f81ab", 00:20:32.613 "assigned_rate_limits": { 00:20:32.613 "rw_ios_per_sec": 0, 00:20:32.613 "rw_mbytes_per_sec": 0, 00:20:32.613 "r_mbytes_per_sec": 0, 00:20:32.613 "w_mbytes_per_sec": 0 00:20:32.613 }, 00:20:32.613 "claimed": true, 00:20:32.613 "claim_type": "exclusive_write", 00:20:32.613 "zoned": false, 00:20:32.613 "supported_io_types": { 00:20:32.613 "read": true, 00:20:32.613 "write": true, 00:20:32.613 "unmap": true, 00:20:32.613 "flush": true, 00:20:32.613 "reset": true, 00:20:32.613 "nvme_admin": false, 00:20:32.613 "nvme_io": false, 00:20:32.613 "nvme_io_md": false, 00:20:32.613 "write_zeroes": true, 00:20:32.613 "zcopy": true, 00:20:32.613 "get_zone_info": false, 00:20:32.613 "zone_management": false, 00:20:32.613 "zone_append": false, 00:20:32.613 "compare": false, 00:20:32.613 "compare_and_write": false, 00:20:32.613 "abort": true, 00:20:32.613 "seek_hole": false, 00:20:32.613 "seek_data": false, 00:20:32.613 "copy": true, 00:20:32.613 "nvme_iov_md": false 00:20:32.613 }, 00:20:32.613 "memory_domains": [ 00:20:32.613 { 00:20:32.613 "dma_device_id": "system", 00:20:32.613 "dma_device_type": 1 00:20:32.613 }, 00:20:32.613 { 00:20:32.613 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:32.613 "dma_device_type": 2 00:20:32.613 } 00:20:32.613 ], 00:20:32.613 "driver_specific": {} 00:20:32.613 }' 00:20:32.613 22:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:32.613 22:49:17 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:32.613 22:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:32.613 22:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:32.613 22:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:32.872 22:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:32.872 22:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:32.872 22:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:32.872 22:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:32.872 22:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:32.872 22:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:32.872 22:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:32.872 22:49:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:33.130 [2024-07-15 22:49:17.956039] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:33.131 [2024-07-15 22:49:17.956071] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:33.131 [2024-07-15 22:49:17.956134] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:33.131 [2024-07-15 22:49:17.956195] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:33.131 [2024-07-15 22:49:17.956207] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22d9040 name Existed_Raid, state offline 00:20:33.131 22:49:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2776263 00:20:33.131 22:49:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2776263 ']' 00:20:33.131 22:49:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2776263 00:20:33.131 22:49:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:20:33.131 22:49:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:33.131 22:49:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2776263 00:20:33.131 22:49:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:33.131 22:49:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:33.131 22:49:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2776263' 00:20:33.131 killing process with pid 2776263 00:20:33.131 22:49:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2776263 00:20:33.131 [2024-07-15 22:49:18.024697] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:33.131 22:49:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2776263 00:20:33.389 [2024-07-15 22:49:18.068292] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:33.389 22:49:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:20:33.389 00:20:33.389 real 0m34.553s 00:20:33.389 user 1m3.468s 00:20:33.389 sys 0m6.107s 00:20:33.389 22:49:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:33.389 22:49:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:33.389 ************************************ 00:20:33.389 END TEST raid_state_function_test 
00:20:33.389 ************************************ 00:20:33.649 22:49:18 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:33.649 22:49:18 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:20:33.649 22:49:18 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:33.649 22:49:18 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:33.649 22:49:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:33.649 ************************************ 00:20:33.649 START TEST raid_state_function_test_sb 00:20:33.649 ************************************ 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 true 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 
00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2781316 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2781316' 00:20:33.649 Process raid pid: 2781316 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2781316 /var/tmp/spdk-raid.sock 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2781316 ']' 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:33.649 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:33.649 22:49:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:33.649 [2024-07-15 22:49:18.451650] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:20:33.649 [2024-07-15 22:49:18.451705] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:33.907 [2024-07-15 22:49:18.564670] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:33.907 [2024-07-15 22:49:18.671206] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:33.907 [2024-07-15 22:49:18.733421] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:33.907 [2024-07-15 22:49:18.733449] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:34.473 22:49:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:34.473 22:49:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:20:34.473 22:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:34.732 [2024-07-15 22:49:19.551864] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:34.732 [2024-07-15 22:49:19.551908] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:34.732 [2024-07-15 22:49:19.551918] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:34.732 [2024-07-15 22:49:19.551936] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:34.732 [2024-07-15 22:49:19.551945] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:34.732 [2024-07-15 22:49:19.551956] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist 
now 00:20:34.732 [2024-07-15 22:49:19.551965] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:34.732 [2024-07-15 22:49:19.551976] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:34.732 22:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:34.732 22:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:34.732 22:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:34.732 22:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:34.732 22:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:34.732 22:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:34.732 22:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:34.732 22:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:34.732 22:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:34.732 22:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:34.732 22:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.732 22:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:34.991 22:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:34.991 "name": "Existed_Raid", 00:20:34.991 "uuid": 
"5486b3d9-0327-4047-8762-7167c9d3e850", 00:20:34.991 "strip_size_kb": 64, 00:20:34.991 "state": "configuring", 00:20:34.991 "raid_level": "concat", 00:20:34.991 "superblock": true, 00:20:34.991 "num_base_bdevs": 4, 00:20:34.991 "num_base_bdevs_discovered": 0, 00:20:34.991 "num_base_bdevs_operational": 4, 00:20:34.991 "base_bdevs_list": [ 00:20:34.991 { 00:20:34.991 "name": "BaseBdev1", 00:20:34.991 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:34.991 "is_configured": false, 00:20:34.991 "data_offset": 0, 00:20:34.991 "data_size": 0 00:20:34.991 }, 00:20:34.991 { 00:20:34.991 "name": "BaseBdev2", 00:20:34.991 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:34.991 "is_configured": false, 00:20:34.991 "data_offset": 0, 00:20:34.991 "data_size": 0 00:20:34.991 }, 00:20:34.991 { 00:20:34.991 "name": "BaseBdev3", 00:20:34.991 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:34.991 "is_configured": false, 00:20:34.991 "data_offset": 0, 00:20:34.991 "data_size": 0 00:20:34.991 }, 00:20:34.991 { 00:20:34.991 "name": "BaseBdev4", 00:20:34.991 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:34.991 "is_configured": false, 00:20:34.991 "data_offset": 0, 00:20:34.991 "data_size": 0 00:20:34.991 } 00:20:34.991 ] 00:20:34.991 }' 00:20:34.991 22:49:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:34.991 22:49:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:35.560 22:49:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:35.818 [2024-07-15 22:49:20.662638] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:35.818 [2024-07-15 22:49:20.662672] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25f0aa0 name Existed_Raid, state configuring 00:20:35.818 22:49:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:36.076 [2024-07-15 22:49:20.911327] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:36.076 [2024-07-15 22:49:20.911359] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:36.076 [2024-07-15 22:49:20.911369] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:36.076 [2024-07-15 22:49:20.911381] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:36.077 [2024-07-15 22:49:20.911389] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:36.077 [2024-07-15 22:49:20.911400] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:36.077 [2024-07-15 22:49:20.911409] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:36.077 [2024-07-15 22:49:20.911420] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:36.077 22:49:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:36.336 [2024-07-15 22:49:21.169872] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:36.336 BaseBdev1 00:20:36.336 22:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:36.336 22:49:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:36.336 22:49:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 
-- # local bdev_timeout= 00:20:36.336 22:49:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:36.336 22:49:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:36.336 22:49:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:36.336 22:49:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:36.594 22:49:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:36.854 [ 00:20:36.854 { 00:20:36.854 "name": "BaseBdev1", 00:20:36.854 "aliases": [ 00:20:36.854 "55ff1e9d-f288-4055-9c57-a0a24ac9c447" 00:20:36.854 ], 00:20:36.854 "product_name": "Malloc disk", 00:20:36.854 "block_size": 512, 00:20:36.854 "num_blocks": 65536, 00:20:36.854 "uuid": "55ff1e9d-f288-4055-9c57-a0a24ac9c447", 00:20:36.854 "assigned_rate_limits": { 00:20:36.854 "rw_ios_per_sec": 0, 00:20:36.854 "rw_mbytes_per_sec": 0, 00:20:36.854 "r_mbytes_per_sec": 0, 00:20:36.854 "w_mbytes_per_sec": 0 00:20:36.854 }, 00:20:36.854 "claimed": true, 00:20:36.854 "claim_type": "exclusive_write", 00:20:36.854 "zoned": false, 00:20:36.854 "supported_io_types": { 00:20:36.854 "read": true, 00:20:36.854 "write": true, 00:20:36.854 "unmap": true, 00:20:36.854 "flush": true, 00:20:36.854 "reset": true, 00:20:36.854 "nvme_admin": false, 00:20:36.854 "nvme_io": false, 00:20:36.854 "nvme_io_md": false, 00:20:36.854 "write_zeroes": true, 00:20:36.854 "zcopy": true, 00:20:36.854 "get_zone_info": false, 00:20:36.854 "zone_management": false, 00:20:36.854 "zone_append": false, 00:20:36.854 "compare": false, 00:20:36.854 "compare_and_write": false, 00:20:36.854 "abort": true, 00:20:36.854 "seek_hole": 
false, 00:20:36.854 "seek_data": false, 00:20:36.854 "copy": true, 00:20:36.854 "nvme_iov_md": false 00:20:36.854 }, 00:20:36.854 "memory_domains": [ 00:20:36.854 { 00:20:36.854 "dma_device_id": "system", 00:20:36.854 "dma_device_type": 1 00:20:36.854 }, 00:20:36.854 { 00:20:36.854 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:36.854 "dma_device_type": 2 00:20:36.854 } 00:20:36.854 ], 00:20:36.854 "driver_specific": {} 00:20:36.854 } 00:20:36.854 ] 00:20:36.854 22:49:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:36.854 22:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:36.854 22:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:36.854 22:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:36.854 22:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:36.854 22:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:36.854 22:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:36.854 22:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:36.854 22:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:36.854 22:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:36.854 22:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:36.854 22:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:36.854 22:49:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:37.112 22:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:37.112 "name": "Existed_Raid", 00:20:37.112 "uuid": "2a3fc67f-6e79-456c-814e-d4c7e4eaf31a", 00:20:37.112 "strip_size_kb": 64, 00:20:37.112 "state": "configuring", 00:20:37.112 "raid_level": "concat", 00:20:37.112 "superblock": true, 00:20:37.112 "num_base_bdevs": 4, 00:20:37.112 "num_base_bdevs_discovered": 1, 00:20:37.112 "num_base_bdevs_operational": 4, 00:20:37.112 "base_bdevs_list": [ 00:20:37.112 { 00:20:37.112 "name": "BaseBdev1", 00:20:37.112 "uuid": "55ff1e9d-f288-4055-9c57-a0a24ac9c447", 00:20:37.112 "is_configured": true, 00:20:37.112 "data_offset": 2048, 00:20:37.112 "data_size": 63488 00:20:37.112 }, 00:20:37.112 { 00:20:37.112 "name": "BaseBdev2", 00:20:37.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:37.112 "is_configured": false, 00:20:37.112 "data_offset": 0, 00:20:37.112 "data_size": 0 00:20:37.112 }, 00:20:37.112 { 00:20:37.112 "name": "BaseBdev3", 00:20:37.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:37.112 "is_configured": false, 00:20:37.112 "data_offset": 0, 00:20:37.112 "data_size": 0 00:20:37.112 }, 00:20:37.112 { 00:20:37.112 "name": "BaseBdev4", 00:20:37.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:37.112 "is_configured": false, 00:20:37.112 "data_offset": 0, 00:20:37.112 "data_size": 0 00:20:37.112 } 00:20:37.112 ] 00:20:37.112 }' 00:20:37.112 22:49:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:37.112 22:49:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:37.674 22:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:37.931 [2024-07-15 
22:49:22.669904] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:37.931 [2024-07-15 22:49:22.669953] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25f0310 name Existed_Raid, state configuring 00:20:37.931 22:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:38.188 [2024-07-15 22:49:22.846426] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:38.188 [2024-07-15 22:49:22.847889] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:38.188 [2024-07-15 22:49:22.847921] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:38.188 [2024-07-15 22:49:22.847943] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:38.188 [2024-07-15 22:49:22.847956] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:38.188 [2024-07-15 22:49:22.847965] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:38.188 [2024-07-15 22:49:22.847976] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:38.188 22:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:38.188 22:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:38.188 22:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:38.188 22:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:38.188 22:49:22 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:38.188 22:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:38.188 22:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:38.188 22:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:38.188 22:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:38.188 22:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:38.188 22:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:38.188 22:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:38.188 22:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:38.188 22:49:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:38.188 22:49:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:38.188 "name": "Existed_Raid", 00:20:38.188 "uuid": "3817182f-1d23-4ac3-ae67-02643128bf11", 00:20:38.188 "strip_size_kb": 64, 00:20:38.188 "state": "configuring", 00:20:38.188 "raid_level": "concat", 00:20:38.188 "superblock": true, 00:20:38.188 "num_base_bdevs": 4, 00:20:38.188 "num_base_bdevs_discovered": 1, 00:20:38.188 "num_base_bdevs_operational": 4, 00:20:38.188 "base_bdevs_list": [ 00:20:38.188 { 00:20:38.188 "name": "BaseBdev1", 00:20:38.188 "uuid": "55ff1e9d-f288-4055-9c57-a0a24ac9c447", 00:20:38.188 "is_configured": true, 00:20:38.188 "data_offset": 2048, 00:20:38.188 "data_size": 63488 00:20:38.188 }, 00:20:38.188 { 00:20:38.188 "name": "BaseBdev2", 00:20:38.188 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:20:38.188 "is_configured": false, 00:20:38.188 "data_offset": 0, 00:20:38.188 "data_size": 0 00:20:38.188 }, 00:20:38.188 { 00:20:38.188 "name": "BaseBdev3", 00:20:38.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:38.188 "is_configured": false, 00:20:38.188 "data_offset": 0, 00:20:38.188 "data_size": 0 00:20:38.188 }, 00:20:38.188 { 00:20:38.188 "name": "BaseBdev4", 00:20:38.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:38.188 "is_configured": false, 00:20:38.188 "data_offset": 0, 00:20:38.188 "data_size": 0 00:20:38.188 } 00:20:38.188 ] 00:20:38.188 }' 00:20:38.188 22:49:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:38.188 22:49:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:39.118 22:49:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:39.118 [2024-07-15 22:49:23.824578] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:39.118 BaseBdev2 00:20:39.118 22:49:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:39.118 22:49:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:39.118 22:49:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:39.118 22:49:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:39.118 22:49:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:39.118 22:49:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:39.118 22:49:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:39.375 22:49:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:39.641 [ 00:20:39.641 { 00:20:39.641 "name": "BaseBdev2", 00:20:39.641 "aliases": [ 00:20:39.641 "ec20cf87-35e9-4ce9-a1cd-7be5b6013a06" 00:20:39.641 ], 00:20:39.641 "product_name": "Malloc disk", 00:20:39.641 "block_size": 512, 00:20:39.641 "num_blocks": 65536, 00:20:39.641 "uuid": "ec20cf87-35e9-4ce9-a1cd-7be5b6013a06", 00:20:39.641 "assigned_rate_limits": { 00:20:39.641 "rw_ios_per_sec": 0, 00:20:39.641 "rw_mbytes_per_sec": 0, 00:20:39.641 "r_mbytes_per_sec": 0, 00:20:39.641 "w_mbytes_per_sec": 0 00:20:39.641 }, 00:20:39.641 "claimed": true, 00:20:39.641 "claim_type": "exclusive_write", 00:20:39.641 "zoned": false, 00:20:39.641 "supported_io_types": { 00:20:39.641 "read": true, 00:20:39.641 "write": true, 00:20:39.641 "unmap": true, 00:20:39.641 "flush": true, 00:20:39.641 "reset": true, 00:20:39.641 "nvme_admin": false, 00:20:39.641 "nvme_io": false, 00:20:39.641 "nvme_io_md": false, 00:20:39.641 "write_zeroes": true, 00:20:39.641 "zcopy": true, 00:20:39.641 "get_zone_info": false, 00:20:39.641 "zone_management": false, 00:20:39.641 "zone_append": false, 00:20:39.641 "compare": false, 00:20:39.641 "compare_and_write": false, 00:20:39.641 "abort": true, 00:20:39.641 "seek_hole": false, 00:20:39.641 "seek_data": false, 00:20:39.641 "copy": true, 00:20:39.641 "nvme_iov_md": false 00:20:39.641 }, 00:20:39.641 "memory_domains": [ 00:20:39.641 { 00:20:39.641 "dma_device_id": "system", 00:20:39.641 "dma_device_type": 1 00:20:39.641 }, 00:20:39.641 { 00:20:39.641 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:39.641 "dma_device_type": 2 00:20:39.641 } 00:20:39.641 ], 00:20:39.641 "driver_specific": {} 00:20:39.641 } 00:20:39.641 ] 
00:20:39.641 22:49:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:39.641 22:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:39.641 22:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:39.641 22:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:39.641 22:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:39.641 22:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:39.641 22:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:39.641 22:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:39.641 22:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:39.641 22:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:39.641 22:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:39.641 22:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:39.641 22:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:39.641 22:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.641 22:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:39.904 22:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:39.904 "name": "Existed_Raid", 
00:20:39.904 "uuid": "3817182f-1d23-4ac3-ae67-02643128bf11", 00:20:39.904 "strip_size_kb": 64, 00:20:39.904 "state": "configuring", 00:20:39.904 "raid_level": "concat", 00:20:39.904 "superblock": true, 00:20:39.904 "num_base_bdevs": 4, 00:20:39.904 "num_base_bdevs_discovered": 2, 00:20:39.904 "num_base_bdevs_operational": 4, 00:20:39.904 "base_bdevs_list": [ 00:20:39.904 { 00:20:39.904 "name": "BaseBdev1", 00:20:39.904 "uuid": "55ff1e9d-f288-4055-9c57-a0a24ac9c447", 00:20:39.904 "is_configured": true, 00:20:39.904 "data_offset": 2048, 00:20:39.904 "data_size": 63488 00:20:39.904 }, 00:20:39.904 { 00:20:39.904 "name": "BaseBdev2", 00:20:39.904 "uuid": "ec20cf87-35e9-4ce9-a1cd-7be5b6013a06", 00:20:39.904 "is_configured": true, 00:20:39.904 "data_offset": 2048, 00:20:39.904 "data_size": 63488 00:20:39.904 }, 00:20:39.904 { 00:20:39.904 "name": "BaseBdev3", 00:20:39.904 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:39.904 "is_configured": false, 00:20:39.904 "data_offset": 0, 00:20:39.904 "data_size": 0 00:20:39.904 }, 00:20:39.904 { 00:20:39.904 "name": "BaseBdev4", 00:20:39.904 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:39.904 "is_configured": false, 00:20:39.904 "data_offset": 0, 00:20:39.904 "data_size": 0 00:20:39.904 } 00:20:39.904 ] 00:20:39.904 }' 00:20:39.904 22:49:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:39.904 22:49:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:40.468 22:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:40.726 [2024-07-15 22:49:25.388142] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:40.726 BaseBdev3 00:20:40.726 22:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:40.726 
22:49:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:40.726 22:49:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:40.726 22:49:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:40.726 22:49:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:40.726 22:49:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:40.726 22:49:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:40.726 22:49:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:40.983 [ 00:20:40.983 { 00:20:40.983 "name": "BaseBdev3", 00:20:40.983 "aliases": [ 00:20:40.983 "98d4bd30-768b-4ee7-b7f5-6e8775b89047" 00:20:40.983 ], 00:20:40.983 "product_name": "Malloc disk", 00:20:40.983 "block_size": 512, 00:20:40.983 "num_blocks": 65536, 00:20:40.983 "uuid": "98d4bd30-768b-4ee7-b7f5-6e8775b89047", 00:20:40.983 "assigned_rate_limits": { 00:20:40.983 "rw_ios_per_sec": 0, 00:20:40.983 "rw_mbytes_per_sec": 0, 00:20:40.983 "r_mbytes_per_sec": 0, 00:20:40.983 "w_mbytes_per_sec": 0 00:20:40.983 }, 00:20:40.983 "claimed": true, 00:20:40.983 "claim_type": "exclusive_write", 00:20:40.983 "zoned": false, 00:20:40.983 "supported_io_types": { 00:20:40.983 "read": true, 00:20:40.983 "write": true, 00:20:40.983 "unmap": true, 00:20:40.983 "flush": true, 00:20:40.983 "reset": true, 00:20:40.983 "nvme_admin": false, 00:20:40.983 "nvme_io": false, 00:20:40.983 "nvme_io_md": false, 00:20:40.983 "write_zeroes": true, 00:20:40.983 "zcopy": true, 00:20:40.983 "get_zone_info": 
false, 00:20:40.983 "zone_management": false, 00:20:40.983 "zone_append": false, 00:20:40.983 "compare": false, 00:20:40.983 "compare_and_write": false, 00:20:40.983 "abort": true, 00:20:40.983 "seek_hole": false, 00:20:40.983 "seek_data": false, 00:20:40.983 "copy": true, 00:20:40.983 "nvme_iov_md": false 00:20:40.983 }, 00:20:40.983 "memory_domains": [ 00:20:40.983 { 00:20:40.983 "dma_device_id": "system", 00:20:40.983 "dma_device_type": 1 00:20:40.983 }, 00:20:40.983 { 00:20:40.983 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:40.983 "dma_device_type": 2 00:20:40.983 } 00:20:40.983 ], 00:20:40.983 "driver_specific": {} 00:20:40.983 } 00:20:40.983 ] 00:20:40.983 22:49:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:40.983 22:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:40.983 22:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:40.983 22:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:40.983 22:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:40.983 22:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:40.983 22:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:40.983 22:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:40.983 22:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:40.983 22:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:40.983 22:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:40.983 22:49:25 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:40.983 22:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:40.983 22:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:40.983 22:49:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:41.240 22:49:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:41.240 "name": "Existed_Raid", 00:20:41.240 "uuid": "3817182f-1d23-4ac3-ae67-02643128bf11", 00:20:41.240 "strip_size_kb": 64, 00:20:41.240 "state": "configuring", 00:20:41.240 "raid_level": "concat", 00:20:41.240 "superblock": true, 00:20:41.240 "num_base_bdevs": 4, 00:20:41.240 "num_base_bdevs_discovered": 3, 00:20:41.240 "num_base_bdevs_operational": 4, 00:20:41.240 "base_bdevs_list": [ 00:20:41.240 { 00:20:41.240 "name": "BaseBdev1", 00:20:41.240 "uuid": "55ff1e9d-f288-4055-9c57-a0a24ac9c447", 00:20:41.240 "is_configured": true, 00:20:41.240 "data_offset": 2048, 00:20:41.240 "data_size": 63488 00:20:41.240 }, 00:20:41.240 { 00:20:41.240 "name": "BaseBdev2", 00:20:41.240 "uuid": "ec20cf87-35e9-4ce9-a1cd-7be5b6013a06", 00:20:41.240 "is_configured": true, 00:20:41.240 "data_offset": 2048, 00:20:41.240 "data_size": 63488 00:20:41.240 }, 00:20:41.240 { 00:20:41.240 "name": "BaseBdev3", 00:20:41.240 "uuid": "98d4bd30-768b-4ee7-b7f5-6e8775b89047", 00:20:41.240 "is_configured": true, 00:20:41.240 "data_offset": 2048, 00:20:41.240 "data_size": 63488 00:20:41.240 }, 00:20:41.240 { 00:20:41.240 "name": "BaseBdev4", 00:20:41.240 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:41.240 "is_configured": false, 00:20:41.240 "data_offset": 0, 00:20:41.240 "data_size": 0 00:20:41.241 } 00:20:41.241 ] 00:20:41.241 }' 00:20:41.241 
22:49:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:41.241 22:49:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:41.805 22:49:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:42.063 [2024-07-15 22:49:26.800456] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:42.063 [2024-07-15 22:49:26.800679] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25f1350 00:20:42.063 [2024-07-15 22:49:26.800702] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:42.063 [2024-07-15 22:49:26.800921] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25f1020 00:20:42.063 [2024-07-15 22:49:26.801086] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25f1350 00:20:42.063 [2024-07-15 22:49:26.801106] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x25f1350 00:20:42.063 [2024-07-15 22:49:26.801223] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:42.063 BaseBdev4 00:20:42.063 22:49:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:42.063 22:49:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:42.063 22:49:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:42.063 22:49:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:42.063 22:49:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:42.063 22:49:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:20:42.063 22:49:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:42.321 22:49:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:42.321 [ 00:20:42.321 { 00:20:42.321 "name": "BaseBdev4", 00:20:42.321 "aliases": [ 00:20:42.321 "f2d7c1de-ba8d-4f95-9db8-326892a45e3e" 00:20:42.321 ], 00:20:42.321 "product_name": "Malloc disk", 00:20:42.321 "block_size": 512, 00:20:42.321 "num_blocks": 65536, 00:20:42.321 "uuid": "f2d7c1de-ba8d-4f95-9db8-326892a45e3e", 00:20:42.321 "assigned_rate_limits": { 00:20:42.321 "rw_ios_per_sec": 0, 00:20:42.321 "rw_mbytes_per_sec": 0, 00:20:42.321 "r_mbytes_per_sec": 0, 00:20:42.321 "w_mbytes_per_sec": 0 00:20:42.321 }, 00:20:42.321 "claimed": true, 00:20:42.321 "claim_type": "exclusive_write", 00:20:42.321 "zoned": false, 00:20:42.321 "supported_io_types": { 00:20:42.321 "read": true, 00:20:42.321 "write": true, 00:20:42.321 "unmap": true, 00:20:42.321 "flush": true, 00:20:42.321 "reset": true, 00:20:42.321 "nvme_admin": false, 00:20:42.321 "nvme_io": false, 00:20:42.321 "nvme_io_md": false, 00:20:42.321 "write_zeroes": true, 00:20:42.321 "zcopy": true, 00:20:42.321 "get_zone_info": false, 00:20:42.321 "zone_management": false, 00:20:42.321 "zone_append": false, 00:20:42.321 "compare": false, 00:20:42.321 "compare_and_write": false, 00:20:42.321 "abort": true, 00:20:42.321 "seek_hole": false, 00:20:42.321 "seek_data": false, 00:20:42.321 "copy": true, 00:20:42.321 "nvme_iov_md": false 00:20:42.321 }, 00:20:42.321 "memory_domains": [ 00:20:42.321 { 00:20:42.321 "dma_device_id": "system", 00:20:42.321 "dma_device_type": 1 00:20:42.321 }, 00:20:42.321 { 00:20:42.321 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:42.321 
"dma_device_type": 2 00:20:42.321 } 00:20:42.321 ], 00:20:42.321 "driver_specific": {} 00:20:42.321 } 00:20:42.321 ] 00:20:42.321 22:49:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:42.321 22:49:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:42.321 22:49:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:42.321 22:49:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:42.321 22:49:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:42.321 22:49:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:42.321 22:49:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:42.321 22:49:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:42.321 22:49:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:42.321 22:49:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:42.321 22:49:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:42.321 22:49:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:42.321 22:49:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:42.321 22:49:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:42.321 22:49:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:42.888 22:49:27 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:42.888 "name": "Existed_Raid", 00:20:42.888 "uuid": "3817182f-1d23-4ac3-ae67-02643128bf11", 00:20:42.888 "strip_size_kb": 64, 00:20:42.888 "state": "online", 00:20:42.888 "raid_level": "concat", 00:20:42.888 "superblock": true, 00:20:42.888 "num_base_bdevs": 4, 00:20:42.888 "num_base_bdevs_discovered": 4, 00:20:42.888 "num_base_bdevs_operational": 4, 00:20:42.888 "base_bdevs_list": [ 00:20:42.888 { 00:20:42.888 "name": "BaseBdev1", 00:20:42.888 "uuid": "55ff1e9d-f288-4055-9c57-a0a24ac9c447", 00:20:42.888 "is_configured": true, 00:20:42.888 "data_offset": 2048, 00:20:42.888 "data_size": 63488 00:20:42.888 }, 00:20:42.888 { 00:20:42.888 "name": "BaseBdev2", 00:20:42.888 "uuid": "ec20cf87-35e9-4ce9-a1cd-7be5b6013a06", 00:20:42.888 "is_configured": true, 00:20:42.888 "data_offset": 2048, 00:20:42.888 "data_size": 63488 00:20:42.888 }, 00:20:42.888 { 00:20:42.888 "name": "BaseBdev3", 00:20:42.888 "uuid": "98d4bd30-768b-4ee7-b7f5-6e8775b89047", 00:20:42.888 "is_configured": true, 00:20:42.888 "data_offset": 2048, 00:20:42.888 "data_size": 63488 00:20:42.888 }, 00:20:42.888 { 00:20:42.888 "name": "BaseBdev4", 00:20:42.888 "uuid": "f2d7c1de-ba8d-4f95-9db8-326892a45e3e", 00:20:42.888 "is_configured": true, 00:20:42.888 "data_offset": 2048, 00:20:42.888 "data_size": 63488 00:20:42.888 } 00:20:42.888 ] 00:20:42.888 }' 00:20:42.888 22:49:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:42.888 22:49:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:43.454 22:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:20:43.454 22:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:43.454 22:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 
00:20:43.454 22:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:43.454 22:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:43.454 22:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:20:43.454 22:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:43.454 22:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:43.713 [2024-07-15 22:49:28.453186] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:43.713 22:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:43.713 "name": "Existed_Raid", 00:20:43.713 "aliases": [ 00:20:43.713 "3817182f-1d23-4ac3-ae67-02643128bf11" 00:20:43.713 ], 00:20:43.713 "product_name": "Raid Volume", 00:20:43.713 "block_size": 512, 00:20:43.713 "num_blocks": 253952, 00:20:43.713 "uuid": "3817182f-1d23-4ac3-ae67-02643128bf11", 00:20:43.713 "assigned_rate_limits": { 00:20:43.713 "rw_ios_per_sec": 0, 00:20:43.713 "rw_mbytes_per_sec": 0, 00:20:43.713 "r_mbytes_per_sec": 0, 00:20:43.713 "w_mbytes_per_sec": 0 00:20:43.713 }, 00:20:43.713 "claimed": false, 00:20:43.713 "zoned": false, 00:20:43.713 "supported_io_types": { 00:20:43.713 "read": true, 00:20:43.713 "write": true, 00:20:43.713 "unmap": true, 00:20:43.713 "flush": true, 00:20:43.713 "reset": true, 00:20:43.713 "nvme_admin": false, 00:20:43.713 "nvme_io": false, 00:20:43.713 "nvme_io_md": false, 00:20:43.713 "write_zeroes": true, 00:20:43.713 "zcopy": false, 00:20:43.713 "get_zone_info": false, 00:20:43.713 "zone_management": false, 00:20:43.713 "zone_append": false, 00:20:43.713 "compare": false, 00:20:43.713 "compare_and_write": false, 00:20:43.713 "abort": false, 00:20:43.713 "seek_hole": 
false, 00:20:43.713 "seek_data": false, 00:20:43.713 "copy": false, 00:20:43.713 "nvme_iov_md": false 00:20:43.713 }, 00:20:43.713 "memory_domains": [ 00:20:43.713 { 00:20:43.713 "dma_device_id": "system", 00:20:43.713 "dma_device_type": 1 00:20:43.713 }, 00:20:43.713 { 00:20:43.713 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:43.713 "dma_device_type": 2 00:20:43.713 }, 00:20:43.713 { 00:20:43.713 "dma_device_id": "system", 00:20:43.713 "dma_device_type": 1 00:20:43.713 }, 00:20:43.713 { 00:20:43.713 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:43.713 "dma_device_type": 2 00:20:43.713 }, 00:20:43.713 { 00:20:43.713 "dma_device_id": "system", 00:20:43.713 "dma_device_type": 1 00:20:43.713 }, 00:20:43.713 { 00:20:43.713 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:43.713 "dma_device_type": 2 00:20:43.713 }, 00:20:43.713 { 00:20:43.713 "dma_device_id": "system", 00:20:43.713 "dma_device_type": 1 00:20:43.713 }, 00:20:43.713 { 00:20:43.713 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:43.713 "dma_device_type": 2 00:20:43.713 } 00:20:43.713 ], 00:20:43.713 "driver_specific": { 00:20:43.713 "raid": { 00:20:43.713 "uuid": "3817182f-1d23-4ac3-ae67-02643128bf11", 00:20:43.713 "strip_size_kb": 64, 00:20:43.713 "state": "online", 00:20:43.713 "raid_level": "concat", 00:20:43.713 "superblock": true, 00:20:43.713 "num_base_bdevs": 4, 00:20:43.713 "num_base_bdevs_discovered": 4, 00:20:43.713 "num_base_bdevs_operational": 4, 00:20:43.713 "base_bdevs_list": [ 00:20:43.713 { 00:20:43.713 "name": "BaseBdev1", 00:20:43.713 "uuid": "55ff1e9d-f288-4055-9c57-a0a24ac9c447", 00:20:43.713 "is_configured": true, 00:20:43.713 "data_offset": 2048, 00:20:43.713 "data_size": 63488 00:20:43.713 }, 00:20:43.713 { 00:20:43.713 "name": "BaseBdev2", 00:20:43.713 "uuid": "ec20cf87-35e9-4ce9-a1cd-7be5b6013a06", 00:20:43.713 "is_configured": true, 00:20:43.713 "data_offset": 2048, 00:20:43.713 "data_size": 63488 00:20:43.713 }, 00:20:43.713 { 00:20:43.713 "name": "BaseBdev3", 00:20:43.713 
"uuid": "98d4bd30-768b-4ee7-b7f5-6e8775b89047", 00:20:43.713 "is_configured": true, 00:20:43.713 "data_offset": 2048, 00:20:43.713 "data_size": 63488 00:20:43.713 }, 00:20:43.713 { 00:20:43.714 "name": "BaseBdev4", 00:20:43.714 "uuid": "f2d7c1de-ba8d-4f95-9db8-326892a45e3e", 00:20:43.714 "is_configured": true, 00:20:43.714 "data_offset": 2048, 00:20:43.714 "data_size": 63488 00:20:43.714 } 00:20:43.714 ] 00:20:43.714 } 00:20:43.714 } 00:20:43.714 }' 00:20:43.714 22:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:43.714 22:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:43.714 BaseBdev2 00:20:43.714 BaseBdev3 00:20:43.714 BaseBdev4' 00:20:43.714 22:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:43.714 22:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:43.714 22:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:43.972 22:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:43.972 "name": "BaseBdev1", 00:20:43.972 "aliases": [ 00:20:43.972 "55ff1e9d-f288-4055-9c57-a0a24ac9c447" 00:20:43.972 ], 00:20:43.972 "product_name": "Malloc disk", 00:20:43.972 "block_size": 512, 00:20:43.972 "num_blocks": 65536, 00:20:43.972 "uuid": "55ff1e9d-f288-4055-9c57-a0a24ac9c447", 00:20:43.972 "assigned_rate_limits": { 00:20:43.972 "rw_ios_per_sec": 0, 00:20:43.972 "rw_mbytes_per_sec": 0, 00:20:43.972 "r_mbytes_per_sec": 0, 00:20:43.972 "w_mbytes_per_sec": 0 00:20:43.972 }, 00:20:43.972 "claimed": true, 00:20:43.972 "claim_type": "exclusive_write", 00:20:43.972 "zoned": false, 00:20:43.972 "supported_io_types": { 
00:20:43.972 "read": true, 00:20:43.972 "write": true, 00:20:43.972 "unmap": true, 00:20:43.972 "flush": true, 00:20:43.972 "reset": true, 00:20:43.972 "nvme_admin": false, 00:20:43.972 "nvme_io": false, 00:20:43.972 "nvme_io_md": false, 00:20:43.972 "write_zeroes": true, 00:20:43.972 "zcopy": true, 00:20:43.972 "get_zone_info": false, 00:20:43.972 "zone_management": false, 00:20:43.972 "zone_append": false, 00:20:43.972 "compare": false, 00:20:43.972 "compare_and_write": false, 00:20:43.972 "abort": true, 00:20:43.972 "seek_hole": false, 00:20:43.972 "seek_data": false, 00:20:43.972 "copy": true, 00:20:43.972 "nvme_iov_md": false 00:20:43.972 }, 00:20:43.972 "memory_domains": [ 00:20:43.972 { 00:20:43.972 "dma_device_id": "system", 00:20:43.972 "dma_device_type": 1 00:20:43.972 }, 00:20:43.972 { 00:20:43.972 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:43.972 "dma_device_type": 2 00:20:43.972 } 00:20:43.972 ], 00:20:43.972 "driver_specific": {} 00:20:43.972 }' 00:20:43.972 22:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:43.972 22:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:44.243 22:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:44.243 22:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:44.243 22:49:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:44.243 22:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:44.243 22:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:44.243 22:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:44.557 22:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:44.557 22:49:29 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:44.557 22:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:44.557 22:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:44.557 22:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:44.557 22:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:44.557 22:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:45.124 22:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:45.124 "name": "BaseBdev2", 00:20:45.124 "aliases": [ 00:20:45.124 "ec20cf87-35e9-4ce9-a1cd-7be5b6013a06" 00:20:45.124 ], 00:20:45.124 "product_name": "Malloc disk", 00:20:45.124 "block_size": 512, 00:20:45.124 "num_blocks": 65536, 00:20:45.124 "uuid": "ec20cf87-35e9-4ce9-a1cd-7be5b6013a06", 00:20:45.124 "assigned_rate_limits": { 00:20:45.124 "rw_ios_per_sec": 0, 00:20:45.124 "rw_mbytes_per_sec": 0, 00:20:45.124 "r_mbytes_per_sec": 0, 00:20:45.124 "w_mbytes_per_sec": 0 00:20:45.124 }, 00:20:45.124 "claimed": true, 00:20:45.124 "claim_type": "exclusive_write", 00:20:45.124 "zoned": false, 00:20:45.124 "supported_io_types": { 00:20:45.124 "read": true, 00:20:45.124 "write": true, 00:20:45.124 "unmap": true, 00:20:45.124 "flush": true, 00:20:45.124 "reset": true, 00:20:45.124 "nvme_admin": false, 00:20:45.124 "nvme_io": false, 00:20:45.124 "nvme_io_md": false, 00:20:45.124 "write_zeroes": true, 00:20:45.124 "zcopy": true, 00:20:45.124 "get_zone_info": false, 00:20:45.124 "zone_management": false, 00:20:45.124 "zone_append": false, 00:20:45.124 "compare": false, 00:20:45.124 "compare_and_write": false, 00:20:45.124 "abort": true, 00:20:45.124 "seek_hole": false, 00:20:45.124 "seek_data": 
false, 00:20:45.124 "copy": true, 00:20:45.124 "nvme_iov_md": false 00:20:45.124 }, 00:20:45.124 "memory_domains": [ 00:20:45.124 { 00:20:45.124 "dma_device_id": "system", 00:20:45.124 "dma_device_type": 1 00:20:45.124 }, 00:20:45.124 { 00:20:45.124 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:45.124 "dma_device_type": 2 00:20:45.124 } 00:20:45.125 ], 00:20:45.125 "driver_specific": {} 00:20:45.125 }' 00:20:45.125 22:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:45.125 22:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:45.125 22:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:45.125 22:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:45.125 22:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:45.125 22:49:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:45.125 22:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:45.384 22:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:45.384 22:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:45.384 22:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:45.384 22:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:45.384 22:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:45.384 22:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:45.384 22:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 00:20:45.384 22:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:45.643 22:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:45.643 "name": "BaseBdev3", 00:20:45.643 "aliases": [ 00:20:45.643 "98d4bd30-768b-4ee7-b7f5-6e8775b89047" 00:20:45.643 ], 00:20:45.643 "product_name": "Malloc disk", 00:20:45.643 "block_size": 512, 00:20:45.644 "num_blocks": 65536, 00:20:45.644 "uuid": "98d4bd30-768b-4ee7-b7f5-6e8775b89047", 00:20:45.644 "assigned_rate_limits": { 00:20:45.644 "rw_ios_per_sec": 0, 00:20:45.644 "rw_mbytes_per_sec": 0, 00:20:45.644 "r_mbytes_per_sec": 0, 00:20:45.644 "w_mbytes_per_sec": 0 00:20:45.644 }, 00:20:45.644 "claimed": true, 00:20:45.644 "claim_type": "exclusive_write", 00:20:45.644 "zoned": false, 00:20:45.644 "supported_io_types": { 00:20:45.644 "read": true, 00:20:45.644 "write": true, 00:20:45.644 "unmap": true, 00:20:45.644 "flush": true, 00:20:45.644 "reset": true, 00:20:45.644 "nvme_admin": false, 00:20:45.644 "nvme_io": false, 00:20:45.644 "nvme_io_md": false, 00:20:45.644 "write_zeroes": true, 00:20:45.644 "zcopy": true, 00:20:45.644 "get_zone_info": false, 00:20:45.644 "zone_management": false, 00:20:45.644 "zone_append": false, 00:20:45.644 "compare": false, 00:20:45.644 "compare_and_write": false, 00:20:45.644 "abort": true, 00:20:45.644 "seek_hole": false, 00:20:45.644 "seek_data": false, 00:20:45.644 "copy": true, 00:20:45.644 "nvme_iov_md": false 00:20:45.644 }, 00:20:45.644 "memory_domains": [ 00:20:45.644 { 00:20:45.644 "dma_device_id": "system", 00:20:45.644 "dma_device_type": 1 00:20:45.644 }, 00:20:45.644 { 00:20:45.644 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:45.644 "dma_device_type": 2 00:20:45.644 } 00:20:45.644 ], 00:20:45.644 "driver_specific": {} 00:20:45.644 }' 00:20:45.644 22:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:45.644 22:49:30 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:45.902 22:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:45.902 22:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:45.902 22:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:45.902 22:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:45.902 22:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:45.902 22:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:45.902 22:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:45.902 22:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:46.160 22:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:46.160 22:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:46.160 22:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:46.160 22:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:46.160 22:49:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:46.728 22:49:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:46.728 "name": "BaseBdev4", 00:20:46.728 "aliases": [ 00:20:46.728 "f2d7c1de-ba8d-4f95-9db8-326892a45e3e" 00:20:46.728 ], 00:20:46.728 "product_name": "Malloc disk", 00:20:46.728 "block_size": 512, 00:20:46.728 "num_blocks": 65536, 00:20:46.728 "uuid": "f2d7c1de-ba8d-4f95-9db8-326892a45e3e", 00:20:46.728 "assigned_rate_limits": { 00:20:46.728 
"rw_ios_per_sec": 0, 00:20:46.728 "rw_mbytes_per_sec": 0, 00:20:46.728 "r_mbytes_per_sec": 0, 00:20:46.728 "w_mbytes_per_sec": 0 00:20:46.728 }, 00:20:46.728 "claimed": true, 00:20:46.728 "claim_type": "exclusive_write", 00:20:46.728 "zoned": false, 00:20:46.728 "supported_io_types": { 00:20:46.728 "read": true, 00:20:46.728 "write": true, 00:20:46.728 "unmap": true, 00:20:46.728 "flush": true, 00:20:46.728 "reset": true, 00:20:46.728 "nvme_admin": false, 00:20:46.728 "nvme_io": false, 00:20:46.728 "nvme_io_md": false, 00:20:46.728 "write_zeroes": true, 00:20:46.728 "zcopy": true, 00:20:46.728 "get_zone_info": false, 00:20:46.728 "zone_management": false, 00:20:46.728 "zone_append": false, 00:20:46.728 "compare": false, 00:20:46.728 "compare_and_write": false, 00:20:46.728 "abort": true, 00:20:46.728 "seek_hole": false, 00:20:46.728 "seek_data": false, 00:20:46.728 "copy": true, 00:20:46.728 "nvme_iov_md": false 00:20:46.728 }, 00:20:46.728 "memory_domains": [ 00:20:46.728 { 00:20:46.728 "dma_device_id": "system", 00:20:46.728 "dma_device_type": 1 00:20:46.728 }, 00:20:46.728 { 00:20:46.728 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:46.728 "dma_device_type": 2 00:20:46.728 } 00:20:46.728 ], 00:20:46.728 "driver_specific": {} 00:20:46.728 }' 00:20:46.728 22:49:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:46.728 22:49:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:46.728 22:49:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:46.728 22:49:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:46.728 22:49:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:46.728 22:49:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:46.728 22:49:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:20:46.986 22:49:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:46.986 22:49:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:46.986 22:49:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:46.986 22:49:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:46.986 22:49:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:46.986 22:49:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:47.245 [2024-07-15 22:49:32.070524] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:47.245 [2024-07-15 22:49:32.070557] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:47.245 [2024-07-15 22:49:32.070610] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:47.245 22:49:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:20:47.245 22:49:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:20:47.245 22:49:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:47.245 22:49:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:20:47.245 22:49:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:20:47.245 22:49:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:20:47.245 22:49:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:47.245 22:49:32 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:20:47.245 22:49:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:47.245 22:49:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:47.245 22:49:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:47.245 22:49:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:47.245 22:49:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:47.245 22:49:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:47.245 22:49:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:47.245 22:49:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:47.245 22:49:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:47.504 22:49:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:47.504 "name": "Existed_Raid", 00:20:47.504 "uuid": "3817182f-1d23-4ac3-ae67-02643128bf11", 00:20:47.504 "strip_size_kb": 64, 00:20:47.504 "state": "offline", 00:20:47.504 "raid_level": "concat", 00:20:47.504 "superblock": true, 00:20:47.504 "num_base_bdevs": 4, 00:20:47.504 "num_base_bdevs_discovered": 3, 00:20:47.504 "num_base_bdevs_operational": 3, 00:20:47.504 "base_bdevs_list": [ 00:20:47.504 { 00:20:47.504 "name": null, 00:20:47.504 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:47.504 "is_configured": false, 00:20:47.504 "data_offset": 2048, 00:20:47.504 "data_size": 63488 00:20:47.504 }, 00:20:47.504 { 00:20:47.504 "name": "BaseBdev2", 00:20:47.504 "uuid": 
"ec20cf87-35e9-4ce9-a1cd-7be5b6013a06", 00:20:47.504 "is_configured": true, 00:20:47.504 "data_offset": 2048, 00:20:47.504 "data_size": 63488 00:20:47.504 }, 00:20:47.504 { 00:20:47.504 "name": "BaseBdev3", 00:20:47.504 "uuid": "98d4bd30-768b-4ee7-b7f5-6e8775b89047", 00:20:47.504 "is_configured": true, 00:20:47.504 "data_offset": 2048, 00:20:47.504 "data_size": 63488 00:20:47.504 }, 00:20:47.504 { 00:20:47.504 "name": "BaseBdev4", 00:20:47.504 "uuid": "f2d7c1de-ba8d-4f95-9db8-326892a45e3e", 00:20:47.504 "is_configured": true, 00:20:47.504 "data_offset": 2048, 00:20:47.504 "data_size": 63488 00:20:47.504 } 00:20:47.504 ] 00:20:47.504 }' 00:20:47.504 22:49:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:47.504 22:49:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:48.071 22:49:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:48.071 22:49:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:48.071 22:49:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.071 22:49:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:48.329 22:49:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:48.329 22:49:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:48.329 22:49:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:48.587 [2024-07-15 22:49:33.371910] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:48.587 22:49:33 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:48.587 22:49:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:48.587 22:49:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.587 22:49:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:48.844 22:49:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:48.845 22:49:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:48.845 22:49:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:49.102 [2024-07-15 22:49:33.924004] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:49.102 22:49:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:49.102 22:49:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:49.102 22:49:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:49.102 22:49:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:49.359 22:49:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:49.359 22:49:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:49.359 22:49:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:49.927 [2024-07-15 22:49:34.692719] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:49.927 [2024-07-15 22:49:34.692766] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25f1350 name Existed_Raid, state offline 00:20:49.927 22:49:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:49.927 22:49:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:49.927 22:49:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:49.927 22:49:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:50.187 22:49:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:50.187 22:49:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:50.187 22:49:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:50.187 22:49:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:50.187 22:49:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:50.187 22:49:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:50.446 BaseBdev2 00:20:50.446 22:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:20:50.446 22:49:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:50.446 22:49:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:50.446 22:49:35 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:50.446 22:49:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:50.446 22:49:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:50.446 22:49:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:51.013 22:49:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:51.272 [ 00:20:51.272 { 00:20:51.272 "name": "BaseBdev2", 00:20:51.272 "aliases": [ 00:20:51.272 "19aa4243-0dc3-44ea-b1ce-f2726221012b" 00:20:51.272 ], 00:20:51.272 "product_name": "Malloc disk", 00:20:51.272 "block_size": 512, 00:20:51.272 "num_blocks": 65536, 00:20:51.272 "uuid": "19aa4243-0dc3-44ea-b1ce-f2726221012b", 00:20:51.272 "assigned_rate_limits": { 00:20:51.272 "rw_ios_per_sec": 0, 00:20:51.272 "rw_mbytes_per_sec": 0, 00:20:51.272 "r_mbytes_per_sec": 0, 00:20:51.272 "w_mbytes_per_sec": 0 00:20:51.272 }, 00:20:51.272 "claimed": false, 00:20:51.272 "zoned": false, 00:20:51.272 "supported_io_types": { 00:20:51.272 "read": true, 00:20:51.272 "write": true, 00:20:51.272 "unmap": true, 00:20:51.272 "flush": true, 00:20:51.272 "reset": true, 00:20:51.272 "nvme_admin": false, 00:20:51.272 "nvme_io": false, 00:20:51.272 "nvme_io_md": false, 00:20:51.272 "write_zeroes": true, 00:20:51.272 "zcopy": true, 00:20:51.272 "get_zone_info": false, 00:20:51.272 "zone_management": false, 00:20:51.272 "zone_append": false, 00:20:51.272 "compare": false, 00:20:51.272 "compare_and_write": false, 00:20:51.272 "abort": true, 00:20:51.272 "seek_hole": false, 00:20:51.272 "seek_data": false, 00:20:51.272 "copy": true, 00:20:51.272 "nvme_iov_md": 
false 00:20:51.272 }, 00:20:51.272 "memory_domains": [ 00:20:51.272 { 00:20:51.272 "dma_device_id": "system", 00:20:51.272 "dma_device_type": 1 00:20:51.272 }, 00:20:51.272 { 00:20:51.272 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:51.272 "dma_device_type": 2 00:20:51.272 } 00:20:51.272 ], 00:20:51.272 "driver_specific": {} 00:20:51.272 } 00:20:51.272 ] 00:20:51.272 22:49:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:51.272 22:49:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:51.272 22:49:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:51.272 22:49:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:51.532 BaseBdev3 00:20:51.532 22:49:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:51.532 22:49:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:51.532 22:49:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:51.532 22:49:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:51.532 22:49:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:51.532 22:49:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:51.532 22:49:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:51.791 22:49:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:52.051 [ 00:20:52.051 { 00:20:52.051 "name": "BaseBdev3", 00:20:52.051 "aliases": [ 00:20:52.051 "b2bc2d98-f0cc-4ebe-9250-747e749db0fd" 00:20:52.051 ], 00:20:52.051 "product_name": "Malloc disk", 00:20:52.051 "block_size": 512, 00:20:52.051 "num_blocks": 65536, 00:20:52.051 "uuid": "b2bc2d98-f0cc-4ebe-9250-747e749db0fd", 00:20:52.051 "assigned_rate_limits": { 00:20:52.051 "rw_ios_per_sec": 0, 00:20:52.051 "rw_mbytes_per_sec": 0, 00:20:52.051 "r_mbytes_per_sec": 0, 00:20:52.051 "w_mbytes_per_sec": 0 00:20:52.051 }, 00:20:52.051 "claimed": false, 00:20:52.051 "zoned": false, 00:20:52.051 "supported_io_types": { 00:20:52.051 "read": true, 00:20:52.051 "write": true, 00:20:52.051 "unmap": true, 00:20:52.051 "flush": true, 00:20:52.051 "reset": true, 00:20:52.051 "nvme_admin": false, 00:20:52.051 "nvme_io": false, 00:20:52.051 "nvme_io_md": false, 00:20:52.051 "write_zeroes": true, 00:20:52.051 "zcopy": true, 00:20:52.051 "get_zone_info": false, 00:20:52.051 "zone_management": false, 00:20:52.051 "zone_append": false, 00:20:52.051 "compare": false, 00:20:52.051 "compare_and_write": false, 00:20:52.051 "abort": true, 00:20:52.051 "seek_hole": false, 00:20:52.051 "seek_data": false, 00:20:52.051 "copy": true, 00:20:52.051 "nvme_iov_md": false 00:20:52.051 }, 00:20:52.051 "memory_domains": [ 00:20:52.051 { 00:20:52.051 "dma_device_id": "system", 00:20:52.051 "dma_device_type": 1 00:20:52.051 }, 00:20:52.051 { 00:20:52.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:52.051 "dma_device_type": 2 00:20:52.051 } 00:20:52.051 ], 00:20:52.051 "driver_specific": {} 00:20:52.051 } 00:20:52.051 ] 00:20:52.051 22:49:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:52.051 22:49:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:52.051 22:49:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 
00:20:52.051 22:49:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:52.310 BaseBdev4 00:20:52.310 22:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:20:52.310 22:49:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:52.310 22:49:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:52.310 22:49:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:52.310 22:49:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:52.310 22:49:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:52.310 22:49:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:52.569 22:49:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:52.829 [ 00:20:52.829 { 00:20:52.829 "name": "BaseBdev4", 00:20:52.829 "aliases": [ 00:20:52.829 "9c6fc37b-20b2-4a74-9695-eb3bfdc40a35" 00:20:52.829 ], 00:20:52.829 "product_name": "Malloc disk", 00:20:52.829 "block_size": 512, 00:20:52.829 "num_blocks": 65536, 00:20:52.829 "uuid": "9c6fc37b-20b2-4a74-9695-eb3bfdc40a35", 00:20:52.829 "assigned_rate_limits": { 00:20:52.829 "rw_ios_per_sec": 0, 00:20:52.829 "rw_mbytes_per_sec": 0, 00:20:52.829 "r_mbytes_per_sec": 0, 00:20:52.829 "w_mbytes_per_sec": 0 00:20:52.829 }, 00:20:52.829 "claimed": false, 00:20:52.829 "zoned": false, 00:20:52.829 "supported_io_types": { 00:20:52.829 
"read": true, 00:20:52.829 "write": true, 00:20:52.829 "unmap": true, 00:20:52.829 "flush": true, 00:20:52.829 "reset": true, 00:20:52.829 "nvme_admin": false, 00:20:52.829 "nvme_io": false, 00:20:52.829 "nvme_io_md": false, 00:20:52.829 "write_zeroes": true, 00:20:52.829 "zcopy": true, 00:20:52.829 "get_zone_info": false, 00:20:52.829 "zone_management": false, 00:20:52.829 "zone_append": false, 00:20:52.829 "compare": false, 00:20:52.829 "compare_and_write": false, 00:20:52.829 "abort": true, 00:20:52.829 "seek_hole": false, 00:20:52.829 "seek_data": false, 00:20:52.829 "copy": true, 00:20:52.829 "nvme_iov_md": false 00:20:52.829 }, 00:20:52.829 "memory_domains": [ 00:20:52.829 { 00:20:52.829 "dma_device_id": "system", 00:20:52.829 "dma_device_type": 1 00:20:52.829 }, 00:20:52.829 { 00:20:52.829 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:52.829 "dma_device_type": 2 00:20:52.829 } 00:20:52.829 ], 00:20:52.829 "driver_specific": {} 00:20:52.829 } 00:20:52.829 ] 00:20:52.829 22:49:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:52.829 22:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:52.829 22:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:52.829 22:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:53.089 [2024-07-15 22:49:37.822483] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:53.089 [2024-07-15 22:49:37.822536] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:53.089 [2024-07-15 22:49:37.822558] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:53.089 [2024-07-15 
22:49:37.823945] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:53.089 [2024-07-15 22:49:37.823990] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:53.089 22:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:53.089 22:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:53.089 22:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:53.089 22:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:53.089 22:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:53.089 22:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:53.089 22:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:53.089 22:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:53.089 22:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:53.089 22:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:53.089 22:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:53.089 22:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:53.349 22:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:53.349 "name": "Existed_Raid", 00:20:53.349 "uuid": "a14fd8b0-483b-4b2a-b754-1873a79aac5d", 00:20:53.349 "strip_size_kb": 64, 
00:20:53.349 "state": "configuring", 00:20:53.349 "raid_level": "concat", 00:20:53.349 "superblock": true, 00:20:53.349 "num_base_bdevs": 4, 00:20:53.349 "num_base_bdevs_discovered": 3, 00:20:53.349 "num_base_bdevs_operational": 4, 00:20:53.349 "base_bdevs_list": [ 00:20:53.349 { 00:20:53.349 "name": "BaseBdev1", 00:20:53.349 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:53.349 "is_configured": false, 00:20:53.349 "data_offset": 0, 00:20:53.349 "data_size": 0 00:20:53.349 }, 00:20:53.349 { 00:20:53.349 "name": "BaseBdev2", 00:20:53.349 "uuid": "19aa4243-0dc3-44ea-b1ce-f2726221012b", 00:20:53.349 "is_configured": true, 00:20:53.349 "data_offset": 2048, 00:20:53.349 "data_size": 63488 00:20:53.349 }, 00:20:53.349 { 00:20:53.349 "name": "BaseBdev3", 00:20:53.349 "uuid": "b2bc2d98-f0cc-4ebe-9250-747e749db0fd", 00:20:53.349 "is_configured": true, 00:20:53.349 "data_offset": 2048, 00:20:53.349 "data_size": 63488 00:20:53.349 }, 00:20:53.349 { 00:20:53.349 "name": "BaseBdev4", 00:20:53.349 "uuid": "9c6fc37b-20b2-4a74-9695-eb3bfdc40a35", 00:20:53.349 "is_configured": true, 00:20:53.349 "data_offset": 2048, 00:20:53.349 "data_size": 63488 00:20:53.349 } 00:20:53.349 ] 00:20:53.349 }' 00:20:53.349 22:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:53.349 22:49:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:54.285 22:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:54.544 [2024-07-15 22:49:39.426712] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:54.544 22:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:54.544 22:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:20:54.544 22:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:54.544 22:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:54.544 22:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:54.544 22:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:54.544 22:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:54.544 22:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:54.544 22:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:54.544 22:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:54.544 22:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:54.802 22:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:54.802 22:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:54.802 "name": "Existed_Raid", 00:20:54.802 "uuid": "a14fd8b0-483b-4b2a-b754-1873a79aac5d", 00:20:54.802 "strip_size_kb": 64, 00:20:54.802 "state": "configuring", 00:20:54.802 "raid_level": "concat", 00:20:54.802 "superblock": true, 00:20:54.802 "num_base_bdevs": 4, 00:20:54.802 "num_base_bdevs_discovered": 2, 00:20:54.802 "num_base_bdevs_operational": 4, 00:20:54.802 "base_bdevs_list": [ 00:20:54.802 { 00:20:54.802 "name": "BaseBdev1", 00:20:54.802 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:54.802 "is_configured": false, 00:20:54.802 "data_offset": 0, 00:20:54.802 "data_size": 0 
00:20:54.802 }, 00:20:54.802 { 00:20:54.802 "name": null, 00:20:54.802 "uuid": "19aa4243-0dc3-44ea-b1ce-f2726221012b", 00:20:54.802 "is_configured": false, 00:20:54.802 "data_offset": 2048, 00:20:54.802 "data_size": 63488 00:20:54.802 }, 00:20:54.802 { 00:20:54.802 "name": "BaseBdev3", 00:20:54.802 "uuid": "b2bc2d98-f0cc-4ebe-9250-747e749db0fd", 00:20:54.802 "is_configured": true, 00:20:54.802 "data_offset": 2048, 00:20:54.802 "data_size": 63488 00:20:54.802 }, 00:20:54.802 { 00:20:54.802 "name": "BaseBdev4", 00:20:54.802 "uuid": "9c6fc37b-20b2-4a74-9695-eb3bfdc40a35", 00:20:54.802 "is_configured": true, 00:20:54.802 "data_offset": 2048, 00:20:54.802 "data_size": 63488 00:20:54.802 } 00:20:54.802 ] 00:20:54.802 }' 00:20:54.802 22:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:54.802 22:49:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:55.738 22:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.738 22:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:55.997 22:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:55.997 22:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:56.257 [2024-07-15 22:49:41.062603] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:56.257 BaseBdev1 00:20:56.257 22:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:56.257 22:49:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 
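The trace above repeatedly runs `verify_raid_bdev_state Existed_Raid configuring concat 64 4`, which fetches the raid bdev via `rpc.py bdev_raid_get_bdevs all`, filters it with `jq -r '.[] | select(.name == "Existed_Raid")'`, and compares the fields against the expected values. A minimal Python sketch of the equivalent checks, using the JSON captured in the log just above (after BaseBdev2 was removed); the helper name `check_raid_state` is illustrative only and is not part of SPDK:

```python
# Illustrative re-implementation of the field checks that the shell helper
# verify_raid_bdev_state performs on the jq-filtered bdev_raid_get_bdevs output.
def check_raid_state(info, expected_state, raid_level, strip_size_kb, num_operational):
    assert info["state"] == expected_state
    assert info["raid_level"] == raid_level
    assert info["strip_size_kb"] == strip_size_kb
    assert info["num_base_bdevs_operational"] == num_operational
    # "discovered" counts the base bdev slots that are currently configured;
    # removed base bdevs keep their slot (name becomes null, is_configured false).
    discovered = sum(1 for b in info["base_bdevs_list"] if b["is_configured"])
    assert discovered == info["num_base_bdevs_discovered"]
    return discovered

# raid_bdev_info as captured in the log above: BaseBdev2 has been removed
# (its slot's name is null) and BaseBdev1 has not been created yet, so only
# BaseBdev3 and BaseBdev4 are configured.
raid_bdev_info = {
    "name": "Existed_Raid",
    "uuid": "a14fd8b0-483b-4b2a-b754-1873a79aac5d",
    "strip_size_kb": 64,
    "state": "configuring",
    "raid_level": "concat",
    "superblock": True,
    "num_base_bdevs": 4,
    "num_base_bdevs_discovered": 2,
    "num_base_bdevs_operational": 4,
    "base_bdevs_list": [
        {"name": "BaseBdev1", "uuid": "00000000-0000-0000-0000-000000000000",
         "is_configured": False, "data_offset": 0, "data_size": 0},
        {"name": None, "uuid": "19aa4243-0dc3-44ea-b1ce-f2726221012b",
         "is_configured": False, "data_offset": 2048, "data_size": 63488},
        {"name": "BaseBdev3", "uuid": "b2bc2d98-f0cc-4ebe-9250-747e749db0fd",
         "is_configured": True, "data_offset": 2048, "data_size": 63488},
        {"name": "BaseBdev4", "uuid": "9c6fc37b-20b2-4a74-9695-eb3bfdc40a35",
         "is_configured": True, "data_offset": 2048, "data_size": 63488},
    ],
}

discovered = check_raid_state(raid_bdev_info, "configuring", "concat", 64, 4)
```

The raid stays in the `configuring` state throughout because the array only transitions to `online` once all four operational base bdevs are configured; each remove/re-add cycle in the trace drops `num_base_bdevs_discovered` and the test re-verifies the counters.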
00:20:56.257 22:49:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:56.257 22:49:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:56.257 22:49:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:56.257 22:49:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:56.257 22:49:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:56.516 22:49:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:56.775 [ 00:20:56.775 { 00:20:56.775 "name": "BaseBdev1", 00:20:56.775 "aliases": [ 00:20:56.775 "24d05e66-9fdd-46d1-af32-bee96216ad97" 00:20:56.775 ], 00:20:56.775 "product_name": "Malloc disk", 00:20:56.775 "block_size": 512, 00:20:56.775 "num_blocks": 65536, 00:20:56.775 "uuid": "24d05e66-9fdd-46d1-af32-bee96216ad97", 00:20:56.775 "assigned_rate_limits": { 00:20:56.775 "rw_ios_per_sec": 0, 00:20:56.775 "rw_mbytes_per_sec": 0, 00:20:56.775 "r_mbytes_per_sec": 0, 00:20:56.775 "w_mbytes_per_sec": 0 00:20:56.775 }, 00:20:56.775 "claimed": true, 00:20:56.775 "claim_type": "exclusive_write", 00:20:56.775 "zoned": false, 00:20:56.775 "supported_io_types": { 00:20:56.775 "read": true, 00:20:56.775 "write": true, 00:20:56.775 "unmap": true, 00:20:56.775 "flush": true, 00:20:56.775 "reset": true, 00:20:56.775 "nvme_admin": false, 00:20:56.775 "nvme_io": false, 00:20:56.775 "nvme_io_md": false, 00:20:56.775 "write_zeroes": true, 00:20:56.775 "zcopy": true, 00:20:56.775 "get_zone_info": false, 00:20:56.775 "zone_management": false, 00:20:56.775 "zone_append": false, 00:20:56.775 "compare": false, 
00:20:56.775 "compare_and_write": false, 00:20:56.775 "abort": true, 00:20:56.775 "seek_hole": false, 00:20:56.775 "seek_data": false, 00:20:56.775 "copy": true, 00:20:56.775 "nvme_iov_md": false 00:20:56.775 }, 00:20:56.775 "memory_domains": [ 00:20:56.775 { 00:20:56.775 "dma_device_id": "system", 00:20:56.775 "dma_device_type": 1 00:20:56.775 }, 00:20:56.775 { 00:20:56.775 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:56.775 "dma_device_type": 2 00:20:56.775 } 00:20:56.775 ], 00:20:56.775 "driver_specific": {} 00:20:56.775 } 00:20:56.775 ] 00:20:56.775 22:49:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:56.775 22:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:56.775 22:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:56.775 22:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:56.775 22:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:56.775 22:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:56.775 22:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:56.775 22:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:56.775 22:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:56.775 22:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:56.775 22:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:56.775 22:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:56.775 22:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:57.342 22:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:57.342 "name": "Existed_Raid", 00:20:57.342 "uuid": "a14fd8b0-483b-4b2a-b754-1873a79aac5d", 00:20:57.342 "strip_size_kb": 64, 00:20:57.342 "state": "configuring", 00:20:57.342 "raid_level": "concat", 00:20:57.342 "superblock": true, 00:20:57.342 "num_base_bdevs": 4, 00:20:57.342 "num_base_bdevs_discovered": 3, 00:20:57.342 "num_base_bdevs_operational": 4, 00:20:57.342 "base_bdevs_list": [ 00:20:57.342 { 00:20:57.342 "name": "BaseBdev1", 00:20:57.343 "uuid": "24d05e66-9fdd-46d1-af32-bee96216ad97", 00:20:57.343 "is_configured": true, 00:20:57.343 "data_offset": 2048, 00:20:57.343 "data_size": 63488 00:20:57.343 }, 00:20:57.343 { 00:20:57.343 "name": null, 00:20:57.343 "uuid": "19aa4243-0dc3-44ea-b1ce-f2726221012b", 00:20:57.343 "is_configured": false, 00:20:57.343 "data_offset": 2048, 00:20:57.343 "data_size": 63488 00:20:57.343 }, 00:20:57.343 { 00:20:57.343 "name": "BaseBdev3", 00:20:57.343 "uuid": "b2bc2d98-f0cc-4ebe-9250-747e749db0fd", 00:20:57.343 "is_configured": true, 00:20:57.343 "data_offset": 2048, 00:20:57.343 "data_size": 63488 00:20:57.343 }, 00:20:57.343 { 00:20:57.343 "name": "BaseBdev4", 00:20:57.343 "uuid": "9c6fc37b-20b2-4a74-9695-eb3bfdc40a35", 00:20:57.343 "is_configured": true, 00:20:57.343 "data_offset": 2048, 00:20:57.343 "data_size": 63488 00:20:57.343 } 00:20:57.343 ] 00:20:57.343 }' 00:20:57.343 22:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:57.343 22:49:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:57.910 22:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:57.910 22:49:42 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.168 22:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:58.168 22:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:58.736 [2024-07-15 22:49:43.436920] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:58.736 22:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:58.736 22:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:58.736 22:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:58.736 22:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:58.736 22:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:58.736 22:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:58.736 22:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:58.736 22:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:58.736 22:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:58.736 22:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:58.736 22:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:20:58.736 22:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:59.304 22:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:59.304 "name": "Existed_Raid", 00:20:59.304 "uuid": "a14fd8b0-483b-4b2a-b754-1873a79aac5d", 00:20:59.304 "strip_size_kb": 64, 00:20:59.304 "state": "configuring", 00:20:59.304 "raid_level": "concat", 00:20:59.304 "superblock": true, 00:20:59.304 "num_base_bdevs": 4, 00:20:59.304 "num_base_bdevs_discovered": 2, 00:20:59.304 "num_base_bdevs_operational": 4, 00:20:59.304 "base_bdevs_list": [ 00:20:59.304 { 00:20:59.304 "name": "BaseBdev1", 00:20:59.304 "uuid": "24d05e66-9fdd-46d1-af32-bee96216ad97", 00:20:59.304 "is_configured": true, 00:20:59.304 "data_offset": 2048, 00:20:59.304 "data_size": 63488 00:20:59.304 }, 00:20:59.304 { 00:20:59.304 "name": null, 00:20:59.304 "uuid": "19aa4243-0dc3-44ea-b1ce-f2726221012b", 00:20:59.304 "is_configured": false, 00:20:59.304 "data_offset": 2048, 00:20:59.304 "data_size": 63488 00:20:59.304 }, 00:20:59.304 { 00:20:59.304 "name": null, 00:20:59.304 "uuid": "b2bc2d98-f0cc-4ebe-9250-747e749db0fd", 00:20:59.304 "is_configured": false, 00:20:59.304 "data_offset": 2048, 00:20:59.304 "data_size": 63488 00:20:59.304 }, 00:20:59.304 { 00:20:59.304 "name": "BaseBdev4", 00:20:59.304 "uuid": "9c6fc37b-20b2-4a74-9695-eb3bfdc40a35", 00:20:59.304 "is_configured": true, 00:20:59.304 "data_offset": 2048, 00:20:59.304 "data_size": 63488 00:20:59.304 } 00:20:59.304 ] 00:20:59.304 }' 00:20:59.304 22:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:59.304 22:49:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:59.871 22:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:20:59.871 22:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:00.129 22:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:21:00.129 22:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:21:00.694 [2024-07-15 22:49:45.374105] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:00.694 22:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:00.694 22:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:00.694 22:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:00.694 22:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:00.694 22:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:00.694 22:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:00.694 22:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:00.694 22:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:00.694 22:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:00.694 22:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:00.694 22:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:00.694 22:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:00.952 22:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:00.952 "name": "Existed_Raid", 00:21:00.952 "uuid": "a14fd8b0-483b-4b2a-b754-1873a79aac5d", 00:21:00.952 "strip_size_kb": 64, 00:21:00.952 "state": "configuring", 00:21:00.952 "raid_level": "concat", 00:21:00.952 "superblock": true, 00:21:00.952 "num_base_bdevs": 4, 00:21:00.952 "num_base_bdevs_discovered": 3, 00:21:00.952 "num_base_bdevs_operational": 4, 00:21:00.952 "base_bdevs_list": [ 00:21:00.952 { 00:21:00.952 "name": "BaseBdev1", 00:21:00.952 "uuid": "24d05e66-9fdd-46d1-af32-bee96216ad97", 00:21:00.952 "is_configured": true, 00:21:00.952 "data_offset": 2048, 00:21:00.952 "data_size": 63488 00:21:00.952 }, 00:21:00.952 { 00:21:00.952 "name": null, 00:21:00.952 "uuid": "19aa4243-0dc3-44ea-b1ce-f2726221012b", 00:21:00.952 "is_configured": false, 00:21:00.952 "data_offset": 2048, 00:21:00.953 "data_size": 63488 00:21:00.953 }, 00:21:00.953 { 00:21:00.953 "name": "BaseBdev3", 00:21:00.953 "uuid": "b2bc2d98-f0cc-4ebe-9250-747e749db0fd", 00:21:00.953 "is_configured": true, 00:21:00.953 "data_offset": 2048, 00:21:00.953 "data_size": 63488 00:21:00.953 }, 00:21:00.953 { 00:21:00.953 "name": "BaseBdev4", 00:21:00.953 "uuid": "9c6fc37b-20b2-4a74-9695-eb3bfdc40a35", 00:21:00.953 "is_configured": true, 00:21:00.953 "data_offset": 2048, 00:21:00.953 "data_size": 63488 00:21:00.953 } 00:21:00.953 ] 00:21:00.953 }' 00:21:00.953 22:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:00.953 22:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:01.520 22:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:01.520 22:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:01.799 22:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:21:01.799 22:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:02.363 [2024-07-15 22:49:47.018480] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:02.363 22:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:02.363 22:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:02.363 22:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:02.363 22:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:02.363 22:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:02.363 22:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:02.363 22:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:02.363 22:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:02.363 22:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:02.364 22:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:02.364 22:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:02.364 
22:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:02.929 22:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:02.929 "name": "Existed_Raid", 00:21:02.929 "uuid": "a14fd8b0-483b-4b2a-b754-1873a79aac5d", 00:21:02.929 "strip_size_kb": 64, 00:21:02.929 "state": "configuring", 00:21:02.929 "raid_level": "concat", 00:21:02.929 "superblock": true, 00:21:02.929 "num_base_bdevs": 4, 00:21:02.929 "num_base_bdevs_discovered": 2, 00:21:02.929 "num_base_bdevs_operational": 4, 00:21:02.929 "base_bdevs_list": [ 00:21:02.929 { 00:21:02.929 "name": null, 00:21:02.929 "uuid": "24d05e66-9fdd-46d1-af32-bee96216ad97", 00:21:02.929 "is_configured": false, 00:21:02.929 "data_offset": 2048, 00:21:02.929 "data_size": 63488 00:21:02.929 }, 00:21:02.929 { 00:21:02.929 "name": null, 00:21:02.929 "uuid": "19aa4243-0dc3-44ea-b1ce-f2726221012b", 00:21:02.929 "is_configured": false, 00:21:02.929 "data_offset": 2048, 00:21:02.929 "data_size": 63488 00:21:02.929 }, 00:21:02.929 { 00:21:02.929 "name": "BaseBdev3", 00:21:02.929 "uuid": "b2bc2d98-f0cc-4ebe-9250-747e749db0fd", 00:21:02.929 "is_configured": true, 00:21:02.929 "data_offset": 2048, 00:21:02.929 "data_size": 63488 00:21:02.929 }, 00:21:02.929 { 00:21:02.929 "name": "BaseBdev4", 00:21:02.929 "uuid": "9c6fc37b-20b2-4a74-9695-eb3bfdc40a35", 00:21:02.929 "is_configured": true, 00:21:02.929 "data_offset": 2048, 00:21:02.929 "data_size": 63488 00:21:02.929 } 00:21:02.929 ] 00:21:02.929 }' 00:21:02.929 22:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:02.929 22:49:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:03.865 22:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.865 22:49:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:03.865 22:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:21:03.865 22:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:21:04.433 [2024-07-15 22:49:49.195121] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:04.433 22:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:04.433 22:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:04.433 22:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:04.433 22:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:04.433 22:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:04.433 22:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:04.433 22:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:04.433 22:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:04.433 22:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:04.433 22:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:04.433 22:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:04.433 22:49:49 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:04.693 22:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:04.693 "name": "Existed_Raid", 00:21:04.693 "uuid": "a14fd8b0-483b-4b2a-b754-1873a79aac5d", 00:21:04.693 "strip_size_kb": 64, 00:21:04.693 "state": "configuring", 00:21:04.693 "raid_level": "concat", 00:21:04.693 "superblock": true, 00:21:04.693 "num_base_bdevs": 4, 00:21:04.693 "num_base_bdevs_discovered": 3, 00:21:04.693 "num_base_bdevs_operational": 4, 00:21:04.693 "base_bdevs_list": [ 00:21:04.693 { 00:21:04.693 "name": null, 00:21:04.693 "uuid": "24d05e66-9fdd-46d1-af32-bee96216ad97", 00:21:04.693 "is_configured": false, 00:21:04.693 "data_offset": 2048, 00:21:04.693 "data_size": 63488 00:21:04.693 }, 00:21:04.693 { 00:21:04.693 "name": "BaseBdev2", 00:21:04.693 "uuid": "19aa4243-0dc3-44ea-b1ce-f2726221012b", 00:21:04.693 "is_configured": true, 00:21:04.693 "data_offset": 2048, 00:21:04.693 "data_size": 63488 00:21:04.693 }, 00:21:04.693 { 00:21:04.693 "name": "BaseBdev3", 00:21:04.693 "uuid": "b2bc2d98-f0cc-4ebe-9250-747e749db0fd", 00:21:04.693 "is_configured": true, 00:21:04.693 "data_offset": 2048, 00:21:04.693 "data_size": 63488 00:21:04.693 }, 00:21:04.693 { 00:21:04.693 "name": "BaseBdev4", 00:21:04.693 "uuid": "9c6fc37b-20b2-4a74-9695-eb3bfdc40a35", 00:21:04.693 "is_configured": true, 00:21:04.693 "data_offset": 2048, 00:21:04.693 "data_size": 63488 00:21:04.693 } 00:21:04.693 ] 00:21:04.693 }' 00:21:04.693 22:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:04.693 22:49:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:05.262 22:49:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.262 22:49:50 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:05.521 22:49:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:21:05.521 22:49:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.521 22:49:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:21:05.781 22:49:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 24d05e66-9fdd-46d1-af32-bee96216ad97 00:21:06.041 [2024-07-15 22:49:50.747493] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:21:06.041 [2024-07-15 22:49:50.747669] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25f3850 00:21:06.041 [2024-07-15 22:49:50.747682] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:06.041 [2024-07-15 22:49:50.747860] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25e9d80 00:21:06.041 [2024-07-15 22:49:50.747985] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25f3850 00:21:06.041 [2024-07-15 22:49:50.747995] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x25f3850 00:21:06.041 [2024-07-15 22:49:50.748085] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:06.041 NewBaseBdev 00:21:06.041 22:49:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:21:06.041 22:49:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:21:06.041 22:49:50 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:06.041 22:49:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:06.041 22:49:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:06.041 22:49:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:06.041 22:49:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:06.300 22:49:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:21:06.300 [ 00:21:06.300 { 00:21:06.300 "name": "NewBaseBdev", 00:21:06.300 "aliases": [ 00:21:06.300 "24d05e66-9fdd-46d1-af32-bee96216ad97" 00:21:06.300 ], 00:21:06.300 "product_name": "Malloc disk", 00:21:06.300 "block_size": 512, 00:21:06.300 "num_blocks": 65536, 00:21:06.300 "uuid": "24d05e66-9fdd-46d1-af32-bee96216ad97", 00:21:06.300 "assigned_rate_limits": { 00:21:06.300 "rw_ios_per_sec": 0, 00:21:06.300 "rw_mbytes_per_sec": 0, 00:21:06.300 "r_mbytes_per_sec": 0, 00:21:06.300 "w_mbytes_per_sec": 0 00:21:06.300 }, 00:21:06.300 "claimed": true, 00:21:06.300 "claim_type": "exclusive_write", 00:21:06.300 "zoned": false, 00:21:06.300 "supported_io_types": { 00:21:06.300 "read": true, 00:21:06.300 "write": true, 00:21:06.300 "unmap": true, 00:21:06.300 "flush": true, 00:21:06.300 "reset": true, 00:21:06.300 "nvme_admin": false, 00:21:06.300 "nvme_io": false, 00:21:06.300 "nvme_io_md": false, 00:21:06.300 "write_zeroes": true, 00:21:06.300 "zcopy": true, 00:21:06.300 "get_zone_info": false, 00:21:06.300 "zone_management": false, 00:21:06.300 "zone_append": false, 00:21:06.300 "compare": false, 00:21:06.300 
"compare_and_write": false, 00:21:06.300 "abort": true, 00:21:06.300 "seek_hole": false, 00:21:06.300 "seek_data": false, 00:21:06.300 "copy": true, 00:21:06.300 "nvme_iov_md": false 00:21:06.300 }, 00:21:06.300 "memory_domains": [ 00:21:06.300 { 00:21:06.300 "dma_device_id": "system", 00:21:06.300 "dma_device_type": 1 00:21:06.300 }, 00:21:06.300 { 00:21:06.300 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:06.300 "dma_device_type": 2 00:21:06.300 } 00:21:06.300 ], 00:21:06.300 "driver_specific": {} 00:21:06.300 } 00:21:06.300 ] 00:21:06.300 22:49:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:06.300 22:49:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:21:06.300 22:49:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:06.300 22:49:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:06.300 22:49:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:06.300 22:49:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:06.300 22:49:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:06.300 22:49:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:06.300 22:49:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:06.300 22:49:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:06.300 22:49:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:06.300 22:49:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:06.300 22:49:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:06.574 22:49:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:06.574 "name": "Existed_Raid", 00:21:06.574 "uuid": "a14fd8b0-483b-4b2a-b754-1873a79aac5d", 00:21:06.574 "strip_size_kb": 64, 00:21:06.574 "state": "online", 00:21:06.574 "raid_level": "concat", 00:21:06.574 "superblock": true, 00:21:06.574 "num_base_bdevs": 4, 00:21:06.574 "num_base_bdevs_discovered": 4, 00:21:06.574 "num_base_bdevs_operational": 4, 00:21:06.574 "base_bdevs_list": [ 00:21:06.574 { 00:21:06.574 "name": "NewBaseBdev", 00:21:06.574 "uuid": "24d05e66-9fdd-46d1-af32-bee96216ad97", 00:21:06.574 "is_configured": true, 00:21:06.574 "data_offset": 2048, 00:21:06.574 "data_size": 63488 00:21:06.574 }, 00:21:06.574 { 00:21:06.574 "name": "BaseBdev2", 00:21:06.574 "uuid": "19aa4243-0dc3-44ea-b1ce-f2726221012b", 00:21:06.574 "is_configured": true, 00:21:06.574 "data_offset": 2048, 00:21:06.574 "data_size": 63488 00:21:06.574 }, 00:21:06.574 { 00:21:06.574 "name": "BaseBdev3", 00:21:06.574 "uuid": "b2bc2d98-f0cc-4ebe-9250-747e749db0fd", 00:21:06.574 "is_configured": true, 00:21:06.574 "data_offset": 2048, 00:21:06.574 "data_size": 63488 00:21:06.574 }, 00:21:06.574 { 00:21:06.574 "name": "BaseBdev4", 00:21:06.574 "uuid": "9c6fc37b-20b2-4a74-9695-eb3bfdc40a35", 00:21:06.574 "is_configured": true, 00:21:06.574 "data_offset": 2048, 00:21:06.574 "data_size": 63488 00:21:06.574 } 00:21:06.574 ] 00:21:06.574 }' 00:21:06.574 22:49:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:06.574 22:49:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:07.142 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:21:07.142 22:49:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:07.142 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:07.142 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:07.142 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:07.142 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:07.142 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:07.142 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:07.402 [2024-07-15 22:49:52.167605] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:07.402 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:07.402 "name": "Existed_Raid", 00:21:07.402 "aliases": [ 00:21:07.402 "a14fd8b0-483b-4b2a-b754-1873a79aac5d" 00:21:07.402 ], 00:21:07.402 "product_name": "Raid Volume", 00:21:07.402 "block_size": 512, 00:21:07.402 "num_blocks": 253952, 00:21:07.402 "uuid": "a14fd8b0-483b-4b2a-b754-1873a79aac5d", 00:21:07.402 "assigned_rate_limits": { 00:21:07.402 "rw_ios_per_sec": 0, 00:21:07.402 "rw_mbytes_per_sec": 0, 00:21:07.402 "r_mbytes_per_sec": 0, 00:21:07.402 "w_mbytes_per_sec": 0 00:21:07.402 }, 00:21:07.402 "claimed": false, 00:21:07.402 "zoned": false, 00:21:07.402 "supported_io_types": { 00:21:07.402 "read": true, 00:21:07.402 "write": true, 00:21:07.402 "unmap": true, 00:21:07.402 "flush": true, 00:21:07.402 "reset": true, 00:21:07.402 "nvme_admin": false, 00:21:07.402 "nvme_io": false, 00:21:07.402 "nvme_io_md": false, 00:21:07.402 "write_zeroes": true, 00:21:07.402 "zcopy": false, 00:21:07.402 
"get_zone_info": false, 00:21:07.402 "zone_management": false, 00:21:07.402 "zone_append": false, 00:21:07.402 "compare": false, 00:21:07.402 "compare_and_write": false, 00:21:07.402 "abort": false, 00:21:07.402 "seek_hole": false, 00:21:07.402 "seek_data": false, 00:21:07.402 "copy": false, 00:21:07.402 "nvme_iov_md": false 00:21:07.402 }, 00:21:07.402 "memory_domains": [ 00:21:07.402 { 00:21:07.402 "dma_device_id": "system", 00:21:07.402 "dma_device_type": 1 00:21:07.402 }, 00:21:07.402 { 00:21:07.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:07.402 "dma_device_type": 2 00:21:07.402 }, 00:21:07.402 { 00:21:07.402 "dma_device_id": "system", 00:21:07.402 "dma_device_type": 1 00:21:07.402 }, 00:21:07.402 { 00:21:07.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:07.402 "dma_device_type": 2 00:21:07.402 }, 00:21:07.402 { 00:21:07.402 "dma_device_id": "system", 00:21:07.402 "dma_device_type": 1 00:21:07.402 }, 00:21:07.402 { 00:21:07.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:07.402 "dma_device_type": 2 00:21:07.402 }, 00:21:07.402 { 00:21:07.402 "dma_device_id": "system", 00:21:07.402 "dma_device_type": 1 00:21:07.402 }, 00:21:07.402 { 00:21:07.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:07.402 "dma_device_type": 2 00:21:07.402 } 00:21:07.402 ], 00:21:07.402 "driver_specific": { 00:21:07.402 "raid": { 00:21:07.402 "uuid": "a14fd8b0-483b-4b2a-b754-1873a79aac5d", 00:21:07.402 "strip_size_kb": 64, 00:21:07.402 "state": "online", 00:21:07.402 "raid_level": "concat", 00:21:07.402 "superblock": true, 00:21:07.402 "num_base_bdevs": 4, 00:21:07.402 "num_base_bdevs_discovered": 4, 00:21:07.402 "num_base_bdevs_operational": 4, 00:21:07.402 "base_bdevs_list": [ 00:21:07.402 { 00:21:07.402 "name": "NewBaseBdev", 00:21:07.402 "uuid": "24d05e66-9fdd-46d1-af32-bee96216ad97", 00:21:07.402 "is_configured": true, 00:21:07.402 "data_offset": 2048, 00:21:07.402 "data_size": 63488 00:21:07.402 }, 00:21:07.402 { 00:21:07.402 "name": "BaseBdev2", 00:21:07.402 
"uuid": "19aa4243-0dc3-44ea-b1ce-f2726221012b", 00:21:07.402 "is_configured": true, 00:21:07.402 "data_offset": 2048, 00:21:07.402 "data_size": 63488 00:21:07.402 }, 00:21:07.402 { 00:21:07.402 "name": "BaseBdev3", 00:21:07.402 "uuid": "b2bc2d98-f0cc-4ebe-9250-747e749db0fd", 00:21:07.402 "is_configured": true, 00:21:07.402 "data_offset": 2048, 00:21:07.402 "data_size": 63488 00:21:07.402 }, 00:21:07.402 { 00:21:07.402 "name": "BaseBdev4", 00:21:07.402 "uuid": "9c6fc37b-20b2-4a74-9695-eb3bfdc40a35", 00:21:07.402 "is_configured": true, 00:21:07.402 "data_offset": 2048, 00:21:07.402 "data_size": 63488 00:21:07.402 } 00:21:07.402 ] 00:21:07.402 } 00:21:07.402 } 00:21:07.402 }' 00:21:07.402 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:07.402 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:21:07.402 BaseBdev2 00:21:07.402 BaseBdev3 00:21:07.402 BaseBdev4' 00:21:07.402 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:07.402 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:21:07.402 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:07.662 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:07.662 "name": "NewBaseBdev", 00:21:07.662 "aliases": [ 00:21:07.662 "24d05e66-9fdd-46d1-af32-bee96216ad97" 00:21:07.662 ], 00:21:07.662 "product_name": "Malloc disk", 00:21:07.662 "block_size": 512, 00:21:07.662 "num_blocks": 65536, 00:21:07.662 "uuid": "24d05e66-9fdd-46d1-af32-bee96216ad97", 00:21:07.662 "assigned_rate_limits": { 00:21:07.662 "rw_ios_per_sec": 0, 00:21:07.662 "rw_mbytes_per_sec": 0, 
00:21:07.662 "r_mbytes_per_sec": 0, 00:21:07.662 "w_mbytes_per_sec": 0 00:21:07.662 }, 00:21:07.662 "claimed": true, 00:21:07.662 "claim_type": "exclusive_write", 00:21:07.662 "zoned": false, 00:21:07.662 "supported_io_types": { 00:21:07.662 "read": true, 00:21:07.662 "write": true, 00:21:07.662 "unmap": true, 00:21:07.662 "flush": true, 00:21:07.662 "reset": true, 00:21:07.662 "nvme_admin": false, 00:21:07.662 "nvme_io": false, 00:21:07.662 "nvme_io_md": false, 00:21:07.662 "write_zeroes": true, 00:21:07.662 "zcopy": true, 00:21:07.662 "get_zone_info": false, 00:21:07.662 "zone_management": false, 00:21:07.662 "zone_append": false, 00:21:07.662 "compare": false, 00:21:07.662 "compare_and_write": false, 00:21:07.662 "abort": true, 00:21:07.662 "seek_hole": false, 00:21:07.662 "seek_data": false, 00:21:07.662 "copy": true, 00:21:07.662 "nvme_iov_md": false 00:21:07.662 }, 00:21:07.662 "memory_domains": [ 00:21:07.662 { 00:21:07.662 "dma_device_id": "system", 00:21:07.662 "dma_device_type": 1 00:21:07.662 }, 00:21:07.662 { 00:21:07.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:07.662 "dma_device_type": 2 00:21:07.662 } 00:21:07.662 ], 00:21:07.662 "driver_specific": {} 00:21:07.662 }' 00:21:07.662 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:07.662 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:07.662 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:07.662 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:07.920 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:07.920 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:07.920 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:07.920 22:49:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:07.920 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:07.920 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:07.920 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:07.920 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:07.920 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:07.920 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:07.920 22:49:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:08.179 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:08.179 "name": "BaseBdev2", 00:21:08.179 "aliases": [ 00:21:08.179 "19aa4243-0dc3-44ea-b1ce-f2726221012b" 00:21:08.179 ], 00:21:08.179 "product_name": "Malloc disk", 00:21:08.179 "block_size": 512, 00:21:08.179 "num_blocks": 65536, 00:21:08.179 "uuid": "19aa4243-0dc3-44ea-b1ce-f2726221012b", 00:21:08.179 "assigned_rate_limits": { 00:21:08.179 "rw_ios_per_sec": 0, 00:21:08.179 "rw_mbytes_per_sec": 0, 00:21:08.179 "r_mbytes_per_sec": 0, 00:21:08.179 "w_mbytes_per_sec": 0 00:21:08.179 }, 00:21:08.179 "claimed": true, 00:21:08.179 "claim_type": "exclusive_write", 00:21:08.179 "zoned": false, 00:21:08.179 "supported_io_types": { 00:21:08.179 "read": true, 00:21:08.179 "write": true, 00:21:08.179 "unmap": true, 00:21:08.179 "flush": true, 00:21:08.179 "reset": true, 00:21:08.179 "nvme_admin": false, 00:21:08.179 "nvme_io": false, 00:21:08.179 "nvme_io_md": false, 00:21:08.179 "write_zeroes": true, 00:21:08.179 "zcopy": true, 00:21:08.179 
"get_zone_info": false, 00:21:08.179 "zone_management": false, 00:21:08.179 "zone_append": false, 00:21:08.179 "compare": false, 00:21:08.179 "compare_and_write": false, 00:21:08.179 "abort": true, 00:21:08.179 "seek_hole": false, 00:21:08.179 "seek_data": false, 00:21:08.179 "copy": true, 00:21:08.179 "nvme_iov_md": false 00:21:08.179 }, 00:21:08.179 "memory_domains": [ 00:21:08.179 { 00:21:08.179 "dma_device_id": "system", 00:21:08.179 "dma_device_type": 1 00:21:08.179 }, 00:21:08.179 { 00:21:08.179 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:08.179 "dma_device_type": 2 00:21:08.179 } 00:21:08.179 ], 00:21:08.179 "driver_specific": {} 00:21:08.179 }' 00:21:08.179 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:08.437 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:08.437 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:08.437 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:08.437 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:08.437 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:08.437 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:08.437 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:08.437 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:08.437 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:08.695 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:08.695 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:08.695 22:49:53 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:08.695 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:08.695 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:08.953 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:08.953 "name": "BaseBdev3", 00:21:08.953 "aliases": [ 00:21:08.953 "b2bc2d98-f0cc-4ebe-9250-747e749db0fd" 00:21:08.953 ], 00:21:08.953 "product_name": "Malloc disk", 00:21:08.953 "block_size": 512, 00:21:08.953 "num_blocks": 65536, 00:21:08.953 "uuid": "b2bc2d98-f0cc-4ebe-9250-747e749db0fd", 00:21:08.953 "assigned_rate_limits": { 00:21:08.953 "rw_ios_per_sec": 0, 00:21:08.953 "rw_mbytes_per_sec": 0, 00:21:08.954 "r_mbytes_per_sec": 0, 00:21:08.954 "w_mbytes_per_sec": 0 00:21:08.954 }, 00:21:08.954 "claimed": true, 00:21:08.954 "claim_type": "exclusive_write", 00:21:08.954 "zoned": false, 00:21:08.954 "supported_io_types": { 00:21:08.954 "read": true, 00:21:08.954 "write": true, 00:21:08.954 "unmap": true, 00:21:08.954 "flush": true, 00:21:08.954 "reset": true, 00:21:08.954 "nvme_admin": false, 00:21:08.954 "nvme_io": false, 00:21:08.954 "nvme_io_md": false, 00:21:08.954 "write_zeroes": true, 00:21:08.954 "zcopy": true, 00:21:08.954 "get_zone_info": false, 00:21:08.954 "zone_management": false, 00:21:08.954 "zone_append": false, 00:21:08.954 "compare": false, 00:21:08.954 "compare_and_write": false, 00:21:08.954 "abort": true, 00:21:08.954 "seek_hole": false, 00:21:08.954 "seek_data": false, 00:21:08.954 "copy": true, 00:21:08.954 "nvme_iov_md": false 00:21:08.954 }, 00:21:08.954 "memory_domains": [ 00:21:08.954 { 00:21:08.954 "dma_device_id": "system", 00:21:08.954 "dma_device_type": 1 00:21:08.954 }, 00:21:08.954 { 00:21:08.954 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:08.954 
"dma_device_type": 2 00:21:08.954 } 00:21:08.954 ], 00:21:08.954 "driver_specific": {} 00:21:08.954 }' 00:21:08.954 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:08.954 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:08.954 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:08.954 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:08.954 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:08.954 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:08.954 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:09.212 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:09.212 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:09.212 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:09.212 22:49:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:09.212 22:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:09.212 22:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:09.212 22:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:09.212 22:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:09.471 22:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:09.471 "name": "BaseBdev4", 00:21:09.471 "aliases": [ 00:21:09.471 
"9c6fc37b-20b2-4a74-9695-eb3bfdc40a35" 00:21:09.471 ], 00:21:09.471 "product_name": "Malloc disk", 00:21:09.471 "block_size": 512, 00:21:09.471 "num_blocks": 65536, 00:21:09.471 "uuid": "9c6fc37b-20b2-4a74-9695-eb3bfdc40a35", 00:21:09.471 "assigned_rate_limits": { 00:21:09.471 "rw_ios_per_sec": 0, 00:21:09.471 "rw_mbytes_per_sec": 0, 00:21:09.471 "r_mbytes_per_sec": 0, 00:21:09.471 "w_mbytes_per_sec": 0 00:21:09.471 }, 00:21:09.471 "claimed": true, 00:21:09.471 "claim_type": "exclusive_write", 00:21:09.471 "zoned": false, 00:21:09.471 "supported_io_types": { 00:21:09.471 "read": true, 00:21:09.471 "write": true, 00:21:09.471 "unmap": true, 00:21:09.471 "flush": true, 00:21:09.471 "reset": true, 00:21:09.471 "nvme_admin": false, 00:21:09.471 "nvme_io": false, 00:21:09.471 "nvme_io_md": false, 00:21:09.471 "write_zeroes": true, 00:21:09.471 "zcopy": true, 00:21:09.471 "get_zone_info": false, 00:21:09.471 "zone_management": false, 00:21:09.471 "zone_append": false, 00:21:09.471 "compare": false, 00:21:09.471 "compare_and_write": false, 00:21:09.471 "abort": true, 00:21:09.471 "seek_hole": false, 00:21:09.471 "seek_data": false, 00:21:09.471 "copy": true, 00:21:09.471 "nvme_iov_md": false 00:21:09.471 }, 00:21:09.471 "memory_domains": [ 00:21:09.471 { 00:21:09.471 "dma_device_id": "system", 00:21:09.471 "dma_device_type": 1 00:21:09.471 }, 00:21:09.471 { 00:21:09.471 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:09.471 "dma_device_type": 2 00:21:09.471 } 00:21:09.471 ], 00:21:09.471 "driver_specific": {} 00:21:09.471 }' 00:21:09.471 22:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:09.471 22:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:09.471 22:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:09.471 22:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:09.730 22:49:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:09.730 22:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:09.730 22:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:09.730 22:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:09.730 22:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:09.730 22:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:09.730 22:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:09.989 22:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:09.989 22:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:09.989 [2024-07-15 22:49:54.870479] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:09.989 [2024-07-15 22:49:54.870508] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:09.989 [2024-07-15 22:49:54.870570] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:09.989 [2024-07-15 22:49:54.870634] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:09.989 [2024-07-15 22:49:54.870646] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25f3850 name Existed_Raid, state offline 00:21:09.989 22:49:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2781316 00:21:09.989 22:49:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2781316 ']' 00:21:09.989 22:49:54 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@952 -- # kill -0 2781316 00:21:09.989 22:49:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:21:09.989 22:49:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:09.989 22:49:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2781316 00:21:10.248 22:49:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:10.248 22:49:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:10.248 22:49:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2781316' 00:21:10.248 killing process with pid 2781316 00:21:10.248 22:49:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2781316 00:21:10.248 [2024-07-15 22:49:54.936999] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:10.248 22:49:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2781316 00:21:10.248 [2024-07-15 22:49:54.980469] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:10.507 22:49:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:21:10.507 00:21:10.507 real 0m36.819s 00:21:10.507 user 1m7.722s 00:21:10.507 sys 0m6.361s 00:21:10.507 22:49:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:10.507 22:49:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:10.507 ************************************ 00:21:10.507 END TEST raid_state_function_test_sb 00:21:10.507 ************************************ 00:21:10.507 22:49:55 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:10.507 22:49:55 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test 
raid_superblock_test raid_superblock_test concat 4 00:21:10.507 22:49:55 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:21:10.507 22:49:55 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:10.507 22:49:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:10.507 ************************************ 00:21:10.507 START TEST raid_superblock_test 00:21:10.507 ************************************ 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 4 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2786710 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2786710 /var/tmp/spdk-raid.sock 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2786710 ']' 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:10.507 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:10.507 22:49:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:10.507 [2024-07-15 22:49:55.360436] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:21:10.507 [2024-07-15 22:49:55.360511] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2786710 ] 00:21:10.766 [2024-07-15 22:49:55.493225] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:10.766 [2024-07-15 22:49:55.600121] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:10.766 [2024-07-15 22:49:55.659606] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:10.766 [2024-07-15 22:49:55.659633] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:11.701 22:49:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:11.701 22:49:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:21:11.701 22:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:21:11.701 22:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:11.701 22:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:21:11.701 22:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:21:11.701 22:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:21:11.701 22:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:11.701 22:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:11.701 22:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:11.701 22:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:21:11.701 malloc1 00:21:11.701 22:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:11.960 [2024-07-15 22:49:56.780304] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:11.960 [2024-07-15 22:49:56.780355] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:11.960 [2024-07-15 22:49:56.780373] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cde570 00:21:11.960 [2024-07-15 22:49:56.780385] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:11.960 [2024-07-15 22:49:56.781973] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:11.960 [2024-07-15 22:49:56.782002] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:11.960 pt1 00:21:11.960 22:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:11.960 22:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:11.960 22:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:21:11.960 22:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:21:11.960 22:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:21:11.960 22:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:11.960 22:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:11.960 22:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:11.960 22:49:56 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:21:12.218 malloc2 00:21:12.218 22:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:12.477 [2024-07-15 22:49:57.278353] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:12.477 [2024-07-15 22:49:57.278410] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:12.477 [2024-07-15 22:49:57.278427] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cdf970 00:21:12.477 [2024-07-15 22:49:57.278440] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:12.477 [2024-07-15 22:49:57.280062] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:12.477 [2024-07-15 22:49:57.280093] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:12.477 pt2 00:21:12.477 22:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:12.477 22:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:12.477 22:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:21:12.477 22:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:21:12.477 22:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:21:12.477 22:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:12.477 22:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:12.477 22:49:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:12.477 22:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:21:12.735 malloc3 00:21:12.735 22:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:12.994 [2024-07-15 22:49:57.808444] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:12.994 [2024-07-15 22:49:57.808496] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:12.994 [2024-07-15 22:49:57.808513] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e76340 00:21:12.994 [2024-07-15 22:49:57.808525] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:12.994 [2024-07-15 22:49:57.810015] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:12.994 [2024-07-15 22:49:57.810045] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:12.994 pt3 00:21:12.994 22:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:12.994 22:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:12.994 22:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:21:12.994 22:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:21:12.994 22:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:21:12.994 22:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:12.994 
22:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:12.994 22:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:12.994 22:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:21:13.253 malloc4 00:21:13.253 22:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:13.512 [2024-07-15 22:49:58.326512] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:13.512 [2024-07-15 22:49:58.326557] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:13.512 [2024-07-15 22:49:58.326577] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e78c60 00:21:13.512 [2024-07-15 22:49:58.326589] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:13.512 [2024-07-15 22:49:58.328048] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:13.512 [2024-07-15 22:49:58.328077] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:13.512 pt4 00:21:13.512 22:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:13.512 22:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:13.512 22:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:21:13.771 [2024-07-15 22:49:58.575189] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:21:13.771 [2024-07-15 22:49:58.576371] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:13.771 [2024-07-15 22:49:58.576427] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:13.771 [2024-07-15 22:49:58.576470] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:13.771 [2024-07-15 22:49:58.576633] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cd6530 00:21:13.771 [2024-07-15 22:49:58.576644] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:13.771 [2024-07-15 22:49:58.576826] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cd4770 00:21:13.771 [2024-07-15 22:49:58.576978] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cd6530 00:21:13.771 [2024-07-15 22:49:58.576988] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1cd6530 00:21:13.771 [2024-07-15 22:49:58.577080] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:13.771 22:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:13.771 22:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:13.771 22:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:13.771 22:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:13.771 22:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:13.771 22:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:13.771 22:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:13.771 22:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:21:13.771 22:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:13.771 22:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:13.771 22:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:13.771 22:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:14.031 22:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:14.031 "name": "raid_bdev1", 00:21:14.031 "uuid": "57f01504-27e2-40e1-9ade-e2563dcc2289", 00:21:14.031 "strip_size_kb": 64, 00:21:14.031 "state": "online", 00:21:14.031 "raid_level": "concat", 00:21:14.031 "superblock": true, 00:21:14.031 "num_base_bdevs": 4, 00:21:14.031 "num_base_bdevs_discovered": 4, 00:21:14.031 "num_base_bdevs_operational": 4, 00:21:14.031 "base_bdevs_list": [ 00:21:14.031 { 00:21:14.031 "name": "pt1", 00:21:14.031 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:14.031 "is_configured": true, 00:21:14.031 "data_offset": 2048, 00:21:14.031 "data_size": 63488 00:21:14.031 }, 00:21:14.031 { 00:21:14.031 "name": "pt2", 00:21:14.031 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:14.031 "is_configured": true, 00:21:14.031 "data_offset": 2048, 00:21:14.031 "data_size": 63488 00:21:14.031 }, 00:21:14.031 { 00:21:14.031 "name": "pt3", 00:21:14.031 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:14.031 "is_configured": true, 00:21:14.031 "data_offset": 2048, 00:21:14.031 "data_size": 63488 00:21:14.031 }, 00:21:14.031 { 00:21:14.031 "name": "pt4", 00:21:14.031 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:14.031 "is_configured": true, 00:21:14.031 "data_offset": 2048, 00:21:14.031 "data_size": 63488 00:21:14.031 } 00:21:14.031 ] 00:21:14.031 }' 00:21:14.031 22:49:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:14.031 22:49:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:14.598 22:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:21:14.598 22:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:14.598 22:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:14.598 22:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:14.598 22:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:14.598 22:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:14.598 22:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:14.598 22:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:14.598 [2024-07-15 22:49:59.493902] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:14.924 22:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:14.925 "name": "raid_bdev1", 00:21:14.925 "aliases": [ 00:21:14.925 "57f01504-27e2-40e1-9ade-e2563dcc2289" 00:21:14.925 ], 00:21:14.925 "product_name": "Raid Volume", 00:21:14.925 "block_size": 512, 00:21:14.925 "num_blocks": 253952, 00:21:14.925 "uuid": "57f01504-27e2-40e1-9ade-e2563dcc2289", 00:21:14.925 "assigned_rate_limits": { 00:21:14.925 "rw_ios_per_sec": 0, 00:21:14.925 "rw_mbytes_per_sec": 0, 00:21:14.925 "r_mbytes_per_sec": 0, 00:21:14.925 "w_mbytes_per_sec": 0 00:21:14.925 }, 00:21:14.925 "claimed": false, 00:21:14.925 "zoned": false, 00:21:14.925 "supported_io_types": { 00:21:14.925 "read": true, 00:21:14.925 "write": true, 00:21:14.925 
"unmap": true, 00:21:14.925 "flush": true, 00:21:14.925 "reset": true, 00:21:14.925 "nvme_admin": false, 00:21:14.925 "nvme_io": false, 00:21:14.925 "nvme_io_md": false, 00:21:14.925 "write_zeroes": true, 00:21:14.925 "zcopy": false, 00:21:14.925 "get_zone_info": false, 00:21:14.925 "zone_management": false, 00:21:14.925 "zone_append": false, 00:21:14.925 "compare": false, 00:21:14.925 "compare_and_write": false, 00:21:14.925 "abort": false, 00:21:14.925 "seek_hole": false, 00:21:14.925 "seek_data": false, 00:21:14.925 "copy": false, 00:21:14.925 "nvme_iov_md": false 00:21:14.925 }, 00:21:14.925 "memory_domains": [ 00:21:14.925 { 00:21:14.925 "dma_device_id": "system", 00:21:14.925 "dma_device_type": 1 00:21:14.925 }, 00:21:14.925 { 00:21:14.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:14.925 "dma_device_type": 2 00:21:14.925 }, 00:21:14.925 { 00:21:14.925 "dma_device_id": "system", 00:21:14.925 "dma_device_type": 1 00:21:14.925 }, 00:21:14.925 { 00:21:14.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:14.925 "dma_device_type": 2 00:21:14.925 }, 00:21:14.925 { 00:21:14.925 "dma_device_id": "system", 00:21:14.925 "dma_device_type": 1 00:21:14.925 }, 00:21:14.925 { 00:21:14.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:14.925 "dma_device_type": 2 00:21:14.925 }, 00:21:14.925 { 00:21:14.925 "dma_device_id": "system", 00:21:14.925 "dma_device_type": 1 00:21:14.925 }, 00:21:14.925 { 00:21:14.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:14.925 "dma_device_type": 2 00:21:14.925 } 00:21:14.925 ], 00:21:14.925 "driver_specific": { 00:21:14.925 "raid": { 00:21:14.925 "uuid": "57f01504-27e2-40e1-9ade-e2563dcc2289", 00:21:14.925 "strip_size_kb": 64, 00:21:14.925 "state": "online", 00:21:14.925 "raid_level": "concat", 00:21:14.925 "superblock": true, 00:21:14.925 "num_base_bdevs": 4, 00:21:14.925 "num_base_bdevs_discovered": 4, 00:21:14.925 "num_base_bdevs_operational": 4, 00:21:14.925 "base_bdevs_list": [ 00:21:14.925 { 00:21:14.925 "name": "pt1", 
00:21:14.925 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:14.925 "is_configured": true, 00:21:14.925 "data_offset": 2048, 00:21:14.925 "data_size": 63488 00:21:14.925 }, 00:21:14.925 { 00:21:14.925 "name": "pt2", 00:21:14.925 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:14.925 "is_configured": true, 00:21:14.925 "data_offset": 2048, 00:21:14.925 "data_size": 63488 00:21:14.925 }, 00:21:14.925 { 00:21:14.925 "name": "pt3", 00:21:14.925 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:14.925 "is_configured": true, 00:21:14.925 "data_offset": 2048, 00:21:14.925 "data_size": 63488 00:21:14.925 }, 00:21:14.925 { 00:21:14.925 "name": "pt4", 00:21:14.925 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:14.925 "is_configured": true, 00:21:14.925 "data_offset": 2048, 00:21:14.925 "data_size": 63488 00:21:14.925 } 00:21:14.925 ] 00:21:14.925 } 00:21:14.925 } 00:21:14.925 }' 00:21:14.925 22:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:14.925 22:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:14.925 pt2 00:21:14.925 pt3 00:21:14.925 pt4' 00:21:14.925 22:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:14.925 22:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:14.925 22:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:14.925 22:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:14.925 "name": "pt1", 00:21:14.925 "aliases": [ 00:21:14.925 "00000000-0000-0000-0000-000000000001" 00:21:14.925 ], 00:21:14.925 "product_name": "passthru", 00:21:14.925 "block_size": 512, 00:21:14.925 "num_blocks": 65536, 00:21:14.925 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:21:14.925 "assigned_rate_limits": { 00:21:14.925 "rw_ios_per_sec": 0, 00:21:14.925 "rw_mbytes_per_sec": 0, 00:21:14.925 "r_mbytes_per_sec": 0, 00:21:14.925 "w_mbytes_per_sec": 0 00:21:14.925 }, 00:21:14.925 "claimed": true, 00:21:14.925 "claim_type": "exclusive_write", 00:21:14.925 "zoned": false, 00:21:14.925 "supported_io_types": { 00:21:14.925 "read": true, 00:21:14.925 "write": true, 00:21:14.925 "unmap": true, 00:21:14.925 "flush": true, 00:21:14.925 "reset": true, 00:21:14.925 "nvme_admin": false, 00:21:14.925 "nvme_io": false, 00:21:14.925 "nvme_io_md": false, 00:21:14.925 "write_zeroes": true, 00:21:14.925 "zcopy": true, 00:21:14.925 "get_zone_info": false, 00:21:14.925 "zone_management": false, 00:21:14.925 "zone_append": false, 00:21:14.925 "compare": false, 00:21:14.925 "compare_and_write": false, 00:21:14.925 "abort": true, 00:21:14.925 "seek_hole": false, 00:21:14.925 "seek_data": false, 00:21:14.925 "copy": true, 00:21:14.925 "nvme_iov_md": false 00:21:14.925 }, 00:21:14.925 "memory_domains": [ 00:21:14.925 { 00:21:14.925 "dma_device_id": "system", 00:21:14.925 "dma_device_type": 1 00:21:14.925 }, 00:21:14.925 { 00:21:14.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:14.925 "dma_device_type": 2 00:21:14.925 } 00:21:14.925 ], 00:21:14.925 "driver_specific": { 00:21:14.925 "passthru": { 00:21:14.925 "name": "pt1", 00:21:14.925 "base_bdev_name": "malloc1" 00:21:14.925 } 00:21:14.925 } 00:21:14.925 }' 00:21:14.925 22:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:14.925 22:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:15.185 22:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:15.185 22:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:15.185 22:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:15.185 22:49:59 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:15.185 22:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:15.185 22:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:15.185 22:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:15.185 22:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:15.185 22:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:15.185 22:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:15.185 22:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:15.442 22:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:15.442 22:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:15.442 22:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:15.442 "name": "pt2", 00:21:15.442 "aliases": [ 00:21:15.442 "00000000-0000-0000-0000-000000000002" 00:21:15.442 ], 00:21:15.442 "product_name": "passthru", 00:21:15.442 "block_size": 512, 00:21:15.442 "num_blocks": 65536, 00:21:15.442 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:15.442 "assigned_rate_limits": { 00:21:15.442 "rw_ios_per_sec": 0, 00:21:15.442 "rw_mbytes_per_sec": 0, 00:21:15.442 "r_mbytes_per_sec": 0, 00:21:15.442 "w_mbytes_per_sec": 0 00:21:15.442 }, 00:21:15.442 "claimed": true, 00:21:15.442 "claim_type": "exclusive_write", 00:21:15.442 "zoned": false, 00:21:15.442 "supported_io_types": { 00:21:15.442 "read": true, 00:21:15.442 "write": true, 00:21:15.442 "unmap": true, 00:21:15.442 "flush": true, 00:21:15.442 "reset": true, 00:21:15.442 "nvme_admin": false, 00:21:15.442 
"nvme_io": false, 00:21:15.442 "nvme_io_md": false, 00:21:15.442 "write_zeroes": true, 00:21:15.442 "zcopy": true, 00:21:15.442 "get_zone_info": false, 00:21:15.442 "zone_management": false, 00:21:15.442 "zone_append": false, 00:21:15.442 "compare": false, 00:21:15.442 "compare_and_write": false, 00:21:15.442 "abort": true, 00:21:15.442 "seek_hole": false, 00:21:15.442 "seek_data": false, 00:21:15.442 "copy": true, 00:21:15.442 "nvme_iov_md": false 00:21:15.442 }, 00:21:15.442 "memory_domains": [ 00:21:15.442 { 00:21:15.442 "dma_device_id": "system", 00:21:15.442 "dma_device_type": 1 00:21:15.442 }, 00:21:15.442 { 00:21:15.442 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:15.442 "dma_device_type": 2 00:21:15.442 } 00:21:15.442 ], 00:21:15.442 "driver_specific": { 00:21:15.442 "passthru": { 00:21:15.442 "name": "pt2", 00:21:15.442 "base_bdev_name": "malloc2" 00:21:15.442 } 00:21:15.442 } 00:21:15.442 }' 00:21:15.442 22:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:15.700 22:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:15.700 22:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:15.700 22:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:15.700 22:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:15.700 22:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:15.700 22:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:15.700 22:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:15.700 22:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:15.700 22:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:15.958 22:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:21:15.958 22:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:15.958 22:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:15.959 22:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:15.959 22:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:16.217 22:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:16.217 "name": "pt3", 00:21:16.217 "aliases": [ 00:21:16.217 "00000000-0000-0000-0000-000000000003" 00:21:16.217 ], 00:21:16.217 "product_name": "passthru", 00:21:16.217 "block_size": 512, 00:21:16.217 "num_blocks": 65536, 00:21:16.217 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:16.217 "assigned_rate_limits": { 00:21:16.217 "rw_ios_per_sec": 0, 00:21:16.217 "rw_mbytes_per_sec": 0, 00:21:16.217 "r_mbytes_per_sec": 0, 00:21:16.217 "w_mbytes_per_sec": 0 00:21:16.217 }, 00:21:16.217 "claimed": true, 00:21:16.217 "claim_type": "exclusive_write", 00:21:16.217 "zoned": false, 00:21:16.217 "supported_io_types": { 00:21:16.217 "read": true, 00:21:16.217 "write": true, 00:21:16.217 "unmap": true, 00:21:16.217 "flush": true, 00:21:16.217 "reset": true, 00:21:16.217 "nvme_admin": false, 00:21:16.217 "nvme_io": false, 00:21:16.217 "nvme_io_md": false, 00:21:16.217 "write_zeroes": true, 00:21:16.217 "zcopy": true, 00:21:16.217 "get_zone_info": false, 00:21:16.217 "zone_management": false, 00:21:16.217 "zone_append": false, 00:21:16.217 "compare": false, 00:21:16.217 "compare_and_write": false, 00:21:16.217 "abort": true, 00:21:16.217 "seek_hole": false, 00:21:16.217 "seek_data": false, 00:21:16.217 "copy": true, 00:21:16.217 "nvme_iov_md": false 00:21:16.217 }, 00:21:16.217 "memory_domains": [ 00:21:16.217 { 00:21:16.217 "dma_device_id": "system", 00:21:16.217 
"dma_device_type": 1 00:21:16.217 }, 00:21:16.217 { 00:21:16.217 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:16.217 "dma_device_type": 2 00:21:16.217 } 00:21:16.217 ], 00:21:16.217 "driver_specific": { 00:21:16.217 "passthru": { 00:21:16.217 "name": "pt3", 00:21:16.217 "base_bdev_name": "malloc3" 00:21:16.217 } 00:21:16.217 } 00:21:16.217 }' 00:21:16.217 22:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:16.217 22:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:16.217 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:16.217 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:16.217 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:16.476 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:16.476 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:16.476 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:16.476 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:16.476 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:16.476 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:16.476 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:16.476 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:16.476 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:21:16.476 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:16.734 22:50:01 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:16.734 "name": "pt4", 00:21:16.734 "aliases": [ 00:21:16.734 "00000000-0000-0000-0000-000000000004" 00:21:16.734 ], 00:21:16.734 "product_name": "passthru", 00:21:16.734 "block_size": 512, 00:21:16.734 "num_blocks": 65536, 00:21:16.734 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:16.734 "assigned_rate_limits": { 00:21:16.734 "rw_ios_per_sec": 0, 00:21:16.734 "rw_mbytes_per_sec": 0, 00:21:16.734 "r_mbytes_per_sec": 0, 00:21:16.734 "w_mbytes_per_sec": 0 00:21:16.735 }, 00:21:16.735 "claimed": true, 00:21:16.735 "claim_type": "exclusive_write", 00:21:16.735 "zoned": false, 00:21:16.735 "supported_io_types": { 00:21:16.735 "read": true, 00:21:16.735 "write": true, 00:21:16.735 "unmap": true, 00:21:16.735 "flush": true, 00:21:16.735 "reset": true, 00:21:16.735 "nvme_admin": false, 00:21:16.735 "nvme_io": false, 00:21:16.735 "nvme_io_md": false, 00:21:16.735 "write_zeroes": true, 00:21:16.735 "zcopy": true, 00:21:16.735 "get_zone_info": false, 00:21:16.735 "zone_management": false, 00:21:16.735 "zone_append": false, 00:21:16.735 "compare": false, 00:21:16.735 "compare_and_write": false, 00:21:16.735 "abort": true, 00:21:16.735 "seek_hole": false, 00:21:16.735 "seek_data": false, 00:21:16.735 "copy": true, 00:21:16.735 "nvme_iov_md": false 00:21:16.735 }, 00:21:16.735 "memory_domains": [ 00:21:16.735 { 00:21:16.735 "dma_device_id": "system", 00:21:16.735 "dma_device_type": 1 00:21:16.735 }, 00:21:16.735 { 00:21:16.735 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:16.735 "dma_device_type": 2 00:21:16.735 } 00:21:16.735 ], 00:21:16.735 "driver_specific": { 00:21:16.735 "passthru": { 00:21:16.735 "name": "pt4", 00:21:16.735 "base_bdev_name": "malloc4" 00:21:16.735 } 00:21:16.735 } 00:21:16.735 }' 00:21:16.735 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:16.993 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:16.993 22:50:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:16.993 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:16.993 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:16.993 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:16.993 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:16.993 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:16.993 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:16.993 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:17.251 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:17.251 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:17.252 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:17.252 22:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:21:17.510 [2024-07-15 22:50:02.177025] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:17.510 22:50:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=57f01504-27e2-40e1-9ade-e2563dcc2289 00:21:17.510 22:50:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 57f01504-27e2-40e1-9ade-e2563dcc2289 ']' 00:21:17.510 22:50:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:17.769 [2024-07-15 22:50:02.437415] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:17.769 
[2024-07-15 22:50:02.437436] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:17.769 [2024-07-15 22:50:02.437489] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:17.769 [2024-07-15 22:50:02.437554] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:17.769 [2024-07-15 22:50:02.437566] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cd6530 name raid_bdev1, state offline 00:21:17.769 22:50:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:17.769 22:50:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:21:18.027 22:50:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:21:18.027 22:50:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:21:18.027 22:50:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:18.027 22:50:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:18.286 22:50:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:18.286 22:50:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:18.287 22:50:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:18.287 22:50:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:21:18.545 22:50:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:18.545 22:50:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:21:18.805 22:50:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:21:18.805 22:50:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:21:19.064 22:50:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:21:19.064 22:50:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:19.064 22:50:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:21:19.064 22:50:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:19.064 22:50:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:19.064 22:50:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:19.064 22:50:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:19.323 22:50:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:19.323 22:50:03 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:19.323 22:50:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:19.323 22:50:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:19.323 22:50:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:19.323 22:50:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:19.323 [2024-07-15 22:50:04.202066] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:21:19.323 [2024-07-15 22:50:04.203429] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:21:19.323 [2024-07-15 22:50:04.203472] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:21:19.323 [2024-07-15 22:50:04.203506] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:21:19.323 [2024-07-15 22:50:04.203552] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:21:19.323 [2024-07-15 22:50:04.203594] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:21:19.323 [2024-07-15 22:50:04.203617] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:21:19.323 [2024-07-15 22:50:04.203638] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:21:19.323 
[2024-07-15 22:50:04.203655] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:19.323 [2024-07-15 22:50:04.203667] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e81ff0 name raid_bdev1, state configuring 00:21:19.323 request: 00:21:19.323 { 00:21:19.323 "name": "raid_bdev1", 00:21:19.323 "raid_level": "concat", 00:21:19.323 "base_bdevs": [ 00:21:19.323 "malloc1", 00:21:19.323 "malloc2", 00:21:19.323 "malloc3", 00:21:19.323 "malloc4" 00:21:19.323 ], 00:21:19.323 "strip_size_kb": 64, 00:21:19.323 "superblock": false, 00:21:19.323 "method": "bdev_raid_create", 00:21:19.323 "req_id": 1 00:21:19.323 } 00:21:19.323 Got JSON-RPC error response 00:21:19.323 response: 00:21:19.323 { 00:21:19.323 "code": -17, 00:21:19.323 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:21:19.323 } 00:21:19.581 22:50:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:21:19.581 22:50:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:21:19.581 22:50:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:21:19.581 22:50:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:21:19.581 22:50:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.581 22:50:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:21:19.581 22:50:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:21:19.581 22:50:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:21:19.581 22:50:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:21:19.840 [2024-07-15 22:50:04.711341] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:19.840 [2024-07-15 22:50:04.711385] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:19.840 [2024-07-15 22:50:04.711405] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cde7a0 00:21:19.840 [2024-07-15 22:50:04.711417] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:19.840 [2024-07-15 22:50:04.713003] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:19.840 [2024-07-15 22:50:04.713030] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:19.840 [2024-07-15 22:50:04.713096] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:19.840 [2024-07-15 22:50:04.713122] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:19.840 pt1 00:21:19.840 22:50:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:21:19.840 22:50:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:19.840 22:50:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:19.840 22:50:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:19.840 22:50:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:19.840 22:50:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:19.840 22:50:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:19.840 22:50:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:19.840 22:50:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:21:19.840 22:50:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:19.840 22:50:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.840 22:50:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:20.098 22:50:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:20.098 "name": "raid_bdev1", 00:21:20.098 "uuid": "57f01504-27e2-40e1-9ade-e2563dcc2289", 00:21:20.098 "strip_size_kb": 64, 00:21:20.098 "state": "configuring", 00:21:20.098 "raid_level": "concat", 00:21:20.098 "superblock": true, 00:21:20.098 "num_base_bdevs": 4, 00:21:20.098 "num_base_bdevs_discovered": 1, 00:21:20.098 "num_base_bdevs_operational": 4, 00:21:20.098 "base_bdevs_list": [ 00:21:20.098 { 00:21:20.098 "name": "pt1", 00:21:20.098 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:20.098 "is_configured": true, 00:21:20.098 "data_offset": 2048, 00:21:20.098 "data_size": 63488 00:21:20.098 }, 00:21:20.098 { 00:21:20.098 "name": null, 00:21:20.098 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:20.098 "is_configured": false, 00:21:20.098 "data_offset": 2048, 00:21:20.098 "data_size": 63488 00:21:20.099 }, 00:21:20.099 { 00:21:20.099 "name": null, 00:21:20.099 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:20.099 "is_configured": false, 00:21:20.099 "data_offset": 2048, 00:21:20.099 "data_size": 63488 00:21:20.099 }, 00:21:20.099 { 00:21:20.099 "name": null, 00:21:20.099 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:20.099 "is_configured": false, 00:21:20.099 "data_offset": 2048, 00:21:20.099 "data_size": 63488 00:21:20.099 } 00:21:20.099 ] 00:21:20.099 }' 00:21:20.099 22:50:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:20.099 22:50:04 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:21.035 22:50:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:21:21.035 22:50:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:21.035 [2024-07-15 22:50:05.858554] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:21.035 [2024-07-15 22:50:05.858605] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:21.035 [2024-07-15 22:50:05.858625] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cd5ea0 00:21:21.035 [2024-07-15 22:50:05.858637] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:21.035 [2024-07-15 22:50:05.858997] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:21.035 [2024-07-15 22:50:05.859017] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:21.035 [2024-07-15 22:50:05.859084] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:21.035 [2024-07-15 22:50:05.859103] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:21.035 pt2 00:21:21.035 22:50:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:21.294 [2024-07-15 22:50:06.099193] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:21:21.294 22:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:21:21.294 22:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:21.294 22:50:06 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:21.294 22:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:21.294 22:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:21.294 22:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:21.294 22:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:21.294 22:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:21.294 22:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:21.294 22:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:21.294 22:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:21.294 22:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:21.553 22:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:21.553 "name": "raid_bdev1", 00:21:21.553 "uuid": "57f01504-27e2-40e1-9ade-e2563dcc2289", 00:21:21.553 "strip_size_kb": 64, 00:21:21.553 "state": "configuring", 00:21:21.553 "raid_level": "concat", 00:21:21.553 "superblock": true, 00:21:21.553 "num_base_bdevs": 4, 00:21:21.553 "num_base_bdevs_discovered": 1, 00:21:21.553 "num_base_bdevs_operational": 4, 00:21:21.553 "base_bdevs_list": [ 00:21:21.553 { 00:21:21.554 "name": "pt1", 00:21:21.554 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:21.554 "is_configured": true, 00:21:21.554 "data_offset": 2048, 00:21:21.554 "data_size": 63488 00:21:21.554 }, 00:21:21.554 { 00:21:21.554 "name": null, 00:21:21.554 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:21.554 
"is_configured": false, 00:21:21.554 "data_offset": 2048, 00:21:21.554 "data_size": 63488 00:21:21.554 }, 00:21:21.554 { 00:21:21.554 "name": null, 00:21:21.554 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:21.554 "is_configured": false, 00:21:21.554 "data_offset": 2048, 00:21:21.554 "data_size": 63488 00:21:21.554 }, 00:21:21.554 { 00:21:21.554 "name": null, 00:21:21.554 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:21.554 "is_configured": false, 00:21:21.554 "data_offset": 2048, 00:21:21.554 "data_size": 63488 00:21:21.554 } 00:21:21.554 ] 00:21:21.554 }' 00:21:21.554 22:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:21.554 22:50:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:22.120 22:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:21:22.121 22:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:22.121 22:50:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:22.380 [2024-07-15 22:50:07.250248] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:22.380 [2024-07-15 22:50:07.250299] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:22.380 [2024-07-15 22:50:07.250318] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cd4ec0 00:21:22.380 [2024-07-15 22:50:07.250331] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:22.380 [2024-07-15 22:50:07.250678] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:22.380 [2024-07-15 22:50:07.250699] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:22.380 [2024-07-15 22:50:07.250763] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:22.380 [2024-07-15 22:50:07.250782] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:22.380 pt2 00:21:22.380 22:50:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:22.380 22:50:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:22.380 22:50:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:22.639 [2024-07-15 22:50:07.539021] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:22.639 [2024-07-15 22:50:07.539064] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:22.640 [2024-07-15 22:50:07.539081] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cd50f0 00:21:22.640 [2024-07-15 22:50:07.539093] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:22.640 [2024-07-15 22:50:07.539413] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:22.640 [2024-07-15 22:50:07.539432] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:22.640 [2024-07-15 22:50:07.539499] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:21:22.640 [2024-07-15 22:50:07.539518] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:22.640 pt3 00:21:22.898 22:50:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:22.898 22:50:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:22.898 22:50:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:23.157 [2024-07-15 22:50:08.040359] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:23.157 [2024-07-15 22:50:08.040399] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:23.157 [2024-07-15 22:50:08.040415] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cddaf0 00:21:23.157 [2024-07-15 22:50:08.040427] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:23.157 [2024-07-15 22:50:08.040746] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:23.157 [2024-07-15 22:50:08.040765] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:23.157 [2024-07-15 22:50:08.040820] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:21:23.157 [2024-07-15 22:50:08.040838] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:23.158 [2024-07-15 22:50:08.040971] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cd78f0 00:21:23.158 [2024-07-15 22:50:08.040982] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:23.158 [2024-07-15 22:50:08.041149] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cd7150 00:21:23.158 [2024-07-15 22:50:08.041280] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cd78f0 00:21:23.158 [2024-07-15 22:50:08.041290] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1cd78f0 00:21:23.158 [2024-07-15 22:50:08.041386] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:23.158 pt4 00:21:23.416 22:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 
00:21:23.416 22:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:23.416 22:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:23.416 22:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:23.416 22:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:23.416 22:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:23.416 22:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:23.416 22:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:23.416 22:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:23.416 22:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:23.416 22:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:23.416 22:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:23.416 22:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:23.416 22:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:23.984 22:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:23.984 "name": "raid_bdev1", 00:21:23.984 "uuid": "57f01504-27e2-40e1-9ade-e2563dcc2289", 00:21:23.984 "strip_size_kb": 64, 00:21:23.984 "state": "online", 00:21:23.984 "raid_level": "concat", 00:21:23.984 "superblock": true, 00:21:23.984 "num_base_bdevs": 4, 00:21:23.984 "num_base_bdevs_discovered": 4, 00:21:23.984 "num_base_bdevs_operational": 4, 
00:21:23.984 "base_bdevs_list": [ 00:21:23.984 { 00:21:23.984 "name": "pt1", 00:21:23.984 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:23.984 "is_configured": true, 00:21:23.984 "data_offset": 2048, 00:21:23.984 "data_size": 63488 00:21:23.984 }, 00:21:23.984 { 00:21:23.984 "name": "pt2", 00:21:23.984 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:23.984 "is_configured": true, 00:21:23.984 "data_offset": 2048, 00:21:23.984 "data_size": 63488 00:21:23.984 }, 00:21:23.984 { 00:21:23.984 "name": "pt3", 00:21:23.984 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:23.984 "is_configured": true, 00:21:23.984 "data_offset": 2048, 00:21:23.984 "data_size": 63488 00:21:23.984 }, 00:21:23.984 { 00:21:23.984 "name": "pt4", 00:21:23.984 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:23.984 "is_configured": true, 00:21:23.984 "data_offset": 2048, 00:21:23.984 "data_size": 63488 00:21:23.984 } 00:21:23.984 ] 00:21:23.984 }' 00:21:23.984 22:50:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:23.984 22:50:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:24.549 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:21:24.549 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:24.549 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:24.549 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:24.549 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:24.549 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:24.549 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:24.549 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:24.549 [2024-07-15 22:50:09.336122] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:24.549 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:24.549 "name": "raid_bdev1", 00:21:24.549 "aliases": [ 00:21:24.549 "57f01504-27e2-40e1-9ade-e2563dcc2289" 00:21:24.549 ], 00:21:24.549 "product_name": "Raid Volume", 00:21:24.549 "block_size": 512, 00:21:24.549 "num_blocks": 253952, 00:21:24.549 "uuid": "57f01504-27e2-40e1-9ade-e2563dcc2289", 00:21:24.549 "assigned_rate_limits": { 00:21:24.549 "rw_ios_per_sec": 0, 00:21:24.549 "rw_mbytes_per_sec": 0, 00:21:24.549 "r_mbytes_per_sec": 0, 00:21:24.549 "w_mbytes_per_sec": 0 00:21:24.549 }, 00:21:24.549 "claimed": false, 00:21:24.549 "zoned": false, 00:21:24.549 "supported_io_types": { 00:21:24.549 "read": true, 00:21:24.549 "write": true, 00:21:24.549 "unmap": true, 00:21:24.549 "flush": true, 00:21:24.549 "reset": true, 00:21:24.549 "nvme_admin": false, 00:21:24.549 "nvme_io": false, 00:21:24.549 "nvme_io_md": false, 00:21:24.549 "write_zeroes": true, 00:21:24.549 "zcopy": false, 00:21:24.549 "get_zone_info": false, 00:21:24.549 "zone_management": false, 00:21:24.549 "zone_append": false, 00:21:24.549 "compare": false, 00:21:24.549 "compare_and_write": false, 00:21:24.549 "abort": false, 00:21:24.549 "seek_hole": false, 00:21:24.549 "seek_data": false, 00:21:24.549 "copy": false, 00:21:24.549 "nvme_iov_md": false 00:21:24.549 }, 00:21:24.549 "memory_domains": [ 00:21:24.549 { 00:21:24.549 "dma_device_id": "system", 00:21:24.549 "dma_device_type": 1 00:21:24.549 }, 00:21:24.549 { 00:21:24.549 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:24.549 "dma_device_type": 2 00:21:24.549 }, 00:21:24.549 { 00:21:24.549 "dma_device_id": "system", 00:21:24.549 "dma_device_type": 1 00:21:24.549 }, 00:21:24.549 { 00:21:24.549 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:24.549 "dma_device_type": 2 00:21:24.549 }, 00:21:24.549 { 00:21:24.549 "dma_device_id": "system", 00:21:24.549 "dma_device_type": 1 00:21:24.549 }, 00:21:24.549 { 00:21:24.549 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:24.549 "dma_device_type": 2 00:21:24.549 }, 00:21:24.549 { 00:21:24.549 "dma_device_id": "system", 00:21:24.549 "dma_device_type": 1 00:21:24.549 }, 00:21:24.549 { 00:21:24.549 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:24.549 "dma_device_type": 2 00:21:24.549 } 00:21:24.549 ], 00:21:24.549 "driver_specific": { 00:21:24.549 "raid": { 00:21:24.549 "uuid": "57f01504-27e2-40e1-9ade-e2563dcc2289", 00:21:24.549 "strip_size_kb": 64, 00:21:24.549 "state": "online", 00:21:24.549 "raid_level": "concat", 00:21:24.549 "superblock": true, 00:21:24.549 "num_base_bdevs": 4, 00:21:24.549 "num_base_bdevs_discovered": 4, 00:21:24.549 "num_base_bdevs_operational": 4, 00:21:24.549 "base_bdevs_list": [ 00:21:24.549 { 00:21:24.549 "name": "pt1", 00:21:24.549 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:24.549 "is_configured": true, 00:21:24.549 "data_offset": 2048, 00:21:24.549 "data_size": 63488 00:21:24.549 }, 00:21:24.549 { 00:21:24.549 "name": "pt2", 00:21:24.549 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:24.549 "is_configured": true, 00:21:24.549 "data_offset": 2048, 00:21:24.549 "data_size": 63488 00:21:24.549 }, 00:21:24.549 { 00:21:24.549 "name": "pt3", 00:21:24.549 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:24.549 "is_configured": true, 00:21:24.549 "data_offset": 2048, 00:21:24.549 "data_size": 63488 00:21:24.549 }, 00:21:24.549 { 00:21:24.549 "name": "pt4", 00:21:24.549 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:24.549 "is_configured": true, 00:21:24.549 "data_offset": 2048, 00:21:24.549 "data_size": 63488 00:21:24.549 } 00:21:24.549 ] 00:21:24.549 } 00:21:24.549 } 00:21:24.549 }' 00:21:24.549 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- 
# jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:24.549 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:24.549 pt2 00:21:24.549 pt3 00:21:24.549 pt4' 00:21:24.550 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:24.550 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:24.550 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:24.808 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:24.808 "name": "pt1", 00:21:24.808 "aliases": [ 00:21:24.808 "00000000-0000-0000-0000-000000000001" 00:21:24.808 ], 00:21:24.808 "product_name": "passthru", 00:21:24.808 "block_size": 512, 00:21:24.808 "num_blocks": 65536, 00:21:24.808 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:24.808 "assigned_rate_limits": { 00:21:24.808 "rw_ios_per_sec": 0, 00:21:24.808 "rw_mbytes_per_sec": 0, 00:21:24.808 "r_mbytes_per_sec": 0, 00:21:24.808 "w_mbytes_per_sec": 0 00:21:24.808 }, 00:21:24.808 "claimed": true, 00:21:24.808 "claim_type": "exclusive_write", 00:21:24.808 "zoned": false, 00:21:24.808 "supported_io_types": { 00:21:24.808 "read": true, 00:21:24.808 "write": true, 00:21:24.808 "unmap": true, 00:21:24.808 "flush": true, 00:21:24.808 "reset": true, 00:21:24.808 "nvme_admin": false, 00:21:24.808 "nvme_io": false, 00:21:24.808 "nvme_io_md": false, 00:21:24.808 "write_zeroes": true, 00:21:24.808 "zcopy": true, 00:21:24.808 "get_zone_info": false, 00:21:24.808 "zone_management": false, 00:21:24.808 "zone_append": false, 00:21:24.808 "compare": false, 00:21:24.808 "compare_and_write": false, 00:21:24.808 "abort": true, 00:21:24.808 "seek_hole": false, 00:21:24.808 "seek_data": false, 00:21:24.808 "copy": true, 00:21:24.808 
"nvme_iov_md": false 00:21:24.808 }, 00:21:24.808 "memory_domains": [ 00:21:24.808 { 00:21:24.808 "dma_device_id": "system", 00:21:24.808 "dma_device_type": 1 00:21:24.808 }, 00:21:24.808 { 00:21:24.808 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:24.808 "dma_device_type": 2 00:21:24.808 } 00:21:24.808 ], 00:21:24.808 "driver_specific": { 00:21:24.808 "passthru": { 00:21:24.808 "name": "pt1", 00:21:24.808 "base_bdev_name": "malloc1" 00:21:24.808 } 00:21:24.808 } 00:21:24.808 }' 00:21:24.808 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:24.808 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:24.808 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:24.808 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:25.066 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:25.066 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:25.066 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:25.066 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:25.066 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:25.066 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:25.066 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:25.066 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:25.066 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:25.066 22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:25.066 
22:50:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:25.324 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:25.324 "name": "pt2", 00:21:25.324 "aliases": [ 00:21:25.324 "00000000-0000-0000-0000-000000000002" 00:21:25.324 ], 00:21:25.324 "product_name": "passthru", 00:21:25.324 "block_size": 512, 00:21:25.324 "num_blocks": 65536, 00:21:25.324 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:25.324 "assigned_rate_limits": { 00:21:25.324 "rw_ios_per_sec": 0, 00:21:25.324 "rw_mbytes_per_sec": 0, 00:21:25.324 "r_mbytes_per_sec": 0, 00:21:25.324 "w_mbytes_per_sec": 0 00:21:25.324 }, 00:21:25.324 "claimed": true, 00:21:25.324 "claim_type": "exclusive_write", 00:21:25.324 "zoned": false, 00:21:25.324 "supported_io_types": { 00:21:25.324 "read": true, 00:21:25.324 "write": true, 00:21:25.324 "unmap": true, 00:21:25.324 "flush": true, 00:21:25.324 "reset": true, 00:21:25.324 "nvme_admin": false, 00:21:25.324 "nvme_io": false, 00:21:25.324 "nvme_io_md": false, 00:21:25.324 "write_zeroes": true, 00:21:25.324 "zcopy": true, 00:21:25.324 "get_zone_info": false, 00:21:25.324 "zone_management": false, 00:21:25.324 "zone_append": false, 00:21:25.324 "compare": false, 00:21:25.324 "compare_and_write": false, 00:21:25.324 "abort": true, 00:21:25.324 "seek_hole": false, 00:21:25.324 "seek_data": false, 00:21:25.324 "copy": true, 00:21:25.324 "nvme_iov_md": false 00:21:25.324 }, 00:21:25.324 "memory_domains": [ 00:21:25.324 { 00:21:25.324 "dma_device_id": "system", 00:21:25.324 "dma_device_type": 1 00:21:25.324 }, 00:21:25.324 { 00:21:25.324 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:25.324 "dma_device_type": 2 00:21:25.324 } 00:21:25.324 ], 00:21:25.324 "driver_specific": { 00:21:25.324 "passthru": { 00:21:25.324 "name": "pt2", 00:21:25.324 "base_bdev_name": "malloc2" 00:21:25.324 } 00:21:25.324 } 00:21:25.324 }' 00:21:25.324 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:21:25.324 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:25.324 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:25.324 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:25.582 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:25.582 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:25.582 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:25.582 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:25.582 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:25.582 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:25.582 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:25.582 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:25.582 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:25.582 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:25.582 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:25.840 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:25.840 "name": "pt3", 00:21:25.840 "aliases": [ 00:21:25.840 "00000000-0000-0000-0000-000000000003" 00:21:25.840 ], 00:21:25.840 "product_name": "passthru", 00:21:25.840 "block_size": 512, 00:21:25.840 "num_blocks": 65536, 00:21:25.840 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:25.840 "assigned_rate_limits": { 00:21:25.840 "rw_ios_per_sec": 0, 00:21:25.840 "rw_mbytes_per_sec": 0, 
00:21:25.840 "r_mbytes_per_sec": 0, 00:21:25.840 "w_mbytes_per_sec": 0 00:21:25.840 }, 00:21:25.840 "claimed": true, 00:21:25.840 "claim_type": "exclusive_write", 00:21:25.840 "zoned": false, 00:21:25.840 "supported_io_types": { 00:21:25.840 "read": true, 00:21:25.840 "write": true, 00:21:25.840 "unmap": true, 00:21:25.840 "flush": true, 00:21:25.840 "reset": true, 00:21:25.840 "nvme_admin": false, 00:21:25.840 "nvme_io": false, 00:21:25.840 "nvme_io_md": false, 00:21:25.840 "write_zeroes": true, 00:21:25.840 "zcopy": true, 00:21:25.840 "get_zone_info": false, 00:21:25.840 "zone_management": false, 00:21:25.840 "zone_append": false, 00:21:25.840 "compare": false, 00:21:25.840 "compare_and_write": false, 00:21:25.840 "abort": true, 00:21:25.840 "seek_hole": false, 00:21:25.840 "seek_data": false, 00:21:25.840 "copy": true, 00:21:25.840 "nvme_iov_md": false 00:21:25.840 }, 00:21:25.840 "memory_domains": [ 00:21:25.840 { 00:21:25.840 "dma_device_id": "system", 00:21:25.840 "dma_device_type": 1 00:21:25.840 }, 00:21:25.840 { 00:21:25.840 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:25.840 "dma_device_type": 2 00:21:25.840 } 00:21:25.840 ], 00:21:25.840 "driver_specific": { 00:21:25.840 "passthru": { 00:21:25.840 "name": "pt3", 00:21:25.840 "base_bdev_name": "malloc3" 00:21:25.840 } 00:21:25.840 } 00:21:25.840 }' 00:21:25.840 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:25.840 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:25.840 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:25.840 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:25.840 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:26.098 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:26.098 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 
-- # jq .md_interleave 00:21:26.098 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:26.098 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:26.098 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:26.098 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:26.099 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:26.099 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:26.099 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:26.099 22:50:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:21:26.356 22:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:26.356 "name": "pt4", 00:21:26.356 "aliases": [ 00:21:26.356 "00000000-0000-0000-0000-000000000004" 00:21:26.356 ], 00:21:26.356 "product_name": "passthru", 00:21:26.356 "block_size": 512, 00:21:26.356 "num_blocks": 65536, 00:21:26.356 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:26.356 "assigned_rate_limits": { 00:21:26.356 "rw_ios_per_sec": 0, 00:21:26.356 "rw_mbytes_per_sec": 0, 00:21:26.356 "r_mbytes_per_sec": 0, 00:21:26.356 "w_mbytes_per_sec": 0 00:21:26.356 }, 00:21:26.356 "claimed": true, 00:21:26.356 "claim_type": "exclusive_write", 00:21:26.356 "zoned": false, 00:21:26.356 "supported_io_types": { 00:21:26.356 "read": true, 00:21:26.356 "write": true, 00:21:26.356 "unmap": true, 00:21:26.356 "flush": true, 00:21:26.356 "reset": true, 00:21:26.356 "nvme_admin": false, 00:21:26.356 "nvme_io": false, 00:21:26.356 "nvme_io_md": false, 00:21:26.356 "write_zeroes": true, 00:21:26.356 "zcopy": true, 00:21:26.356 "get_zone_info": false, 00:21:26.356 
"zone_management": false, 00:21:26.356 "zone_append": false, 00:21:26.356 "compare": false, 00:21:26.356 "compare_and_write": false, 00:21:26.356 "abort": true, 00:21:26.356 "seek_hole": false, 00:21:26.356 "seek_data": false, 00:21:26.356 "copy": true, 00:21:26.356 "nvme_iov_md": false 00:21:26.356 }, 00:21:26.356 "memory_domains": [ 00:21:26.356 { 00:21:26.356 "dma_device_id": "system", 00:21:26.356 "dma_device_type": 1 00:21:26.356 }, 00:21:26.356 { 00:21:26.356 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:26.356 "dma_device_type": 2 00:21:26.356 } 00:21:26.356 ], 00:21:26.356 "driver_specific": { 00:21:26.356 "passthru": { 00:21:26.356 "name": "pt4", 00:21:26.356 "base_bdev_name": "malloc4" 00:21:26.356 } 00:21:26.356 } 00:21:26.356 }' 00:21:26.356 22:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:26.356 22:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:26.614 22:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:26.614 22:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:26.614 22:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:26.614 22:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:26.614 22:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:26.614 22:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:26.614 22:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:26.614 22:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:26.614 22:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:26.872 22:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:26.872 22:50:11 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:26.872 22:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:21:26.872 [2024-07-15 22:50:11.770606] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:27.130 22:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 57f01504-27e2-40e1-9ade-e2563dcc2289 '!=' 57f01504-27e2-40e1-9ade-e2563dcc2289 ']' 00:21:27.130 22:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:21:27.130 22:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:27.130 22:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:27.130 22:50:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2786710 00:21:27.130 22:50:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2786710 ']' 00:21:27.130 22:50:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2786710 00:21:27.130 22:50:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:21:27.130 22:50:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:27.130 22:50:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2786710 00:21:27.130 22:50:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:27.130 22:50:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:27.130 22:50:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2786710' 00:21:27.130 killing process with pid 2786710 00:21:27.130 22:50:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2786710 
00:21:27.130 [2024-07-15 22:50:11.839607] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:27.130 [2024-07-15 22:50:11.839674] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:27.130 [2024-07-15 22:50:11.839736] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:27.130 [2024-07-15 22:50:11.839748] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cd78f0 name raid_bdev1, state offline 00:21:27.130 22:50:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2786710 00:21:27.130 [2024-07-15 22:50:11.883345] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:27.389 22:50:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:21:27.389 00:21:27.389 real 0m16.816s 00:21:27.389 user 0m30.262s 00:21:27.389 sys 0m3.064s 00:21:27.389 22:50:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:27.389 22:50:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:27.389 ************************************ 00:21:27.389 END TEST raid_superblock_test 00:21:27.389 ************************************ 00:21:27.389 22:50:12 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:27.389 22:50:12 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:21:27.389 22:50:12 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:27.389 22:50:12 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:27.389 22:50:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:27.389 ************************************ 00:21:27.389 START TEST raid_read_error_test 00:21:27.389 ************************************ 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 4 read 00:21:27.390 
22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:21:27.390 22:50:12 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.wMTsPA5CMI 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2789185 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2789185 /var/tmp/spdk-raid.sock 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2789185 ']' 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and 
listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:27.390 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:27.390 22:50:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:27.390 [2024-07-15 22:50:12.282274] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:21:27.390 [2024-07-15 22:50:12.282348] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2789185 ] 00:21:27.649 [2024-07-15 22:50:12.414432] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:27.649 [2024-07-15 22:50:12.523732] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:27.908 [2024-07-15 22:50:12.584780] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:27.908 [2024-07-15 22:50:12.584809] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:28.473 22:50:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:28.473 22:50:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:21:28.473 22:50:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:28.473 22:50:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:28.732 BaseBdev1_malloc 00:21:28.732 22:50:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 
00:21:28.990 true 00:21:28.990 22:50:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:21:29.248 [2024-07-15 22:50:13.970413] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:21:29.248 [2024-07-15 22:50:13.970458] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:29.248 [2024-07-15 22:50:13.970478] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19f20d0 00:21:29.248 [2024-07-15 22:50:13.970491] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:29.248 [2024-07-15 22:50:13.972403] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:29.248 [2024-07-15 22:50:13.972435] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:29.248 BaseBdev1 00:21:29.248 22:50:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:29.248 22:50:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:29.542 BaseBdev2_malloc 00:21:29.542 22:50:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:21:29.813 true 00:21:29.813 22:50:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:21:29.813 [2024-07-15 22:50:14.710177] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:21:29.813 [2024-07-15 22:50:14.710225] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:29.813 [2024-07-15 22:50:14.710246] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19f6910 00:21:29.813 [2024-07-15 22:50:14.710259] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:29.813 [2024-07-15 22:50:14.711853] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:29.813 [2024-07-15 22:50:14.711882] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:29.813 BaseBdev2 00:21:30.070 22:50:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:30.070 22:50:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:30.070 BaseBdev3_malloc 00:21:30.330 22:50:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:21:30.330 true 00:21:30.330 22:50:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:21:30.590 [2024-07-15 22:50:15.444975] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:21:30.590 [2024-07-15 22:50:15.445022] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:30.590 [2024-07-15 22:50:15.445042] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19f8bd0 00:21:30.590 [2024-07-15 22:50:15.445055] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:30.590 [2024-07-15 22:50:15.446614] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:21:30.590 [2024-07-15 22:50:15.446644] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:30.590 BaseBdev3 00:21:30.590 22:50:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:30.590 22:50:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:30.849 BaseBdev4_malloc 00:21:30.849 22:50:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:21:31.108 true 00:21:31.108 22:50:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:21:31.367 [2024-07-15 22:50:16.168705] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:21:31.367 [2024-07-15 22:50:16.168750] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:31.367 [2024-07-15 22:50:16.168772] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19f9aa0 00:21:31.367 [2024-07-15 22:50:16.168785] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:31.367 [2024-07-15 22:50:16.170397] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:31.367 [2024-07-15 22:50:16.170426] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:31.367 BaseBdev4 00:21:31.367 22:50:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 
BaseBdev4' -n raid_bdev1 -s 00:21:31.626 [2024-07-15 22:50:16.401371] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:31.626 [2024-07-15 22:50:16.402721] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:31.626 [2024-07-15 22:50:16.402793] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:31.626 [2024-07-15 22:50:16.402854] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:31.626 [2024-07-15 22:50:16.403103] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19f3c20 00:21:31.626 [2024-07-15 22:50:16.403114] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:31.626 [2024-07-15 22:50:16.403317] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1848260 00:21:31.626 [2024-07-15 22:50:16.403473] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19f3c20 00:21:31.626 [2024-07-15 22:50:16.403483] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19f3c20 00:21:31.626 [2024-07-15 22:50:16.403593] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:31.626 22:50:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:31.626 22:50:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:31.626 22:50:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:31.626 22:50:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:31.626 22:50:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:31.626 22:50:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:31.626 22:50:16 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:31.626 22:50:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:31.626 22:50:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:31.626 22:50:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:31.626 22:50:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:31.626 22:50:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:31.886 22:50:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:31.886 "name": "raid_bdev1", 00:21:31.886 "uuid": "a1588457-fdd2-41f0-9aaa-030837b8ab7b", 00:21:31.886 "strip_size_kb": 64, 00:21:31.886 "state": "online", 00:21:31.886 "raid_level": "concat", 00:21:31.886 "superblock": true, 00:21:31.886 "num_base_bdevs": 4, 00:21:31.886 "num_base_bdevs_discovered": 4, 00:21:31.886 "num_base_bdevs_operational": 4, 00:21:31.886 "base_bdevs_list": [ 00:21:31.886 { 00:21:31.886 "name": "BaseBdev1", 00:21:31.886 "uuid": "89b1f0df-4683-58b7-bfd6-8fbdbe233056", 00:21:31.886 "is_configured": true, 00:21:31.886 "data_offset": 2048, 00:21:31.886 "data_size": 63488 00:21:31.886 }, 00:21:31.886 { 00:21:31.886 "name": "BaseBdev2", 00:21:31.886 "uuid": "8b27df8e-8ce9-5ad6-b470-6473577b4c0f", 00:21:31.886 "is_configured": true, 00:21:31.886 "data_offset": 2048, 00:21:31.886 "data_size": 63488 00:21:31.886 }, 00:21:31.886 { 00:21:31.886 "name": "BaseBdev3", 00:21:31.886 "uuid": "fd21e357-1d80-54d8-9434-0650d22d36fa", 00:21:31.886 "is_configured": true, 00:21:31.886 "data_offset": 2048, 00:21:31.886 "data_size": 63488 00:21:31.886 }, 00:21:31.886 { 00:21:31.886 "name": "BaseBdev4", 00:21:31.886 "uuid": 
"d5eb574c-8004-552c-a972-c19a373cf7ab", 00:21:31.886 "is_configured": true, 00:21:31.886 "data_offset": 2048, 00:21:31.886 "data_size": 63488 00:21:31.886 } 00:21:31.886 ] 00:21:31.886 }' 00:21:31.886 22:50:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:31.886 22:50:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:32.453 22:50:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:21:32.453 22:50:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:32.712 [2024-07-15 22:50:17.372229] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19e5fc0 00:21:33.650 22:50:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:21:33.650 22:50:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:21:33.650 22:50:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:21:33.650 22:50:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:21:33.650 22:50:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:33.650 22:50:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:33.650 22:50:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:33.650 22:50:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:33.650 22:50:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:33.650 22:50:18 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:33.650 22:50:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:33.650 22:50:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:33.650 22:50:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:33.650 22:50:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:33.650 22:50:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:33.650 22:50:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:33.909 22:50:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:33.909 "name": "raid_bdev1", 00:21:33.909 "uuid": "a1588457-fdd2-41f0-9aaa-030837b8ab7b", 00:21:33.909 "strip_size_kb": 64, 00:21:33.909 "state": "online", 00:21:33.909 "raid_level": "concat", 00:21:33.909 "superblock": true, 00:21:33.909 "num_base_bdevs": 4, 00:21:33.909 "num_base_bdevs_discovered": 4, 00:21:33.909 "num_base_bdevs_operational": 4, 00:21:33.909 "base_bdevs_list": [ 00:21:33.909 { 00:21:33.909 "name": "BaseBdev1", 00:21:33.909 "uuid": "89b1f0df-4683-58b7-bfd6-8fbdbe233056", 00:21:33.909 "is_configured": true, 00:21:33.909 "data_offset": 2048, 00:21:33.909 "data_size": 63488 00:21:33.909 }, 00:21:33.909 { 00:21:33.909 "name": "BaseBdev2", 00:21:33.909 "uuid": "8b27df8e-8ce9-5ad6-b470-6473577b4c0f", 00:21:33.909 "is_configured": true, 00:21:33.909 "data_offset": 2048, 00:21:33.909 "data_size": 63488 00:21:33.909 }, 00:21:33.909 { 00:21:33.909 "name": "BaseBdev3", 00:21:33.909 "uuid": "fd21e357-1d80-54d8-9434-0650d22d36fa", 00:21:33.909 "is_configured": true, 00:21:33.909 "data_offset": 2048, 00:21:33.909 "data_size": 63488 00:21:33.909 }, 00:21:33.909 { 
00:21:33.909 "name": "BaseBdev4", 00:21:33.909 "uuid": "d5eb574c-8004-552c-a972-c19a373cf7ab", 00:21:33.909 "is_configured": true, 00:21:33.909 "data_offset": 2048, 00:21:33.909 "data_size": 63488 00:21:33.909 } 00:21:33.909 ] 00:21:33.909 }' 00:21:33.909 22:50:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:33.909 22:50:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:34.478 22:50:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:34.737 [2024-07-15 22:50:19.605598] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:34.737 [2024-07-15 22:50:19.605640] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:34.737 [2024-07-15 22:50:19.608808] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:34.737 [2024-07-15 22:50:19.608848] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:34.737 [2024-07-15 22:50:19.608889] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:34.737 [2024-07-15 22:50:19.608900] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19f3c20 name raid_bdev1, state offline 00:21:34.737 0 00:21:34.737 22:50:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2789185 00:21:34.737 22:50:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2789185 ']' 00:21:34.737 22:50:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2789185 00:21:34.737 22:50:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:21:34.737 22:50:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:34.737 22:50:19 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2789185 00:21:34.997 22:50:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:34.997 22:50:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:34.997 22:50:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2789185' 00:21:34.997 killing process with pid 2789185 00:21:34.997 22:50:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2789185 00:21:34.997 [2024-07-15 22:50:19.680247] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:34.997 22:50:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2789185 00:21:34.997 [2024-07-15 22:50:19.712816] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:35.256 22:50:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.wMTsPA5CMI 00:21:35.256 22:50:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:21:35.256 22:50:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:21:35.256 22:50:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:21:35.256 22:50:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:21:35.256 22:50:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:35.256 22:50:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:35.256 22:50:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:21:35.256 00:21:35.256 real 0m7.752s 00:21:35.256 user 0m12.394s 00:21:35.256 sys 0m1.405s 00:21:35.256 22:50:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:35.256 22:50:19 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:21:35.256 ************************************ 00:21:35.256 END TEST raid_read_error_test 00:21:35.256 ************************************ 00:21:35.256 22:50:19 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:35.256 22:50:19 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:21:35.256 22:50:19 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:35.256 22:50:19 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:35.257 22:50:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:35.257 ************************************ 00:21:35.257 START TEST raid_write_error_test 00:21:35.257 ************************************ 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 4 write 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.0gUWOcsM4i 00:21:35.257 22:50:20 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2790292 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2790292 /var/tmp/spdk-raid.sock 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2790292 ']' 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:35.257 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:35.257 22:50:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:35.257 [2024-07-15 22:50:20.165127] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:21:35.257 [2024-07-15 22:50:20.165270] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2790292 ] 00:21:35.516 [2024-07-15 22:50:20.361372] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:35.775 [2024-07-15 22:50:20.468051] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:35.775 [2024-07-15 22:50:20.524524] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:35.775 [2024-07-15 22:50:20.524555] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:35.775 22:50:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:35.775 22:50:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:21:35.775 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:35.775 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:36.035 BaseBdev1_malloc 00:21:36.035 22:50:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:21:36.293 true 00:21:36.294 22:50:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:21:36.553 [2024-07-15 22:50:21.295313] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:21:36.553 [2024-07-15 22:50:21.295358] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:21:36.553 [2024-07-15 22:50:21.295382] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19140d0 00:21:36.553 [2024-07-15 22:50:21.295395] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:36.553 [2024-07-15 22:50:21.297304] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:36.553 [2024-07-15 22:50:21.297338] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:36.553 BaseBdev1 00:21:36.553 22:50:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:36.553 22:50:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:36.812 BaseBdev2_malloc 00:21:36.812 22:50:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:21:37.071 true 00:21:37.071 22:50:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:21:37.330 [2024-07-15 22:50:22.031102] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:21:37.330 [2024-07-15 22:50:22.031147] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:37.330 [2024-07-15 22:50:22.031169] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1918910 00:21:37.330 [2024-07-15 22:50:22.031182] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:37.330 [2024-07-15 22:50:22.032733] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:37.331 [2024-07-15 22:50:22.032764] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:37.331 BaseBdev2 00:21:37.331 22:50:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:37.331 22:50:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:37.590 BaseBdev3_malloc 00:21:37.590 22:50:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:21:37.848 true 00:21:37.848 22:50:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:21:38.107 [2024-07-15 22:50:22.769606] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:21:38.107 [2024-07-15 22:50:22.769651] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:38.107 [2024-07-15 22:50:22.769674] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x191abd0 00:21:38.107 [2024-07-15 22:50:22.769687] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:38.107 [2024-07-15 22:50:22.771294] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:38.107 [2024-07-15 22:50:22.771324] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:38.107 BaseBdev3 00:21:38.107 22:50:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:38.107 22:50:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:38.107 BaseBdev4_malloc 00:21:38.365 22:50:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:21:38.365 true 00:21:38.365 22:50:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:21:38.623 [2024-07-15 22:50:23.497307] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:21:38.623 [2024-07-15 22:50:23.497354] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:38.623 [2024-07-15 22:50:23.497378] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x191baa0 00:21:38.623 [2024-07-15 22:50:23.497391] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:38.623 [2024-07-15 22:50:23.499007] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:38.623 [2024-07-15 22:50:23.499037] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:38.623 BaseBdev4 00:21:38.623 22:50:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:21:38.881 [2024-07-15 22:50:23.738000] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:38.882 [2024-07-15 22:50:23.739321] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:38.882 [2024-07-15 22:50:23.739389] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:38.882 [2024-07-15 22:50:23.739450] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:38.882 [2024-07-15 22:50:23.739679] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1915c20 00:21:38.882 [2024-07-15 22:50:23.739690] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:38.882 [2024-07-15 22:50:23.739895] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x176a260 00:21:38.882 [2024-07-15 22:50:23.740052] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1915c20 00:21:38.882 [2024-07-15 22:50:23.740063] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1915c20 00:21:38.882 [2024-07-15 22:50:23.740168] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:38.882 22:50:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:38.882 22:50:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:38.882 22:50:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:38.882 22:50:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:38.882 22:50:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:38.882 22:50:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:38.882 22:50:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:38.882 22:50:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:38.882 22:50:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:38.882 22:50:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:38.882 22:50:23 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:38.882 22:50:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:39.140 22:50:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:39.140 "name": "raid_bdev1", 00:21:39.140 "uuid": "3a3879c2-2e4a-4796-a536-f174f299241e", 00:21:39.140 "strip_size_kb": 64, 00:21:39.140 "state": "online", 00:21:39.140 "raid_level": "concat", 00:21:39.140 "superblock": true, 00:21:39.140 "num_base_bdevs": 4, 00:21:39.140 "num_base_bdevs_discovered": 4, 00:21:39.140 "num_base_bdevs_operational": 4, 00:21:39.140 "base_bdevs_list": [ 00:21:39.140 { 00:21:39.140 "name": "BaseBdev1", 00:21:39.140 "uuid": "aaa507a8-f3a7-5de3-b831-d48c97e20c1a", 00:21:39.140 "is_configured": true, 00:21:39.140 "data_offset": 2048, 00:21:39.140 "data_size": 63488 00:21:39.140 }, 00:21:39.140 { 00:21:39.140 "name": "BaseBdev2", 00:21:39.140 "uuid": "0b024b31-8de1-5525-aca6-da8cda7dec3b", 00:21:39.140 "is_configured": true, 00:21:39.140 "data_offset": 2048, 00:21:39.140 "data_size": 63488 00:21:39.140 }, 00:21:39.140 { 00:21:39.140 "name": "BaseBdev3", 00:21:39.140 "uuid": "b8179863-af0a-5e87-8ba8-8043720fbe53", 00:21:39.140 "is_configured": true, 00:21:39.140 "data_offset": 2048, 00:21:39.140 "data_size": 63488 00:21:39.140 }, 00:21:39.140 { 00:21:39.140 "name": "BaseBdev4", 00:21:39.140 "uuid": "f8c67edf-fc8f-5515-b3a4-6f3a3c0ed181", 00:21:39.140 "is_configured": true, 00:21:39.140 "data_offset": 2048, 00:21:39.140 "data_size": 63488 00:21:39.140 } 00:21:39.140 ] 00:21:39.140 }' 00:21:39.140 22:50:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:39.140 22:50:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:39.707 22:50:24 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:21:39.707 22:50:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:39.966 [2024-07-15 22:50:24.704840] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1907fc0 00:21:40.904 22:50:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:21:41.163 22:50:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:21:41.163 22:50:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:21:41.163 22:50:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:21:41.163 22:50:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:41.163 22:50:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:41.163 22:50:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:41.163 22:50:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:41.163 22:50:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:41.163 22:50:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:41.163 22:50:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:41.163 22:50:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:41.163 22:50:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:41.163 22:50:25 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:41.163 22:50:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.163 22:50:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:41.421 22:50:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:41.421 "name": "raid_bdev1", 00:21:41.421 "uuid": "3a3879c2-2e4a-4796-a536-f174f299241e", 00:21:41.421 "strip_size_kb": 64, 00:21:41.421 "state": "online", 00:21:41.421 "raid_level": "concat", 00:21:41.421 "superblock": true, 00:21:41.421 "num_base_bdevs": 4, 00:21:41.421 "num_base_bdevs_discovered": 4, 00:21:41.421 "num_base_bdevs_operational": 4, 00:21:41.421 "base_bdevs_list": [ 00:21:41.421 { 00:21:41.421 "name": "BaseBdev1", 00:21:41.421 "uuid": "aaa507a8-f3a7-5de3-b831-d48c97e20c1a", 00:21:41.421 "is_configured": true, 00:21:41.421 "data_offset": 2048, 00:21:41.421 "data_size": 63488 00:21:41.421 }, 00:21:41.421 { 00:21:41.421 "name": "BaseBdev2", 00:21:41.421 "uuid": "0b024b31-8de1-5525-aca6-da8cda7dec3b", 00:21:41.421 "is_configured": true, 00:21:41.421 "data_offset": 2048, 00:21:41.421 "data_size": 63488 00:21:41.421 }, 00:21:41.421 { 00:21:41.421 "name": "BaseBdev3", 00:21:41.421 "uuid": "b8179863-af0a-5e87-8ba8-8043720fbe53", 00:21:41.421 "is_configured": true, 00:21:41.421 "data_offset": 2048, 00:21:41.421 "data_size": 63488 00:21:41.421 }, 00:21:41.421 { 00:21:41.421 "name": "BaseBdev4", 00:21:41.421 "uuid": "f8c67edf-fc8f-5515-b3a4-6f3a3c0ed181", 00:21:41.421 "is_configured": true, 00:21:41.421 "data_offset": 2048, 00:21:41.421 "data_size": 63488 00:21:41.421 } 00:21:41.421 ] 00:21:41.421 }' 00:21:41.421 22:50:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:41.421 22:50:26 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:21:41.987 22:50:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:42.246 [2024-07-15 22:50:26.897397] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:42.246 [2024-07-15 22:50:26.897430] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:42.246 [2024-07-15 22:50:26.900595] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:42.246 [2024-07-15 22:50:26.900634] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:42.246 [2024-07-15 22:50:26.900675] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:42.246 [2024-07-15 22:50:26.900686] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1915c20 name raid_bdev1, state offline 00:21:42.246 0 00:21:42.246 22:50:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2790292 00:21:42.246 22:50:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2790292 ']' 00:21:42.246 22:50:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2790292 00:21:42.246 22:50:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:21:42.246 22:50:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:42.246 22:50:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2790292 00:21:42.246 22:50:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:42.246 22:50:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:42.246 22:50:26 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 2790292' 00:21:42.246 killing process with pid 2790292 00:21:42.246 22:50:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2790292 00:21:42.246 [2024-07-15 22:50:26.971229] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:42.246 22:50:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2790292 00:21:42.246 [2024-07-15 22:50:27.003281] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:42.505 22:50:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.0gUWOcsM4i 00:21:42.505 22:50:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:21:42.505 22:50:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:21:42.505 22:50:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:21:42.505 22:50:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:21:42.505 22:50:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:42.505 22:50:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:42.505 22:50:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:21:42.505 00:21:42.505 real 0m7.198s 00:21:42.505 user 0m11.756s 00:21:42.505 sys 0m1.425s 00:21:42.505 22:50:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:42.505 22:50:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:42.505 ************************************ 00:21:42.505 END TEST raid_write_error_test 00:21:42.505 ************************************ 00:21:42.505 22:50:27 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:42.505 22:50:27 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:21:42.505 
22:50:27 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:21:42.505 22:50:27 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:42.505 22:50:27 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:42.505 22:50:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:42.505 ************************************ 00:21:42.505 START TEST raid_state_function_test 00:21:42.505 ************************************ 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 false 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2791317 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2791317' 00:21:42.505 Process raid pid: 2791317 00:21:42.505 22:50:27 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2791317 /var/tmp/spdk-raid.sock 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2791317 ']' 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:42.505 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:42.505 22:50:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:42.505 [2024-07-15 22:50:27.393655] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:21:42.505 [2024-07-15 22:50:27.393726] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:42.764 [2024-07-15 22:50:27.525641] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:42.764 [2024-07-15 22:50:27.636512] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:43.022 [2024-07-15 22:50:27.700456] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:43.022 [2024-07-15 22:50:27.700483] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:43.595 22:50:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:43.595 22:50:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:21:43.595 22:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:43.892 [2024-07-15 22:50:28.567082] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:43.892 [2024-07-15 22:50:28.567123] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:43.892 [2024-07-15 22:50:28.567133] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:43.892 [2024-07-15 22:50:28.567145] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:43.892 [2024-07-15 22:50:28.567154] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:43.892 [2024-07-15 22:50:28.567165] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:43.892 
[2024-07-15 22:50:28.567174] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:43.892 [2024-07-15 22:50:28.567185] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:43.892 22:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:43.892 22:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:43.892 22:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:43.892 22:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:43.892 22:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:43.892 22:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:43.892 22:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:43.892 22:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:43.892 22:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:43.892 22:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:43.892 22:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:43.892 22:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:44.166 22:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:44.166 "name": "Existed_Raid", 00:21:44.166 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:44.166 "strip_size_kb": 0, 00:21:44.166 "state": 
"configuring", 00:21:44.166 "raid_level": "raid1", 00:21:44.166 "superblock": false, 00:21:44.166 "num_base_bdevs": 4, 00:21:44.166 "num_base_bdevs_discovered": 0, 00:21:44.166 "num_base_bdevs_operational": 4, 00:21:44.166 "base_bdevs_list": [ 00:21:44.166 { 00:21:44.166 "name": "BaseBdev1", 00:21:44.166 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:44.166 "is_configured": false, 00:21:44.166 "data_offset": 0, 00:21:44.166 "data_size": 0 00:21:44.166 }, 00:21:44.166 { 00:21:44.166 "name": "BaseBdev2", 00:21:44.166 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:44.166 "is_configured": false, 00:21:44.166 "data_offset": 0, 00:21:44.166 "data_size": 0 00:21:44.166 }, 00:21:44.166 { 00:21:44.166 "name": "BaseBdev3", 00:21:44.166 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:44.166 "is_configured": false, 00:21:44.166 "data_offset": 0, 00:21:44.166 "data_size": 0 00:21:44.166 }, 00:21:44.166 { 00:21:44.166 "name": "BaseBdev4", 00:21:44.166 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:44.166 "is_configured": false, 00:21:44.167 "data_offset": 0, 00:21:44.167 "data_size": 0 00:21:44.167 } 00:21:44.167 ] 00:21:44.167 }' 00:21:44.167 22:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:44.167 22:50:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:44.734 22:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:44.992 [2024-07-15 22:50:29.657810] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:44.992 [2024-07-15 22:50:29.657840] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2563aa0 name Existed_Raid, state configuring 00:21:44.992 22:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:44.992 [2024-07-15 22:50:29.898468] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:44.992 [2024-07-15 22:50:29.898496] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:44.992 [2024-07-15 22:50:29.898505] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:44.992 [2024-07-15 22:50:29.898516] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:44.992 [2024-07-15 22:50:29.898525] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:44.992 [2024-07-15 22:50:29.898536] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:44.992 [2024-07-15 22:50:29.898545] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:44.992 [2024-07-15 22:50:29.898556] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:45.250 22:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:45.250 [2024-07-15 22:50:30.149003] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:45.250 BaseBdev1 00:21:45.509 22:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:45.509 22:50:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:45.509 22:50:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:45.509 22:50:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:45.509 22:50:30 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:45.509 22:50:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:45.509 22:50:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:45.767 22:50:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:45.767 [ 00:21:45.767 { 00:21:45.767 "name": "BaseBdev1", 00:21:45.767 "aliases": [ 00:21:45.767 "c7920377-d305-4b8f-833f-f0839d1fa799" 00:21:45.767 ], 00:21:45.767 "product_name": "Malloc disk", 00:21:45.767 "block_size": 512, 00:21:45.767 "num_blocks": 65536, 00:21:45.767 "uuid": "c7920377-d305-4b8f-833f-f0839d1fa799", 00:21:45.767 "assigned_rate_limits": { 00:21:45.767 "rw_ios_per_sec": 0, 00:21:45.767 "rw_mbytes_per_sec": 0, 00:21:45.767 "r_mbytes_per_sec": 0, 00:21:45.767 "w_mbytes_per_sec": 0 00:21:45.767 }, 00:21:45.767 "claimed": true, 00:21:45.767 "claim_type": "exclusive_write", 00:21:45.767 "zoned": false, 00:21:45.767 "supported_io_types": { 00:21:45.767 "read": true, 00:21:45.767 "write": true, 00:21:45.767 "unmap": true, 00:21:45.767 "flush": true, 00:21:45.767 "reset": true, 00:21:45.767 "nvme_admin": false, 00:21:45.767 "nvme_io": false, 00:21:45.767 "nvme_io_md": false, 00:21:45.767 "write_zeroes": true, 00:21:45.767 "zcopy": true, 00:21:45.767 "get_zone_info": false, 00:21:45.767 "zone_management": false, 00:21:45.767 "zone_append": false, 00:21:45.767 "compare": false, 00:21:45.767 "compare_and_write": false, 00:21:45.767 "abort": true, 00:21:45.767 "seek_hole": false, 00:21:45.767 "seek_data": false, 00:21:45.767 "copy": true, 00:21:45.767 "nvme_iov_md": false 00:21:45.767 }, 00:21:45.767 "memory_domains": [ 00:21:45.767 { 
00:21:45.767 "dma_device_id": "system", 00:21:45.767 "dma_device_type": 1 00:21:45.767 }, 00:21:45.767 { 00:21:45.767 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:45.767 "dma_device_type": 2 00:21:45.767 } 00:21:45.767 ], 00:21:45.767 "driver_specific": {} 00:21:45.767 } 00:21:45.767 ] 00:21:45.767 22:50:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:45.767 22:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:45.767 22:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:45.767 22:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:45.767 22:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:45.767 22:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:45.767 22:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:45.767 22:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:45.767 22:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:45.767 22:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:45.767 22:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:45.767 22:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:45.767 22:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:46.025 22:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:21:46.025 "name": "Existed_Raid", 00:21:46.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:46.025 "strip_size_kb": 0, 00:21:46.025 "state": "configuring", 00:21:46.025 "raid_level": "raid1", 00:21:46.025 "superblock": false, 00:21:46.025 "num_base_bdevs": 4, 00:21:46.025 "num_base_bdevs_discovered": 1, 00:21:46.025 "num_base_bdevs_operational": 4, 00:21:46.025 "base_bdevs_list": [ 00:21:46.025 { 00:21:46.025 "name": "BaseBdev1", 00:21:46.025 "uuid": "c7920377-d305-4b8f-833f-f0839d1fa799", 00:21:46.025 "is_configured": true, 00:21:46.025 "data_offset": 0, 00:21:46.025 "data_size": 65536 00:21:46.025 }, 00:21:46.025 { 00:21:46.025 "name": "BaseBdev2", 00:21:46.026 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:46.026 "is_configured": false, 00:21:46.026 "data_offset": 0, 00:21:46.026 "data_size": 0 00:21:46.026 }, 00:21:46.026 { 00:21:46.026 "name": "BaseBdev3", 00:21:46.026 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:46.026 "is_configured": false, 00:21:46.026 "data_offset": 0, 00:21:46.026 "data_size": 0 00:21:46.026 }, 00:21:46.026 { 00:21:46.026 "name": "BaseBdev4", 00:21:46.026 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:46.026 "is_configured": false, 00:21:46.026 "data_offset": 0, 00:21:46.026 "data_size": 0 00:21:46.026 } 00:21:46.026 ] 00:21:46.026 }' 00:21:46.026 22:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:46.026 22:50:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:46.962 22:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:46.962 [2024-07-15 22:50:31.657005] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:46.962 [2024-07-15 22:50:31.657046] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2563310 name Existed_Raid, state configuring 
00:21:46.962 22:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:47.221 [2024-07-15 22:50:31.905689] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:47.221 [2024-07-15 22:50:31.907143] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:47.221 [2024-07-15 22:50:31.907176] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:47.221 [2024-07-15 22:50:31.907187] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:47.221 [2024-07-15 22:50:31.907199] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:47.221 [2024-07-15 22:50:31.907207] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:47.221 [2024-07-15 22:50:31.907219] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:47.221 22:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:47.221 22:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:47.221 22:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:47.221 22:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:47.221 22:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:47.221 22:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:47.221 22:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:21:47.221 22:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:47.221 22:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:47.221 22:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:47.221 22:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:47.221 22:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:47.221 22:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.221 22:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:47.221 22:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:47.221 "name": "Existed_Raid", 00:21:47.221 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:47.221 "strip_size_kb": 0, 00:21:47.221 "state": "configuring", 00:21:47.221 "raid_level": "raid1", 00:21:47.221 "superblock": false, 00:21:47.221 "num_base_bdevs": 4, 00:21:47.221 "num_base_bdevs_discovered": 1, 00:21:47.221 "num_base_bdevs_operational": 4, 00:21:47.221 "base_bdevs_list": [ 00:21:47.221 { 00:21:47.221 "name": "BaseBdev1", 00:21:47.221 "uuid": "c7920377-d305-4b8f-833f-f0839d1fa799", 00:21:47.221 "is_configured": true, 00:21:47.221 "data_offset": 0, 00:21:47.221 "data_size": 65536 00:21:47.221 }, 00:21:47.221 { 00:21:47.221 "name": "BaseBdev2", 00:21:47.221 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:47.221 "is_configured": false, 00:21:47.221 "data_offset": 0, 00:21:47.221 "data_size": 0 00:21:47.221 }, 00:21:47.221 { 00:21:47.221 "name": "BaseBdev3", 00:21:47.221 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:47.221 "is_configured": false, 00:21:47.221 
"data_offset": 0, 00:21:47.221 "data_size": 0 00:21:47.221 }, 00:21:47.221 { 00:21:47.221 "name": "BaseBdev4", 00:21:47.221 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:47.221 "is_configured": false, 00:21:47.221 "data_offset": 0, 00:21:47.221 "data_size": 0 00:21:47.221 } 00:21:47.221 ] 00:21:47.221 }' 00:21:47.221 22:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:47.221 22:50:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:47.787 22:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:48.044 [2024-07-15 22:50:32.863715] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:48.044 BaseBdev2 00:21:48.044 22:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:48.044 22:50:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:48.044 22:50:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:48.044 22:50:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:48.044 22:50:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:48.044 22:50:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:48.044 22:50:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:48.303 22:50:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:48.561 [ 
00:21:48.561 { 00:21:48.561 "name": "BaseBdev2", 00:21:48.561 "aliases": [ 00:21:48.561 "e97fd2be-b7e0-4613-a9c4-085609b940da" 00:21:48.561 ], 00:21:48.561 "product_name": "Malloc disk", 00:21:48.561 "block_size": 512, 00:21:48.561 "num_blocks": 65536, 00:21:48.561 "uuid": "e97fd2be-b7e0-4613-a9c4-085609b940da", 00:21:48.561 "assigned_rate_limits": { 00:21:48.561 "rw_ios_per_sec": 0, 00:21:48.561 "rw_mbytes_per_sec": 0, 00:21:48.561 "r_mbytes_per_sec": 0, 00:21:48.561 "w_mbytes_per_sec": 0 00:21:48.561 }, 00:21:48.561 "claimed": true, 00:21:48.561 "claim_type": "exclusive_write", 00:21:48.561 "zoned": false, 00:21:48.561 "supported_io_types": { 00:21:48.561 "read": true, 00:21:48.561 "write": true, 00:21:48.561 "unmap": true, 00:21:48.561 "flush": true, 00:21:48.561 "reset": true, 00:21:48.561 "nvme_admin": false, 00:21:48.561 "nvme_io": false, 00:21:48.561 "nvme_io_md": false, 00:21:48.561 "write_zeroes": true, 00:21:48.561 "zcopy": true, 00:21:48.561 "get_zone_info": false, 00:21:48.561 "zone_management": false, 00:21:48.561 "zone_append": false, 00:21:48.561 "compare": false, 00:21:48.561 "compare_and_write": false, 00:21:48.561 "abort": true, 00:21:48.561 "seek_hole": false, 00:21:48.561 "seek_data": false, 00:21:48.561 "copy": true, 00:21:48.561 "nvme_iov_md": false 00:21:48.561 }, 00:21:48.561 "memory_domains": [ 00:21:48.561 { 00:21:48.561 "dma_device_id": "system", 00:21:48.561 "dma_device_type": 1 00:21:48.561 }, 00:21:48.561 { 00:21:48.561 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:48.561 "dma_device_type": 2 00:21:48.561 } 00:21:48.561 ], 00:21:48.561 "driver_specific": {} 00:21:48.561 } 00:21:48.561 ] 00:21:48.561 22:50:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:48.561 22:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:48.561 22:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:48.561 22:50:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:48.561 22:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:48.561 22:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:48.561 22:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:48.561 22:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:48.561 22:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:48.561 22:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:48.561 22:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:48.561 22:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:48.561 22:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:48.561 22:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:48.561 22:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:48.821 22:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:48.821 "name": "Existed_Raid", 00:21:48.821 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:48.821 "strip_size_kb": 0, 00:21:48.821 "state": "configuring", 00:21:48.821 "raid_level": "raid1", 00:21:48.821 "superblock": false, 00:21:48.821 "num_base_bdevs": 4, 00:21:48.821 "num_base_bdevs_discovered": 2, 00:21:48.821 "num_base_bdevs_operational": 4, 00:21:48.821 "base_bdevs_list": [ 00:21:48.821 { 00:21:48.821 
"name": "BaseBdev1", 00:21:48.821 "uuid": "c7920377-d305-4b8f-833f-f0839d1fa799", 00:21:48.821 "is_configured": true, 00:21:48.821 "data_offset": 0, 00:21:48.821 "data_size": 65536 00:21:48.821 }, 00:21:48.821 { 00:21:48.821 "name": "BaseBdev2", 00:21:48.821 "uuid": "e97fd2be-b7e0-4613-a9c4-085609b940da", 00:21:48.821 "is_configured": true, 00:21:48.821 "data_offset": 0, 00:21:48.821 "data_size": 65536 00:21:48.821 }, 00:21:48.821 { 00:21:48.821 "name": "BaseBdev3", 00:21:48.821 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:48.821 "is_configured": false, 00:21:48.821 "data_offset": 0, 00:21:48.821 "data_size": 0 00:21:48.821 }, 00:21:48.821 { 00:21:48.821 "name": "BaseBdev4", 00:21:48.821 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:48.821 "is_configured": false, 00:21:48.821 "data_offset": 0, 00:21:48.821 "data_size": 0 00:21:48.821 } 00:21:48.821 ] 00:21:48.821 }' 00:21:48.821 22:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:48.821 22:50:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:49.388 22:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:49.646 [2024-07-15 22:50:34.399245] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:49.646 BaseBdev3 00:21:49.646 22:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:21:49.646 22:50:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:49.646 22:50:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:49.646 22:50:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:49.646 22:50:34 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:49.646 22:50:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:49.646 22:50:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:49.905 22:50:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:50.163 [ 00:21:50.163 { 00:21:50.163 "name": "BaseBdev3", 00:21:50.163 "aliases": [ 00:21:50.163 "20d71fd8-2a7a-4091-acea-a9948844330b" 00:21:50.163 ], 00:21:50.163 "product_name": "Malloc disk", 00:21:50.163 "block_size": 512, 00:21:50.163 "num_blocks": 65536, 00:21:50.163 "uuid": "20d71fd8-2a7a-4091-acea-a9948844330b", 00:21:50.163 "assigned_rate_limits": { 00:21:50.163 "rw_ios_per_sec": 0, 00:21:50.163 "rw_mbytes_per_sec": 0, 00:21:50.163 "r_mbytes_per_sec": 0, 00:21:50.163 "w_mbytes_per_sec": 0 00:21:50.163 }, 00:21:50.163 "claimed": true, 00:21:50.163 "claim_type": "exclusive_write", 00:21:50.163 "zoned": false, 00:21:50.163 "supported_io_types": { 00:21:50.163 "read": true, 00:21:50.163 "write": true, 00:21:50.163 "unmap": true, 00:21:50.163 "flush": true, 00:21:50.163 "reset": true, 00:21:50.163 "nvme_admin": false, 00:21:50.163 "nvme_io": false, 00:21:50.163 "nvme_io_md": false, 00:21:50.163 "write_zeroes": true, 00:21:50.163 "zcopy": true, 00:21:50.163 "get_zone_info": false, 00:21:50.163 "zone_management": false, 00:21:50.163 "zone_append": false, 00:21:50.163 "compare": false, 00:21:50.163 "compare_and_write": false, 00:21:50.163 "abort": true, 00:21:50.163 "seek_hole": false, 00:21:50.163 "seek_data": false, 00:21:50.163 "copy": true, 00:21:50.163 "nvme_iov_md": false 00:21:50.163 }, 00:21:50.163 "memory_domains": [ 00:21:50.163 { 00:21:50.163 "dma_device_id": "system", 
00:21:50.163 "dma_device_type": 1 00:21:50.163 }, 00:21:50.163 { 00:21:50.163 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:50.163 "dma_device_type": 2 00:21:50.163 } 00:21:50.163 ], 00:21:50.163 "driver_specific": {} 00:21:50.163 } 00:21:50.163 ] 00:21:50.163 22:50:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:50.163 22:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:50.163 22:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:50.163 22:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:50.163 22:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:50.163 22:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:50.163 22:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:50.163 22:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:50.164 22:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:50.164 22:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:50.164 22:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:50.164 22:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:50.164 22:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:50.164 22:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:50.164 22:50:34 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:50.423 22:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:50.423 "name": "Existed_Raid", 00:21:50.423 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:50.423 "strip_size_kb": 0, 00:21:50.423 "state": "configuring", 00:21:50.423 "raid_level": "raid1", 00:21:50.423 "superblock": false, 00:21:50.423 "num_base_bdevs": 4, 00:21:50.423 "num_base_bdevs_discovered": 3, 00:21:50.423 "num_base_bdevs_operational": 4, 00:21:50.423 "base_bdevs_list": [ 00:21:50.423 { 00:21:50.423 "name": "BaseBdev1", 00:21:50.423 "uuid": "c7920377-d305-4b8f-833f-f0839d1fa799", 00:21:50.423 "is_configured": true, 00:21:50.423 "data_offset": 0, 00:21:50.423 "data_size": 65536 00:21:50.423 }, 00:21:50.423 { 00:21:50.423 "name": "BaseBdev2", 00:21:50.423 "uuid": "e97fd2be-b7e0-4613-a9c4-085609b940da", 00:21:50.423 "is_configured": true, 00:21:50.423 "data_offset": 0, 00:21:50.423 "data_size": 65536 00:21:50.423 }, 00:21:50.423 { 00:21:50.423 "name": "BaseBdev3", 00:21:50.423 "uuid": "20d71fd8-2a7a-4091-acea-a9948844330b", 00:21:50.423 "is_configured": true, 00:21:50.423 "data_offset": 0, 00:21:50.423 "data_size": 65536 00:21:50.423 }, 00:21:50.423 { 00:21:50.423 "name": "BaseBdev4", 00:21:50.423 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:50.423 "is_configured": false, 00:21:50.423 "data_offset": 0, 00:21:50.423 "data_size": 0 00:21:50.423 } 00:21:50.423 ] 00:21:50.423 }' 00:21:50.423 22:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:50.423 22:50:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:50.990 22:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:51.250 [2024-07-15 22:50:35.918643] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:51.250 [2024-07-15 22:50:35.918679] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2564350 00:21:51.250 [2024-07-15 22:50:35.918687] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:21:51.250 [2024-07-15 22:50:35.918947] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2564020 00:21:51.250 [2024-07-15 22:50:35.919080] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2564350 00:21:51.250 [2024-07-15 22:50:35.919090] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2564350 00:21:51.250 [2024-07-15 22:50:35.919252] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:51.250 BaseBdev4 00:21:51.250 22:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:21:51.250 22:50:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:51.250 22:50:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:51.250 22:50:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:51.250 22:50:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:51.250 22:50:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:51.250 22:50:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:51.508 22:50:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:51.509 [ 00:21:51.509 { 
00:21:51.509 "name": "BaseBdev4", 00:21:51.509 "aliases": [ 00:21:51.509 "a72c120f-b3fb-44b7-b637-0bb70b55e5b8" 00:21:51.509 ], 00:21:51.509 "product_name": "Malloc disk", 00:21:51.509 "block_size": 512, 00:21:51.509 "num_blocks": 65536, 00:21:51.509 "uuid": "a72c120f-b3fb-44b7-b637-0bb70b55e5b8", 00:21:51.509 "assigned_rate_limits": { 00:21:51.509 "rw_ios_per_sec": 0, 00:21:51.509 "rw_mbytes_per_sec": 0, 00:21:51.509 "r_mbytes_per_sec": 0, 00:21:51.509 "w_mbytes_per_sec": 0 00:21:51.509 }, 00:21:51.509 "claimed": true, 00:21:51.509 "claim_type": "exclusive_write", 00:21:51.509 "zoned": false, 00:21:51.509 "supported_io_types": { 00:21:51.509 "read": true, 00:21:51.509 "write": true, 00:21:51.509 "unmap": true, 00:21:51.509 "flush": true, 00:21:51.509 "reset": true, 00:21:51.509 "nvme_admin": false, 00:21:51.509 "nvme_io": false, 00:21:51.509 "nvme_io_md": false, 00:21:51.509 "write_zeroes": true, 00:21:51.509 "zcopy": true, 00:21:51.509 "get_zone_info": false, 00:21:51.509 "zone_management": false, 00:21:51.509 "zone_append": false, 00:21:51.509 "compare": false, 00:21:51.509 "compare_and_write": false, 00:21:51.509 "abort": true, 00:21:51.509 "seek_hole": false, 00:21:51.509 "seek_data": false, 00:21:51.509 "copy": true, 00:21:51.509 "nvme_iov_md": false 00:21:51.509 }, 00:21:51.509 "memory_domains": [ 00:21:51.509 { 00:21:51.509 "dma_device_id": "system", 00:21:51.509 "dma_device_type": 1 00:21:51.509 }, 00:21:51.509 { 00:21:51.509 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:51.509 "dma_device_type": 2 00:21:51.509 } 00:21:51.509 ], 00:21:51.509 "driver_specific": {} 00:21:51.509 } 00:21:51.509 ] 00:21:51.768 22:50:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:51.768 22:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:51.768 22:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:51.768 22:50:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:51.768 22:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:51.768 22:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:51.768 22:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:51.768 22:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:51.768 22:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:51.768 22:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:51.768 22:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:51.768 22:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:51.768 22:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:51.768 22:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:51.768 22:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:52.028 22:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:52.028 "name": "Existed_Raid", 00:21:52.028 "uuid": "a4d6da0c-8464-41a8-b01b-c1a3741c09a9", 00:21:52.028 "strip_size_kb": 0, 00:21:52.028 "state": "online", 00:21:52.028 "raid_level": "raid1", 00:21:52.028 "superblock": false, 00:21:52.028 "num_base_bdevs": 4, 00:21:52.028 "num_base_bdevs_discovered": 4, 00:21:52.028 "num_base_bdevs_operational": 4, 00:21:52.028 "base_bdevs_list": [ 00:21:52.028 { 00:21:52.028 "name": 
"BaseBdev1", 00:21:52.028 "uuid": "c7920377-d305-4b8f-833f-f0839d1fa799", 00:21:52.028 "is_configured": true, 00:21:52.028 "data_offset": 0, 00:21:52.028 "data_size": 65536 00:21:52.028 }, 00:21:52.028 { 00:21:52.028 "name": "BaseBdev2", 00:21:52.028 "uuid": "e97fd2be-b7e0-4613-a9c4-085609b940da", 00:21:52.028 "is_configured": true, 00:21:52.028 "data_offset": 0, 00:21:52.028 "data_size": 65536 00:21:52.028 }, 00:21:52.028 { 00:21:52.028 "name": "BaseBdev3", 00:21:52.028 "uuid": "20d71fd8-2a7a-4091-acea-a9948844330b", 00:21:52.028 "is_configured": true, 00:21:52.028 "data_offset": 0, 00:21:52.028 "data_size": 65536 00:21:52.028 }, 00:21:52.028 { 00:21:52.028 "name": "BaseBdev4", 00:21:52.028 "uuid": "a72c120f-b3fb-44b7-b637-0bb70b55e5b8", 00:21:52.028 "is_configured": true, 00:21:52.028 "data_offset": 0, 00:21:52.028 "data_size": 65536 00:21:52.028 } 00:21:52.028 ] 00:21:52.028 }' 00:21:52.028 22:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:52.028 22:50:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:52.596 22:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:52.596 22:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:52.596 22:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:52.596 22:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:52.596 22:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:52.596 22:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:52.596 22:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:52.596 22:50:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:52.855 [2024-07-15 22:50:37.523288] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:52.855 22:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:52.855 "name": "Existed_Raid", 00:21:52.855 "aliases": [ 00:21:52.855 "a4d6da0c-8464-41a8-b01b-c1a3741c09a9" 00:21:52.855 ], 00:21:52.855 "product_name": "Raid Volume", 00:21:52.855 "block_size": 512, 00:21:52.855 "num_blocks": 65536, 00:21:52.855 "uuid": "a4d6da0c-8464-41a8-b01b-c1a3741c09a9", 00:21:52.855 "assigned_rate_limits": { 00:21:52.855 "rw_ios_per_sec": 0, 00:21:52.855 "rw_mbytes_per_sec": 0, 00:21:52.855 "r_mbytes_per_sec": 0, 00:21:52.855 "w_mbytes_per_sec": 0 00:21:52.855 }, 00:21:52.855 "claimed": false, 00:21:52.855 "zoned": false, 00:21:52.855 "supported_io_types": { 00:21:52.855 "read": true, 00:21:52.855 "write": true, 00:21:52.855 "unmap": false, 00:21:52.855 "flush": false, 00:21:52.855 "reset": true, 00:21:52.855 "nvme_admin": false, 00:21:52.855 "nvme_io": false, 00:21:52.855 "nvme_io_md": false, 00:21:52.855 "write_zeroes": true, 00:21:52.855 "zcopy": false, 00:21:52.855 "get_zone_info": false, 00:21:52.855 "zone_management": false, 00:21:52.855 "zone_append": false, 00:21:52.855 "compare": false, 00:21:52.855 "compare_and_write": false, 00:21:52.855 "abort": false, 00:21:52.855 "seek_hole": false, 00:21:52.855 "seek_data": false, 00:21:52.855 "copy": false, 00:21:52.855 "nvme_iov_md": false 00:21:52.855 }, 00:21:52.855 "memory_domains": [ 00:21:52.855 { 00:21:52.855 "dma_device_id": "system", 00:21:52.855 "dma_device_type": 1 00:21:52.855 }, 00:21:52.855 { 00:21:52.855 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:52.855 "dma_device_type": 2 00:21:52.855 }, 00:21:52.855 { 00:21:52.855 "dma_device_id": "system", 00:21:52.855 "dma_device_type": 1 00:21:52.855 }, 00:21:52.855 { 00:21:52.855 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:21:52.855 "dma_device_type": 2 00:21:52.855 }, 00:21:52.855 { 00:21:52.855 "dma_device_id": "system", 00:21:52.855 "dma_device_type": 1 00:21:52.855 }, 00:21:52.855 { 00:21:52.855 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:52.855 "dma_device_type": 2 00:21:52.855 }, 00:21:52.855 { 00:21:52.855 "dma_device_id": "system", 00:21:52.855 "dma_device_type": 1 00:21:52.855 }, 00:21:52.855 { 00:21:52.855 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:52.855 "dma_device_type": 2 00:21:52.855 } 00:21:52.855 ], 00:21:52.855 "driver_specific": { 00:21:52.855 "raid": { 00:21:52.855 "uuid": "a4d6da0c-8464-41a8-b01b-c1a3741c09a9", 00:21:52.855 "strip_size_kb": 0, 00:21:52.855 "state": "online", 00:21:52.855 "raid_level": "raid1", 00:21:52.855 "superblock": false, 00:21:52.855 "num_base_bdevs": 4, 00:21:52.855 "num_base_bdevs_discovered": 4, 00:21:52.855 "num_base_bdevs_operational": 4, 00:21:52.855 "base_bdevs_list": [ 00:21:52.855 { 00:21:52.855 "name": "BaseBdev1", 00:21:52.855 "uuid": "c7920377-d305-4b8f-833f-f0839d1fa799", 00:21:52.855 "is_configured": true, 00:21:52.855 "data_offset": 0, 00:21:52.855 "data_size": 65536 00:21:52.855 }, 00:21:52.855 { 00:21:52.855 "name": "BaseBdev2", 00:21:52.855 "uuid": "e97fd2be-b7e0-4613-a9c4-085609b940da", 00:21:52.855 "is_configured": true, 00:21:52.855 "data_offset": 0, 00:21:52.855 "data_size": 65536 00:21:52.855 }, 00:21:52.855 { 00:21:52.855 "name": "BaseBdev3", 00:21:52.855 "uuid": "20d71fd8-2a7a-4091-acea-a9948844330b", 00:21:52.855 "is_configured": true, 00:21:52.855 "data_offset": 0, 00:21:52.855 "data_size": 65536 00:21:52.855 }, 00:21:52.855 { 00:21:52.855 "name": "BaseBdev4", 00:21:52.855 "uuid": "a72c120f-b3fb-44b7-b637-0bb70b55e5b8", 00:21:52.855 "is_configured": true, 00:21:52.855 "data_offset": 0, 00:21:52.855 "data_size": 65536 00:21:52.855 } 00:21:52.855 ] 00:21:52.855 } 00:21:52.855 } 00:21:52.855 }' 00:21:52.855 22:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:52.855 22:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:52.855 BaseBdev2 00:21:52.855 BaseBdev3 00:21:52.855 BaseBdev4' 00:21:52.855 22:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:52.856 22:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:52.856 22:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:53.114 22:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:53.114 "name": "BaseBdev1", 00:21:53.114 "aliases": [ 00:21:53.114 "c7920377-d305-4b8f-833f-f0839d1fa799" 00:21:53.114 ], 00:21:53.114 "product_name": "Malloc disk", 00:21:53.114 "block_size": 512, 00:21:53.114 "num_blocks": 65536, 00:21:53.114 "uuid": "c7920377-d305-4b8f-833f-f0839d1fa799", 00:21:53.114 "assigned_rate_limits": { 00:21:53.114 "rw_ios_per_sec": 0, 00:21:53.114 "rw_mbytes_per_sec": 0, 00:21:53.114 "r_mbytes_per_sec": 0, 00:21:53.114 "w_mbytes_per_sec": 0 00:21:53.114 }, 00:21:53.114 "claimed": true, 00:21:53.114 "claim_type": "exclusive_write", 00:21:53.114 "zoned": false, 00:21:53.114 "supported_io_types": { 00:21:53.114 "read": true, 00:21:53.114 "write": true, 00:21:53.114 "unmap": true, 00:21:53.114 "flush": true, 00:21:53.114 "reset": true, 00:21:53.114 "nvme_admin": false, 00:21:53.114 "nvme_io": false, 00:21:53.114 "nvme_io_md": false, 00:21:53.114 "write_zeroes": true, 00:21:53.114 "zcopy": true, 00:21:53.114 "get_zone_info": false, 00:21:53.114 "zone_management": false, 00:21:53.114 "zone_append": false, 00:21:53.114 "compare": false, 00:21:53.114 "compare_and_write": false, 00:21:53.114 "abort": true, 00:21:53.114 "seek_hole": false, 00:21:53.114 "seek_data": 
false, 00:21:53.114 "copy": true, 00:21:53.114 "nvme_iov_md": false 00:21:53.114 }, 00:21:53.114 "memory_domains": [ 00:21:53.114 { 00:21:53.114 "dma_device_id": "system", 00:21:53.114 "dma_device_type": 1 00:21:53.114 }, 00:21:53.114 { 00:21:53.114 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.114 "dma_device_type": 2 00:21:53.114 } 00:21:53.114 ], 00:21:53.114 "driver_specific": {} 00:21:53.114 }' 00:21:53.114 22:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:53.114 22:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:53.114 22:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:53.114 22:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:53.114 22:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:53.372 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:53.372 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:53.372 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:53.372 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:53.372 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:53.372 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:53.372 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:53.372 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:53.372 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:53.372 22:50:38 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:53.630 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:53.630 "name": "BaseBdev2", 00:21:53.630 "aliases": [ 00:21:53.630 "e97fd2be-b7e0-4613-a9c4-085609b940da" 00:21:53.630 ], 00:21:53.630 "product_name": "Malloc disk", 00:21:53.630 "block_size": 512, 00:21:53.630 "num_blocks": 65536, 00:21:53.630 "uuid": "e97fd2be-b7e0-4613-a9c4-085609b940da", 00:21:53.630 "assigned_rate_limits": { 00:21:53.630 "rw_ios_per_sec": 0, 00:21:53.630 "rw_mbytes_per_sec": 0, 00:21:53.630 "r_mbytes_per_sec": 0, 00:21:53.630 "w_mbytes_per_sec": 0 00:21:53.630 }, 00:21:53.630 "claimed": true, 00:21:53.630 "claim_type": "exclusive_write", 00:21:53.630 "zoned": false, 00:21:53.630 "supported_io_types": { 00:21:53.630 "read": true, 00:21:53.630 "write": true, 00:21:53.630 "unmap": true, 00:21:53.630 "flush": true, 00:21:53.630 "reset": true, 00:21:53.630 "nvme_admin": false, 00:21:53.630 "nvme_io": false, 00:21:53.630 "nvme_io_md": false, 00:21:53.630 "write_zeroes": true, 00:21:53.630 "zcopy": true, 00:21:53.630 "get_zone_info": false, 00:21:53.630 "zone_management": false, 00:21:53.630 "zone_append": false, 00:21:53.630 "compare": false, 00:21:53.630 "compare_and_write": false, 00:21:53.630 "abort": true, 00:21:53.630 "seek_hole": false, 00:21:53.630 "seek_data": false, 00:21:53.630 "copy": true, 00:21:53.630 "nvme_iov_md": false 00:21:53.630 }, 00:21:53.630 "memory_domains": [ 00:21:53.630 { 00:21:53.630 "dma_device_id": "system", 00:21:53.630 "dma_device_type": 1 00:21:53.630 }, 00:21:53.630 { 00:21:53.630 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.630 "dma_device_type": 2 00:21:53.630 } 00:21:53.630 ], 00:21:53.630 "driver_specific": {} 00:21:53.630 }' 00:21:53.630 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:53.630 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:21:53.630 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:53.630 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:53.887 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:53.887 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:53.887 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:53.887 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:53.887 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:53.887 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:53.887 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:53.887 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:53.887 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:53.887 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:53.887 22:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:54.145 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:54.145 "name": "BaseBdev3", 00:21:54.145 "aliases": [ 00:21:54.145 "20d71fd8-2a7a-4091-acea-a9948844330b" 00:21:54.145 ], 00:21:54.145 "product_name": "Malloc disk", 00:21:54.145 "block_size": 512, 00:21:54.145 "num_blocks": 65536, 00:21:54.145 "uuid": "20d71fd8-2a7a-4091-acea-a9948844330b", 00:21:54.145 "assigned_rate_limits": { 00:21:54.145 "rw_ios_per_sec": 0, 00:21:54.145 "rw_mbytes_per_sec": 0, 00:21:54.145 "r_mbytes_per_sec": 0, 
00:21:54.145 "w_mbytes_per_sec": 0 00:21:54.145 }, 00:21:54.145 "claimed": true, 00:21:54.145 "claim_type": "exclusive_write", 00:21:54.145 "zoned": false, 00:21:54.145 "supported_io_types": { 00:21:54.145 "read": true, 00:21:54.145 "write": true, 00:21:54.145 "unmap": true, 00:21:54.145 "flush": true, 00:21:54.145 "reset": true, 00:21:54.145 "nvme_admin": false, 00:21:54.145 "nvme_io": false, 00:21:54.145 "nvme_io_md": false, 00:21:54.145 "write_zeroes": true, 00:21:54.145 "zcopy": true, 00:21:54.145 "get_zone_info": false, 00:21:54.145 "zone_management": false, 00:21:54.145 "zone_append": false, 00:21:54.145 "compare": false, 00:21:54.145 "compare_and_write": false, 00:21:54.145 "abort": true, 00:21:54.145 "seek_hole": false, 00:21:54.145 "seek_data": false, 00:21:54.145 "copy": true, 00:21:54.145 "nvme_iov_md": false 00:21:54.145 }, 00:21:54.145 "memory_domains": [ 00:21:54.145 { 00:21:54.145 "dma_device_id": "system", 00:21:54.145 "dma_device_type": 1 00:21:54.145 }, 00:21:54.145 { 00:21:54.145 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:54.145 "dma_device_type": 2 00:21:54.145 } 00:21:54.145 ], 00:21:54.145 "driver_specific": {} 00:21:54.145 }' 00:21:54.145 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:54.403 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:54.403 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:54.403 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:54.403 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:54.403 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:54.403 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:54.403 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:21:54.403 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:54.403 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:54.663 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:54.663 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:54.663 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:54.663 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:54.663 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:54.922 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:54.922 "name": "BaseBdev4", 00:21:54.922 "aliases": [ 00:21:54.922 "a72c120f-b3fb-44b7-b637-0bb70b55e5b8" 00:21:54.922 ], 00:21:54.922 "product_name": "Malloc disk", 00:21:54.922 "block_size": 512, 00:21:54.922 "num_blocks": 65536, 00:21:54.922 "uuid": "a72c120f-b3fb-44b7-b637-0bb70b55e5b8", 00:21:54.922 "assigned_rate_limits": { 00:21:54.922 "rw_ios_per_sec": 0, 00:21:54.922 "rw_mbytes_per_sec": 0, 00:21:54.922 "r_mbytes_per_sec": 0, 00:21:54.922 "w_mbytes_per_sec": 0 00:21:54.922 }, 00:21:54.922 "claimed": true, 00:21:54.922 "claim_type": "exclusive_write", 00:21:54.922 "zoned": false, 00:21:54.922 "supported_io_types": { 00:21:54.922 "read": true, 00:21:54.922 "write": true, 00:21:54.922 "unmap": true, 00:21:54.922 "flush": true, 00:21:54.922 "reset": true, 00:21:54.922 "nvme_admin": false, 00:21:54.922 "nvme_io": false, 00:21:54.922 "nvme_io_md": false, 00:21:54.922 "write_zeroes": true, 00:21:54.922 "zcopy": true, 00:21:54.922 "get_zone_info": false, 00:21:54.922 "zone_management": false, 00:21:54.922 "zone_append": false, 00:21:54.922 
"compare": false, 00:21:54.922 "compare_and_write": false, 00:21:54.922 "abort": true, 00:21:54.922 "seek_hole": false, 00:21:54.922 "seek_data": false, 00:21:54.922 "copy": true, 00:21:54.922 "nvme_iov_md": false 00:21:54.922 }, 00:21:54.922 "memory_domains": [ 00:21:54.922 { 00:21:54.922 "dma_device_id": "system", 00:21:54.922 "dma_device_type": 1 00:21:54.922 }, 00:21:54.922 { 00:21:54.922 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:54.922 "dma_device_type": 2 00:21:54.922 } 00:21:54.922 ], 00:21:54.922 "driver_specific": {} 00:21:54.922 }' 00:21:54.922 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:54.922 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:54.922 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:54.922 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:54.922 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:54.922 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:54.922 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:55.181 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:55.181 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:55.181 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:55.181 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:55.181 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:55.181 22:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 
00:21:55.440 [2024-07-15 22:50:40.194222] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:55.440 22:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:55.440 22:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:21:55.440 22:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:55.440 22:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:21:55.440 22:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:21:55.440 22:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:21:55.440 22:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:55.440 22:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:55.440 22:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:55.440 22:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:55.440 22:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:55.440 22:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:55.440 22:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:55.440 22:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:55.440 22:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:55.440 22:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:55.440 22:50:40 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:55.698 22:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:55.698 "name": "Existed_Raid", 00:21:55.698 "uuid": "a4d6da0c-8464-41a8-b01b-c1a3741c09a9", 00:21:55.698 "strip_size_kb": 0, 00:21:55.698 "state": "online", 00:21:55.698 "raid_level": "raid1", 00:21:55.698 "superblock": false, 00:21:55.698 "num_base_bdevs": 4, 00:21:55.698 "num_base_bdevs_discovered": 3, 00:21:55.698 "num_base_bdevs_operational": 3, 00:21:55.698 "base_bdevs_list": [ 00:21:55.698 { 00:21:55.698 "name": null, 00:21:55.698 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:55.698 "is_configured": false, 00:21:55.698 "data_offset": 0, 00:21:55.698 "data_size": 65536 00:21:55.698 }, 00:21:55.698 { 00:21:55.698 "name": "BaseBdev2", 00:21:55.698 "uuid": "e97fd2be-b7e0-4613-a9c4-085609b940da", 00:21:55.698 "is_configured": true, 00:21:55.698 "data_offset": 0, 00:21:55.698 "data_size": 65536 00:21:55.698 }, 00:21:55.698 { 00:21:55.698 "name": "BaseBdev3", 00:21:55.698 "uuid": "20d71fd8-2a7a-4091-acea-a9948844330b", 00:21:55.698 "is_configured": true, 00:21:55.698 "data_offset": 0, 00:21:55.698 "data_size": 65536 00:21:55.698 }, 00:21:55.698 { 00:21:55.698 "name": "BaseBdev4", 00:21:55.698 "uuid": "a72c120f-b3fb-44b7-b637-0bb70b55e5b8", 00:21:55.698 "is_configured": true, 00:21:55.698 "data_offset": 0, 00:21:55.698 "data_size": 65536 00:21:55.698 } 00:21:55.698 ] 00:21:55.698 }' 00:21:55.698 22:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:55.698 22:50:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:56.265 22:50:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:56.265 22:50:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:56.265 22:50:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:56.265 22:50:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:56.536 22:50:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:56.536 22:50:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:56.536 22:50:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:57.102 [2024-07-15 22:50:41.728209] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:57.102 22:50:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:57.102 22:50:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:57.102 22:50:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:57.102 22:50:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:57.360 22:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:57.360 22:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:57.360 22:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:21:57.360 [2024-07-15 22:50:42.241812] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:57.618 22:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 
00:21:57.618 22:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:57.618 22:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:57.618 22:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:57.618 22:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:57.618 22:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:57.618 22:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:21:57.878 [2024-07-15 22:50:42.751225] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:21:57.878 [2024-07-15 22:50:42.751306] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:57.878 [2024-07-15 22:50:42.762227] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:57.878 [2024-07-15 22:50:42.762264] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:57.878 [2024-07-15 22:50:42.762276] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2564350 name Existed_Raid, state offline 00:21:57.878 22:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:57.878 22:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:57.879 22:50:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:57.879 22:50:42 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:58.163 22:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:58.163 22:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:58.163 22:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:21:58.163 22:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:21:58.163 22:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:58.163 22:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:58.420 BaseBdev2 00:21:58.420 22:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:21:58.420 22:50:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:58.420 22:50:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:58.420 22:50:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:58.420 22:50:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:58.420 22:50:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:58.420 22:50:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:58.676 22:50:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:58.933 [ 00:21:58.933 { 00:21:58.933 "name": "BaseBdev2", 00:21:58.933 "aliases": [ 
00:21:58.933 "5f9badb4-ed67-4be4-a49d-b034c1af460c" 00:21:58.933 ], 00:21:58.933 "product_name": "Malloc disk", 00:21:58.933 "block_size": 512, 00:21:58.933 "num_blocks": 65536, 00:21:58.933 "uuid": "5f9badb4-ed67-4be4-a49d-b034c1af460c", 00:21:58.933 "assigned_rate_limits": { 00:21:58.933 "rw_ios_per_sec": 0, 00:21:58.933 "rw_mbytes_per_sec": 0, 00:21:58.933 "r_mbytes_per_sec": 0, 00:21:58.933 "w_mbytes_per_sec": 0 00:21:58.933 }, 00:21:58.933 "claimed": false, 00:21:58.933 "zoned": false, 00:21:58.933 "supported_io_types": { 00:21:58.933 "read": true, 00:21:58.933 "write": true, 00:21:58.933 "unmap": true, 00:21:58.933 "flush": true, 00:21:58.933 "reset": true, 00:21:58.933 "nvme_admin": false, 00:21:58.933 "nvme_io": false, 00:21:58.933 "nvme_io_md": false, 00:21:58.933 "write_zeroes": true, 00:21:58.933 "zcopy": true, 00:21:58.933 "get_zone_info": false, 00:21:58.933 "zone_management": false, 00:21:58.933 "zone_append": false, 00:21:58.933 "compare": false, 00:21:58.933 "compare_and_write": false, 00:21:58.933 "abort": true, 00:21:58.933 "seek_hole": false, 00:21:58.933 "seek_data": false, 00:21:58.933 "copy": true, 00:21:58.933 "nvme_iov_md": false 00:21:58.933 }, 00:21:58.933 "memory_domains": [ 00:21:58.933 { 00:21:58.933 "dma_device_id": "system", 00:21:58.933 "dma_device_type": 1 00:21:58.933 }, 00:21:58.933 { 00:21:58.933 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:58.933 "dma_device_type": 2 00:21:58.933 } 00:21:58.933 ], 00:21:58.933 "driver_specific": {} 00:21:58.933 } 00:21:58.933 ] 00:21:58.933 22:50:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:58.933 22:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:58.933 22:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:58.933 22:50:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:59.191 BaseBdev3 00:21:59.191 22:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:21:59.191 22:50:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:59.191 22:50:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:59.191 22:50:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:59.191 22:50:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:59.191 22:50:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:59.191 22:50:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:59.448 22:50:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:59.705 [ 00:21:59.705 { 00:21:59.705 "name": "BaseBdev3", 00:21:59.705 "aliases": [ 00:21:59.705 "a033b197-80c9-4801-a91e-0339a68cf582" 00:21:59.705 ], 00:21:59.705 "product_name": "Malloc disk", 00:21:59.705 "block_size": 512, 00:21:59.706 "num_blocks": 65536, 00:21:59.706 "uuid": "a033b197-80c9-4801-a91e-0339a68cf582", 00:21:59.706 "assigned_rate_limits": { 00:21:59.706 "rw_ios_per_sec": 0, 00:21:59.706 "rw_mbytes_per_sec": 0, 00:21:59.706 "r_mbytes_per_sec": 0, 00:21:59.706 "w_mbytes_per_sec": 0 00:21:59.706 }, 00:21:59.706 "claimed": false, 00:21:59.706 "zoned": false, 00:21:59.706 "supported_io_types": { 00:21:59.706 "read": true, 00:21:59.706 "write": true, 00:21:59.706 "unmap": true, 00:21:59.706 "flush": true, 00:21:59.706 "reset": true, 00:21:59.706 "nvme_admin": false, 00:21:59.706 
"nvme_io": false, 00:21:59.706 "nvme_io_md": false, 00:21:59.706 "write_zeroes": true, 00:21:59.706 "zcopy": true, 00:21:59.706 "get_zone_info": false, 00:21:59.706 "zone_management": false, 00:21:59.706 "zone_append": false, 00:21:59.706 "compare": false, 00:21:59.706 "compare_and_write": false, 00:21:59.706 "abort": true, 00:21:59.706 "seek_hole": false, 00:21:59.706 "seek_data": false, 00:21:59.706 "copy": true, 00:21:59.706 "nvme_iov_md": false 00:21:59.706 }, 00:21:59.706 "memory_domains": [ 00:21:59.706 { 00:21:59.706 "dma_device_id": "system", 00:21:59.706 "dma_device_type": 1 00:21:59.706 }, 00:21:59.706 { 00:21:59.706 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:59.706 "dma_device_type": 2 00:21:59.706 } 00:21:59.706 ], 00:21:59.706 "driver_specific": {} 00:21:59.706 } 00:21:59.706 ] 00:21:59.706 22:50:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:59.706 22:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:59.706 22:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:59.706 22:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:59.963 BaseBdev4 00:21:59.963 22:50:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:21:59.963 22:50:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:59.963 22:50:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:59.963 22:50:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:59.963 22:50:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:59.963 22:50:44 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:59.963 22:50:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:00.221 22:50:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:00.480 [ 00:22:00.480 { 00:22:00.480 "name": "BaseBdev4", 00:22:00.480 "aliases": [ 00:22:00.480 "fd05c6b0-bb8b-4ae3-bcf2-da9ec4a5284e" 00:22:00.480 ], 00:22:00.480 "product_name": "Malloc disk", 00:22:00.480 "block_size": 512, 00:22:00.480 "num_blocks": 65536, 00:22:00.480 "uuid": "fd05c6b0-bb8b-4ae3-bcf2-da9ec4a5284e", 00:22:00.480 "assigned_rate_limits": { 00:22:00.480 "rw_ios_per_sec": 0, 00:22:00.480 "rw_mbytes_per_sec": 0, 00:22:00.480 "r_mbytes_per_sec": 0, 00:22:00.480 "w_mbytes_per_sec": 0 00:22:00.480 }, 00:22:00.480 "claimed": false, 00:22:00.480 "zoned": false, 00:22:00.480 "supported_io_types": { 00:22:00.480 "read": true, 00:22:00.480 "write": true, 00:22:00.480 "unmap": true, 00:22:00.480 "flush": true, 00:22:00.480 "reset": true, 00:22:00.480 "nvme_admin": false, 00:22:00.480 "nvme_io": false, 00:22:00.480 "nvme_io_md": false, 00:22:00.480 "write_zeroes": true, 00:22:00.480 "zcopy": true, 00:22:00.480 "get_zone_info": false, 00:22:00.480 "zone_management": false, 00:22:00.480 "zone_append": false, 00:22:00.480 "compare": false, 00:22:00.480 "compare_and_write": false, 00:22:00.480 "abort": true, 00:22:00.480 "seek_hole": false, 00:22:00.480 "seek_data": false, 00:22:00.480 "copy": true, 00:22:00.480 "nvme_iov_md": false 00:22:00.480 }, 00:22:00.480 "memory_domains": [ 00:22:00.480 { 00:22:00.480 "dma_device_id": "system", 00:22:00.480 "dma_device_type": 1 00:22:00.480 }, 00:22:00.480 { 00:22:00.480 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:00.480 "dma_device_type": 
2 00:22:00.480 } 00:22:00.480 ], 00:22:00.480 "driver_specific": {} 00:22:00.480 } 00:22:00.480 ] 00:22:00.480 22:50:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:00.480 22:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:00.480 22:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:00.480 22:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:00.740 [2024-07-15 22:50:45.455663] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:00.740 [2024-07-15 22:50:45.455705] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:00.740 [2024-07-15 22:50:45.455725] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:00.740 [2024-07-15 22:50:45.457102] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:00.740 [2024-07-15 22:50:45.457151] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:00.740 22:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:00.740 22:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:00.740 22:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:00.740 22:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:00.740 22:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:00.740 22:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=4 00:22:00.740 22:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:00.740 22:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:00.740 22:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:00.740 22:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:00.740 22:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:00.740 22:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:00.999 22:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:00.999 "name": "Existed_Raid", 00:22:00.999 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:00.999 "strip_size_kb": 0, 00:22:00.999 "state": "configuring", 00:22:00.999 "raid_level": "raid1", 00:22:00.999 "superblock": false, 00:22:00.999 "num_base_bdevs": 4, 00:22:00.999 "num_base_bdevs_discovered": 3, 00:22:00.999 "num_base_bdevs_operational": 4, 00:22:00.999 "base_bdevs_list": [ 00:22:00.999 { 00:22:00.999 "name": "BaseBdev1", 00:22:00.999 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:00.999 "is_configured": false, 00:22:00.999 "data_offset": 0, 00:22:00.999 "data_size": 0 00:22:00.999 }, 00:22:00.999 { 00:22:00.999 "name": "BaseBdev2", 00:22:00.999 "uuid": "5f9badb4-ed67-4be4-a49d-b034c1af460c", 00:22:00.999 "is_configured": true, 00:22:00.999 "data_offset": 0, 00:22:00.999 "data_size": 65536 00:22:00.999 }, 00:22:00.999 { 00:22:00.999 "name": "BaseBdev3", 00:22:00.999 "uuid": "a033b197-80c9-4801-a91e-0339a68cf582", 00:22:00.999 "is_configured": true, 00:22:00.999 "data_offset": 0, 00:22:01.000 "data_size": 65536 00:22:01.000 }, 00:22:01.000 { 
00:22:01.000 "name": "BaseBdev4", 00:22:01.000 "uuid": "fd05c6b0-bb8b-4ae3-bcf2-da9ec4a5284e", 00:22:01.000 "is_configured": true, 00:22:01.000 "data_offset": 0, 00:22:01.000 "data_size": 65536 00:22:01.000 } 00:22:01.000 ] 00:22:01.000 }' 00:22:01.000 22:50:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:01.000 22:50:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:01.568 22:50:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:01.828 [2024-07-15 22:50:46.534499] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:01.828 22:50:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:01.828 22:50:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:01.828 22:50:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:01.828 22:50:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:01.828 22:50:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:01.828 22:50:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:01.828 22:50:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:01.828 22:50:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:01.828 22:50:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:01.828 22:50:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:01.828 22:50:46 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:01.828 22:50:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:02.087 22:50:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:02.087 "name": "Existed_Raid", 00:22:02.087 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:02.087 "strip_size_kb": 0, 00:22:02.087 "state": "configuring", 00:22:02.087 "raid_level": "raid1", 00:22:02.087 "superblock": false, 00:22:02.087 "num_base_bdevs": 4, 00:22:02.087 "num_base_bdevs_discovered": 2, 00:22:02.087 "num_base_bdevs_operational": 4, 00:22:02.087 "base_bdevs_list": [ 00:22:02.087 { 00:22:02.087 "name": "BaseBdev1", 00:22:02.087 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:02.087 "is_configured": false, 00:22:02.087 "data_offset": 0, 00:22:02.087 "data_size": 0 00:22:02.087 }, 00:22:02.087 { 00:22:02.087 "name": null, 00:22:02.087 "uuid": "5f9badb4-ed67-4be4-a49d-b034c1af460c", 00:22:02.087 "is_configured": false, 00:22:02.087 "data_offset": 0, 00:22:02.087 "data_size": 65536 00:22:02.087 }, 00:22:02.087 { 00:22:02.087 "name": "BaseBdev3", 00:22:02.087 "uuid": "a033b197-80c9-4801-a91e-0339a68cf582", 00:22:02.087 "is_configured": true, 00:22:02.087 "data_offset": 0, 00:22:02.087 "data_size": 65536 00:22:02.087 }, 00:22:02.087 { 00:22:02.087 "name": "BaseBdev4", 00:22:02.087 "uuid": "fd05c6b0-bb8b-4ae3-bcf2-da9ec4a5284e", 00:22:02.087 "is_configured": true, 00:22:02.087 "data_offset": 0, 00:22:02.087 "data_size": 65536 00:22:02.087 } 00:22:02.087 ] 00:22:02.087 }' 00:22:02.087 22:50:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:02.087 22:50:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:02.661 22:50:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:22:02.661 22:50:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:02.920 22:50:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:22:02.920 22:50:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:03.487 [2024-07-15 22:50:48.130065] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:03.487 BaseBdev1 00:22:03.487 22:50:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:22:03.487 22:50:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:22:03.487 22:50:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:03.487 22:50:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:03.487 22:50:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:03.487 22:50:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:03.487 22:50:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:03.746 22:50:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:04.005 [ 00:22:04.005 { 00:22:04.005 "name": "BaseBdev1", 00:22:04.005 "aliases": [ 00:22:04.005 "74c20b26-03ff-4056-bf20-c2b4aa5fb6f8" 00:22:04.005 ], 00:22:04.005 
"product_name": "Malloc disk", 00:22:04.005 "block_size": 512, 00:22:04.005 "num_blocks": 65536, 00:22:04.005 "uuid": "74c20b26-03ff-4056-bf20-c2b4aa5fb6f8", 00:22:04.005 "assigned_rate_limits": { 00:22:04.005 "rw_ios_per_sec": 0, 00:22:04.005 "rw_mbytes_per_sec": 0, 00:22:04.005 "r_mbytes_per_sec": 0, 00:22:04.005 "w_mbytes_per_sec": 0 00:22:04.005 }, 00:22:04.005 "claimed": true, 00:22:04.005 "claim_type": "exclusive_write", 00:22:04.005 "zoned": false, 00:22:04.005 "supported_io_types": { 00:22:04.005 "read": true, 00:22:04.005 "write": true, 00:22:04.005 "unmap": true, 00:22:04.005 "flush": true, 00:22:04.005 "reset": true, 00:22:04.005 "nvme_admin": false, 00:22:04.005 "nvme_io": false, 00:22:04.005 "nvme_io_md": false, 00:22:04.005 "write_zeroes": true, 00:22:04.005 "zcopy": true, 00:22:04.005 "get_zone_info": false, 00:22:04.005 "zone_management": false, 00:22:04.005 "zone_append": false, 00:22:04.005 "compare": false, 00:22:04.005 "compare_and_write": false, 00:22:04.005 "abort": true, 00:22:04.005 "seek_hole": false, 00:22:04.005 "seek_data": false, 00:22:04.005 "copy": true, 00:22:04.005 "nvme_iov_md": false 00:22:04.005 }, 00:22:04.005 "memory_domains": [ 00:22:04.005 { 00:22:04.005 "dma_device_id": "system", 00:22:04.005 "dma_device_type": 1 00:22:04.005 }, 00:22:04.005 { 00:22:04.005 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:04.005 "dma_device_type": 2 00:22:04.005 } 00:22:04.005 ], 00:22:04.005 "driver_specific": {} 00:22:04.005 } 00:22:04.005 ] 00:22:04.005 22:50:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:04.005 22:50:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:04.005 22:50:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:04.005 22:50:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:04.005 
22:50:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:04.005 22:50:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:04.005 22:50:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:04.005 22:50:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:04.005 22:50:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:04.264 22:50:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:04.264 22:50:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:04.264 22:50:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:04.264 22:50:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:04.264 22:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:04.264 "name": "Existed_Raid", 00:22:04.264 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:04.264 "strip_size_kb": 0, 00:22:04.264 "state": "configuring", 00:22:04.264 "raid_level": "raid1", 00:22:04.264 "superblock": false, 00:22:04.264 "num_base_bdevs": 4, 00:22:04.264 "num_base_bdevs_discovered": 3, 00:22:04.264 "num_base_bdevs_operational": 4, 00:22:04.264 "base_bdevs_list": [ 00:22:04.264 { 00:22:04.264 "name": "BaseBdev1", 00:22:04.264 "uuid": "74c20b26-03ff-4056-bf20-c2b4aa5fb6f8", 00:22:04.264 "is_configured": true, 00:22:04.264 "data_offset": 0, 00:22:04.264 "data_size": 65536 00:22:04.264 }, 00:22:04.264 { 00:22:04.264 "name": null, 00:22:04.264 "uuid": "5f9badb4-ed67-4be4-a49d-b034c1af460c", 00:22:04.264 "is_configured": false, 00:22:04.264 "data_offset": 0, 
00:22:04.264 "data_size": 65536 00:22:04.264 }, 00:22:04.264 { 00:22:04.264 "name": "BaseBdev3", 00:22:04.264 "uuid": "a033b197-80c9-4801-a91e-0339a68cf582", 00:22:04.264 "is_configured": true, 00:22:04.264 "data_offset": 0, 00:22:04.264 "data_size": 65536 00:22:04.264 }, 00:22:04.264 { 00:22:04.264 "name": "BaseBdev4", 00:22:04.264 "uuid": "fd05c6b0-bb8b-4ae3-bcf2-da9ec4a5284e", 00:22:04.264 "is_configured": true, 00:22:04.264 "data_offset": 0, 00:22:04.264 "data_size": 65536 00:22:04.264 } 00:22:04.264 ] 00:22:04.264 }' 00:22:04.264 22:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:04.264 22:50:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:05.200 22:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:05.200 22:50:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:05.200 22:50:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:22:05.200 22:50:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:22:05.459 [2024-07-15 22:50:50.247708] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:05.459 22:50:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:05.459 22:50:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:05.459 22:50:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:05.459 22:50:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:22:05.460 22:50:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:05.460 22:50:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:05.460 22:50:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:05.460 22:50:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:05.460 22:50:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:05.460 22:50:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:05.460 22:50:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:05.460 22:50:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:05.718 22:50:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:05.718 "name": "Existed_Raid", 00:22:05.718 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:05.718 "strip_size_kb": 0, 00:22:05.718 "state": "configuring", 00:22:05.718 "raid_level": "raid1", 00:22:05.718 "superblock": false, 00:22:05.718 "num_base_bdevs": 4, 00:22:05.718 "num_base_bdevs_discovered": 2, 00:22:05.718 "num_base_bdevs_operational": 4, 00:22:05.718 "base_bdevs_list": [ 00:22:05.718 { 00:22:05.718 "name": "BaseBdev1", 00:22:05.718 "uuid": "74c20b26-03ff-4056-bf20-c2b4aa5fb6f8", 00:22:05.718 "is_configured": true, 00:22:05.718 "data_offset": 0, 00:22:05.718 "data_size": 65536 00:22:05.718 }, 00:22:05.718 { 00:22:05.718 "name": null, 00:22:05.718 "uuid": "5f9badb4-ed67-4be4-a49d-b034c1af460c", 00:22:05.718 "is_configured": false, 00:22:05.718 "data_offset": 0, 00:22:05.718 "data_size": 65536 00:22:05.718 }, 00:22:05.718 { 00:22:05.718 "name": null, 00:22:05.718 
"uuid": "a033b197-80c9-4801-a91e-0339a68cf582", 00:22:05.718 "is_configured": false, 00:22:05.719 "data_offset": 0, 00:22:05.719 "data_size": 65536 00:22:05.719 }, 00:22:05.719 { 00:22:05.719 "name": "BaseBdev4", 00:22:05.719 "uuid": "fd05c6b0-bb8b-4ae3-bcf2-da9ec4a5284e", 00:22:05.719 "is_configured": true, 00:22:05.719 "data_offset": 0, 00:22:05.719 "data_size": 65536 00:22:05.719 } 00:22:05.719 ] 00:22:05.719 }' 00:22:05.719 22:50:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:05.719 22:50:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:06.285 22:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:06.285 22:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:06.543 22:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:22:06.543 22:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:22:06.800 [2024-07-15 22:50:51.531142] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:06.800 22:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:06.800 22:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:06.800 22:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:06.800 22:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:06.800 22:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:22:06.800 22:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:06.800 22:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:06.800 22:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:06.800 22:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:06.800 22:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:06.800 22:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:06.800 22:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:07.058 22:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:07.058 "name": "Existed_Raid", 00:22:07.058 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:07.058 "strip_size_kb": 0, 00:22:07.058 "state": "configuring", 00:22:07.058 "raid_level": "raid1", 00:22:07.058 "superblock": false, 00:22:07.058 "num_base_bdevs": 4, 00:22:07.058 "num_base_bdevs_discovered": 3, 00:22:07.058 "num_base_bdevs_operational": 4, 00:22:07.058 "base_bdevs_list": [ 00:22:07.058 { 00:22:07.058 "name": "BaseBdev1", 00:22:07.058 "uuid": "74c20b26-03ff-4056-bf20-c2b4aa5fb6f8", 00:22:07.058 "is_configured": true, 00:22:07.058 "data_offset": 0, 00:22:07.058 "data_size": 65536 00:22:07.058 }, 00:22:07.058 { 00:22:07.058 "name": null, 00:22:07.058 "uuid": "5f9badb4-ed67-4be4-a49d-b034c1af460c", 00:22:07.058 "is_configured": false, 00:22:07.058 "data_offset": 0, 00:22:07.058 "data_size": 65536 00:22:07.058 }, 00:22:07.058 { 00:22:07.058 "name": "BaseBdev3", 00:22:07.058 "uuid": "a033b197-80c9-4801-a91e-0339a68cf582", 00:22:07.058 "is_configured": true, 
00:22:07.058 "data_offset": 0, 00:22:07.058 "data_size": 65536 00:22:07.058 }, 00:22:07.058 { 00:22:07.058 "name": "BaseBdev4", 00:22:07.058 "uuid": "fd05c6b0-bb8b-4ae3-bcf2-da9ec4a5284e", 00:22:07.058 "is_configured": true, 00:22:07.058 "data_offset": 0, 00:22:07.058 "data_size": 65536 00:22:07.058 } 00:22:07.058 ] 00:22:07.058 }' 00:22:07.058 22:50:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:07.058 22:50:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:07.625 22:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:07.625 22:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:07.883 22:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:22:07.883 22:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:08.142 [2024-07-15 22:50:52.798526] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:08.142 22:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:08.142 22:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:08.142 22:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:08.142 22:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:08.142 22:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:08.142 22:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=4 00:22:08.142 22:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:08.142 22:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:08.142 22:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:08.142 22:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:08.142 22:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:08.142 22:50:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:08.142 22:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:08.142 "name": "Existed_Raid", 00:22:08.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:08.142 "strip_size_kb": 0, 00:22:08.142 "state": "configuring", 00:22:08.142 "raid_level": "raid1", 00:22:08.142 "superblock": false, 00:22:08.142 "num_base_bdevs": 4, 00:22:08.142 "num_base_bdevs_discovered": 2, 00:22:08.142 "num_base_bdevs_operational": 4, 00:22:08.142 "base_bdevs_list": [ 00:22:08.142 { 00:22:08.142 "name": null, 00:22:08.142 "uuid": "74c20b26-03ff-4056-bf20-c2b4aa5fb6f8", 00:22:08.142 "is_configured": false, 00:22:08.142 "data_offset": 0, 00:22:08.142 "data_size": 65536 00:22:08.142 }, 00:22:08.142 { 00:22:08.142 "name": null, 00:22:08.142 "uuid": "5f9badb4-ed67-4be4-a49d-b034c1af460c", 00:22:08.142 "is_configured": false, 00:22:08.142 "data_offset": 0, 00:22:08.142 "data_size": 65536 00:22:08.142 }, 00:22:08.142 { 00:22:08.142 "name": "BaseBdev3", 00:22:08.142 "uuid": "a033b197-80c9-4801-a91e-0339a68cf582", 00:22:08.142 "is_configured": true, 00:22:08.142 "data_offset": 0, 00:22:08.142 "data_size": 65536 00:22:08.142 }, 00:22:08.142 { 00:22:08.142 "name": 
"BaseBdev4", 00:22:08.142 "uuid": "fd05c6b0-bb8b-4ae3-bcf2-da9ec4a5284e", 00:22:08.142 "is_configured": true, 00:22:08.142 "data_offset": 0, 00:22:08.142 "data_size": 65536 00:22:08.142 } 00:22:08.142 ] 00:22:08.142 }' 00:22:08.142 22:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:08.142 22:50:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:09.075 22:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:09.075 22:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:09.075 22:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:22:09.075 22:50:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:22:09.333 [2024-07-15 22:50:54.116603] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:09.333 22:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:09.333 22:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:09.333 22:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:09.333 22:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:09.333 22:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:09.333 22:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:09.333 22:50:54 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:09.333 22:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:09.333 22:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:09.333 22:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:09.333 22:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:09.333 22:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:09.591 22:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:09.591 "name": "Existed_Raid", 00:22:09.591 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:09.591 "strip_size_kb": 0, 00:22:09.591 "state": "configuring", 00:22:09.591 "raid_level": "raid1", 00:22:09.591 "superblock": false, 00:22:09.591 "num_base_bdevs": 4, 00:22:09.591 "num_base_bdevs_discovered": 3, 00:22:09.592 "num_base_bdevs_operational": 4, 00:22:09.592 "base_bdevs_list": [ 00:22:09.592 { 00:22:09.592 "name": null, 00:22:09.592 "uuid": "74c20b26-03ff-4056-bf20-c2b4aa5fb6f8", 00:22:09.592 "is_configured": false, 00:22:09.592 "data_offset": 0, 00:22:09.592 "data_size": 65536 00:22:09.592 }, 00:22:09.592 { 00:22:09.592 "name": "BaseBdev2", 00:22:09.592 "uuid": "5f9badb4-ed67-4be4-a49d-b034c1af460c", 00:22:09.592 "is_configured": true, 00:22:09.592 "data_offset": 0, 00:22:09.592 "data_size": 65536 00:22:09.592 }, 00:22:09.592 { 00:22:09.592 "name": "BaseBdev3", 00:22:09.592 "uuid": "a033b197-80c9-4801-a91e-0339a68cf582", 00:22:09.592 "is_configured": true, 00:22:09.592 "data_offset": 0, 00:22:09.592 "data_size": 65536 00:22:09.592 }, 00:22:09.592 { 00:22:09.592 "name": "BaseBdev4", 00:22:09.592 "uuid": "fd05c6b0-bb8b-4ae3-bcf2-da9ec4a5284e", 00:22:09.592 
"is_configured": true, 00:22:09.592 "data_offset": 0, 00:22:09.592 "data_size": 65536 00:22:09.592 } 00:22:09.592 ] 00:22:09.592 }' 00:22:09.592 22:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:09.592 22:50:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:10.158 22:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.158 22:50:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:10.416 22:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:22:10.416 22:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.416 22:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:22:10.675 22:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 74c20b26-03ff-4056-bf20-c2b4aa5fb6f8 00:22:10.933 [2024-07-15 22:50:55.724179] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:22:10.933 [2024-07-15 22:50:55.724221] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2562610 00:22:10.933 [2024-07-15 22:50:55.724229] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:22:10.933 [2024-07-15 22:50:55.724421] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2563a70 00:22:10.933 [2024-07-15 22:50:55.724545] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2562610 00:22:10.933 [2024-07-15 
22:50:55.724555] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2562610 00:22:10.933 [2024-07-15 22:50:55.724717] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:10.933 NewBaseBdev 00:22:10.933 22:50:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:22:10.933 22:50:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:22:10.933 22:50:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:10.933 22:50:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:10.933 22:50:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:10.933 22:50:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:10.933 22:50:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:11.192 22:50:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:22:11.450 [ 00:22:11.450 { 00:22:11.451 "name": "NewBaseBdev", 00:22:11.451 "aliases": [ 00:22:11.451 "74c20b26-03ff-4056-bf20-c2b4aa5fb6f8" 00:22:11.451 ], 00:22:11.451 "product_name": "Malloc disk", 00:22:11.451 "block_size": 512, 00:22:11.451 "num_blocks": 65536, 00:22:11.451 "uuid": "74c20b26-03ff-4056-bf20-c2b4aa5fb6f8", 00:22:11.451 "assigned_rate_limits": { 00:22:11.451 "rw_ios_per_sec": 0, 00:22:11.451 "rw_mbytes_per_sec": 0, 00:22:11.451 "r_mbytes_per_sec": 0, 00:22:11.451 "w_mbytes_per_sec": 0 00:22:11.451 }, 00:22:11.451 "claimed": true, 00:22:11.451 "claim_type": "exclusive_write", 00:22:11.451 "zoned": 
false, 00:22:11.451 "supported_io_types": { 00:22:11.451 "read": true, 00:22:11.451 "write": true, 00:22:11.451 "unmap": true, 00:22:11.451 "flush": true, 00:22:11.451 "reset": true, 00:22:11.451 "nvme_admin": false, 00:22:11.451 "nvme_io": false, 00:22:11.451 "nvme_io_md": false, 00:22:11.451 "write_zeroes": true, 00:22:11.451 "zcopy": true, 00:22:11.451 "get_zone_info": false, 00:22:11.451 "zone_management": false, 00:22:11.451 "zone_append": false, 00:22:11.451 "compare": false, 00:22:11.451 "compare_and_write": false, 00:22:11.451 "abort": true, 00:22:11.451 "seek_hole": false, 00:22:11.451 "seek_data": false, 00:22:11.451 "copy": true, 00:22:11.451 "nvme_iov_md": false 00:22:11.451 }, 00:22:11.451 "memory_domains": [ 00:22:11.451 { 00:22:11.451 "dma_device_id": "system", 00:22:11.451 "dma_device_type": 1 00:22:11.451 }, 00:22:11.451 { 00:22:11.451 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:11.451 "dma_device_type": 2 00:22:11.451 } 00:22:11.451 ], 00:22:11.451 "driver_specific": {} 00:22:11.451 } 00:22:11.451 ] 00:22:11.451 22:50:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:11.451 22:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:22:11.451 22:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:11.451 22:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:11.451 22:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:11.451 22:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:11.451 22:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:11.451 22:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:11.451 22:50:56 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:11.451 22:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:11.451 22:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:11.451 22:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:11.451 22:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:11.710 22:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:11.710 "name": "Existed_Raid", 00:22:11.710 "uuid": "bb69b0c9-db68-417b-bc3e-9d92008b8c6b", 00:22:11.710 "strip_size_kb": 0, 00:22:11.710 "state": "online", 00:22:11.710 "raid_level": "raid1", 00:22:11.710 "superblock": false, 00:22:11.710 "num_base_bdevs": 4, 00:22:11.710 "num_base_bdevs_discovered": 4, 00:22:11.710 "num_base_bdevs_operational": 4, 00:22:11.710 "base_bdevs_list": [ 00:22:11.710 { 00:22:11.710 "name": "NewBaseBdev", 00:22:11.710 "uuid": "74c20b26-03ff-4056-bf20-c2b4aa5fb6f8", 00:22:11.710 "is_configured": true, 00:22:11.710 "data_offset": 0, 00:22:11.710 "data_size": 65536 00:22:11.710 }, 00:22:11.710 { 00:22:11.710 "name": "BaseBdev2", 00:22:11.710 "uuid": "5f9badb4-ed67-4be4-a49d-b034c1af460c", 00:22:11.710 "is_configured": true, 00:22:11.710 "data_offset": 0, 00:22:11.710 "data_size": 65536 00:22:11.710 }, 00:22:11.710 { 00:22:11.710 "name": "BaseBdev3", 00:22:11.710 "uuid": "a033b197-80c9-4801-a91e-0339a68cf582", 00:22:11.710 "is_configured": true, 00:22:11.710 "data_offset": 0, 00:22:11.710 "data_size": 65536 00:22:11.710 }, 00:22:11.710 { 00:22:11.710 "name": "BaseBdev4", 00:22:11.710 "uuid": "fd05c6b0-bb8b-4ae3-bcf2-da9ec4a5284e", 00:22:11.710 "is_configured": true, 00:22:11.711 "data_offset": 0, 00:22:11.711 
"data_size": 65536 00:22:11.711 } 00:22:11.711 ] 00:22:11.711 }' 00:22:11.711 22:50:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:11.711 22:50:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:12.304 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:22:12.304 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:12.304 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:12.304 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:12.304 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:12.304 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:12.304 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:12.304 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:12.629 [2024-07-15 22:50:57.296669] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:12.629 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:12.629 "name": "Existed_Raid", 00:22:12.629 "aliases": [ 00:22:12.629 "bb69b0c9-db68-417b-bc3e-9d92008b8c6b" 00:22:12.629 ], 00:22:12.629 "product_name": "Raid Volume", 00:22:12.629 "block_size": 512, 00:22:12.629 "num_blocks": 65536, 00:22:12.629 "uuid": "bb69b0c9-db68-417b-bc3e-9d92008b8c6b", 00:22:12.629 "assigned_rate_limits": { 00:22:12.629 "rw_ios_per_sec": 0, 00:22:12.629 "rw_mbytes_per_sec": 0, 00:22:12.629 "r_mbytes_per_sec": 0, 00:22:12.629 "w_mbytes_per_sec": 0 00:22:12.629 }, 00:22:12.629 "claimed": false, 
00:22:12.629 "zoned": false, 00:22:12.629 "supported_io_types": { 00:22:12.629 "read": true, 00:22:12.629 "write": true, 00:22:12.629 "unmap": false, 00:22:12.629 "flush": false, 00:22:12.629 "reset": true, 00:22:12.629 "nvme_admin": false, 00:22:12.629 "nvme_io": false, 00:22:12.629 "nvme_io_md": false, 00:22:12.629 "write_zeroes": true, 00:22:12.629 "zcopy": false, 00:22:12.629 "get_zone_info": false, 00:22:12.629 "zone_management": false, 00:22:12.629 "zone_append": false, 00:22:12.629 "compare": false, 00:22:12.629 "compare_and_write": false, 00:22:12.629 "abort": false, 00:22:12.629 "seek_hole": false, 00:22:12.629 "seek_data": false, 00:22:12.629 "copy": false, 00:22:12.629 "nvme_iov_md": false 00:22:12.629 }, 00:22:12.629 "memory_domains": [ 00:22:12.629 { 00:22:12.629 "dma_device_id": "system", 00:22:12.629 "dma_device_type": 1 00:22:12.629 }, 00:22:12.629 { 00:22:12.629 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:12.629 "dma_device_type": 2 00:22:12.629 }, 00:22:12.629 { 00:22:12.629 "dma_device_id": "system", 00:22:12.629 "dma_device_type": 1 00:22:12.629 }, 00:22:12.629 { 00:22:12.629 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:12.629 "dma_device_type": 2 00:22:12.629 }, 00:22:12.629 { 00:22:12.629 "dma_device_id": "system", 00:22:12.629 "dma_device_type": 1 00:22:12.629 }, 00:22:12.629 { 00:22:12.629 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:12.629 "dma_device_type": 2 00:22:12.629 }, 00:22:12.629 { 00:22:12.629 "dma_device_id": "system", 00:22:12.629 "dma_device_type": 1 00:22:12.629 }, 00:22:12.629 { 00:22:12.629 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:12.629 "dma_device_type": 2 00:22:12.629 } 00:22:12.629 ], 00:22:12.629 "driver_specific": { 00:22:12.629 "raid": { 00:22:12.629 "uuid": "bb69b0c9-db68-417b-bc3e-9d92008b8c6b", 00:22:12.629 "strip_size_kb": 0, 00:22:12.629 "state": "online", 00:22:12.629 "raid_level": "raid1", 00:22:12.629 "superblock": false, 00:22:12.629 "num_base_bdevs": 4, 00:22:12.629 
"num_base_bdevs_discovered": 4, 00:22:12.629 "num_base_bdevs_operational": 4, 00:22:12.629 "base_bdevs_list": [ 00:22:12.629 { 00:22:12.629 "name": "NewBaseBdev", 00:22:12.629 "uuid": "74c20b26-03ff-4056-bf20-c2b4aa5fb6f8", 00:22:12.629 "is_configured": true, 00:22:12.629 "data_offset": 0, 00:22:12.629 "data_size": 65536 00:22:12.629 }, 00:22:12.629 { 00:22:12.629 "name": "BaseBdev2", 00:22:12.629 "uuid": "5f9badb4-ed67-4be4-a49d-b034c1af460c", 00:22:12.629 "is_configured": true, 00:22:12.629 "data_offset": 0, 00:22:12.629 "data_size": 65536 00:22:12.629 }, 00:22:12.629 { 00:22:12.629 "name": "BaseBdev3", 00:22:12.629 "uuid": "a033b197-80c9-4801-a91e-0339a68cf582", 00:22:12.629 "is_configured": true, 00:22:12.629 "data_offset": 0, 00:22:12.629 "data_size": 65536 00:22:12.629 }, 00:22:12.629 { 00:22:12.629 "name": "BaseBdev4", 00:22:12.629 "uuid": "fd05c6b0-bb8b-4ae3-bcf2-da9ec4a5284e", 00:22:12.629 "is_configured": true, 00:22:12.629 "data_offset": 0, 00:22:12.629 "data_size": 65536 00:22:12.629 } 00:22:12.629 ] 00:22:12.629 } 00:22:12.629 } 00:22:12.629 }' 00:22:12.629 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:12.629 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:22:12.629 BaseBdev2 00:22:12.629 BaseBdev3 00:22:12.629 BaseBdev4' 00:22:12.629 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:12.629 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:22:12.629 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:12.888 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:12.888 "name": "NewBaseBdev", 
00:22:12.888 "aliases": [ 00:22:12.888 "74c20b26-03ff-4056-bf20-c2b4aa5fb6f8" 00:22:12.888 ], 00:22:12.888 "product_name": "Malloc disk", 00:22:12.888 "block_size": 512, 00:22:12.888 "num_blocks": 65536, 00:22:12.888 "uuid": "74c20b26-03ff-4056-bf20-c2b4aa5fb6f8", 00:22:12.888 "assigned_rate_limits": { 00:22:12.888 "rw_ios_per_sec": 0, 00:22:12.888 "rw_mbytes_per_sec": 0, 00:22:12.888 "r_mbytes_per_sec": 0, 00:22:12.888 "w_mbytes_per_sec": 0 00:22:12.888 }, 00:22:12.888 "claimed": true, 00:22:12.888 "claim_type": "exclusive_write", 00:22:12.888 "zoned": false, 00:22:12.888 "supported_io_types": { 00:22:12.888 "read": true, 00:22:12.888 "write": true, 00:22:12.888 "unmap": true, 00:22:12.888 "flush": true, 00:22:12.888 "reset": true, 00:22:12.888 "nvme_admin": false, 00:22:12.888 "nvme_io": false, 00:22:12.888 "nvme_io_md": false, 00:22:12.888 "write_zeroes": true, 00:22:12.888 "zcopy": true, 00:22:12.888 "get_zone_info": false, 00:22:12.888 "zone_management": false, 00:22:12.888 "zone_append": false, 00:22:12.888 "compare": false, 00:22:12.888 "compare_and_write": false, 00:22:12.888 "abort": true, 00:22:12.888 "seek_hole": false, 00:22:12.888 "seek_data": false, 00:22:12.888 "copy": true, 00:22:12.888 "nvme_iov_md": false 00:22:12.888 }, 00:22:12.888 "memory_domains": [ 00:22:12.888 { 00:22:12.888 "dma_device_id": "system", 00:22:12.888 "dma_device_type": 1 00:22:12.888 }, 00:22:12.888 { 00:22:12.888 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:12.888 "dma_device_type": 2 00:22:12.888 } 00:22:12.888 ], 00:22:12.888 "driver_specific": {} 00:22:12.888 }' 00:22:12.888 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:12.888 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:12.888 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:12.888 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:22:12.888 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:12.888 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:12.888 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:13.146 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:13.146 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:13.146 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:13.146 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:13.146 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:13.146 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:13.146 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:13.146 22:50:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:13.406 22:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:13.406 "name": "BaseBdev2", 00:22:13.406 "aliases": [ 00:22:13.406 "5f9badb4-ed67-4be4-a49d-b034c1af460c" 00:22:13.406 ], 00:22:13.406 "product_name": "Malloc disk", 00:22:13.406 "block_size": 512, 00:22:13.406 "num_blocks": 65536, 00:22:13.406 "uuid": "5f9badb4-ed67-4be4-a49d-b034c1af460c", 00:22:13.406 "assigned_rate_limits": { 00:22:13.406 "rw_ios_per_sec": 0, 00:22:13.406 "rw_mbytes_per_sec": 0, 00:22:13.406 "r_mbytes_per_sec": 0, 00:22:13.406 "w_mbytes_per_sec": 0 00:22:13.406 }, 00:22:13.406 "claimed": true, 00:22:13.406 "claim_type": "exclusive_write", 00:22:13.406 "zoned": false, 00:22:13.406 "supported_io_types": { 00:22:13.406 
"read": true, 00:22:13.406 "write": true, 00:22:13.406 "unmap": true, 00:22:13.406 "flush": true, 00:22:13.406 "reset": true, 00:22:13.406 "nvme_admin": false, 00:22:13.406 "nvme_io": false, 00:22:13.406 "nvme_io_md": false, 00:22:13.406 "write_zeroes": true, 00:22:13.406 "zcopy": true, 00:22:13.406 "get_zone_info": false, 00:22:13.406 "zone_management": false, 00:22:13.406 "zone_append": false, 00:22:13.406 "compare": false, 00:22:13.406 "compare_and_write": false, 00:22:13.406 "abort": true, 00:22:13.406 "seek_hole": false, 00:22:13.406 "seek_data": false, 00:22:13.406 "copy": true, 00:22:13.406 "nvme_iov_md": false 00:22:13.406 }, 00:22:13.406 "memory_domains": [ 00:22:13.406 { 00:22:13.406 "dma_device_id": "system", 00:22:13.406 "dma_device_type": 1 00:22:13.406 }, 00:22:13.406 { 00:22:13.406 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:13.406 "dma_device_type": 2 00:22:13.406 } 00:22:13.406 ], 00:22:13.406 "driver_specific": {} 00:22:13.406 }' 00:22:13.406 22:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:13.406 22:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:13.406 22:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:13.406 22:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:13.664 22:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:13.664 22:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:13.664 22:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:13.664 22:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:13.664 22:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:13.664 22:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:22:13.664 22:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:13.664 22:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:13.664 22:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:13.664 22:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:13.664 22:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:13.923 22:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:13.923 "name": "BaseBdev3", 00:22:13.923 "aliases": [ 00:22:13.923 "a033b197-80c9-4801-a91e-0339a68cf582" 00:22:13.923 ], 00:22:13.923 "product_name": "Malloc disk", 00:22:13.923 "block_size": 512, 00:22:13.923 "num_blocks": 65536, 00:22:13.923 "uuid": "a033b197-80c9-4801-a91e-0339a68cf582", 00:22:13.923 "assigned_rate_limits": { 00:22:13.923 "rw_ios_per_sec": 0, 00:22:13.923 "rw_mbytes_per_sec": 0, 00:22:13.923 "r_mbytes_per_sec": 0, 00:22:13.923 "w_mbytes_per_sec": 0 00:22:13.923 }, 00:22:13.923 "claimed": true, 00:22:13.923 "claim_type": "exclusive_write", 00:22:13.923 "zoned": false, 00:22:13.923 "supported_io_types": { 00:22:13.923 "read": true, 00:22:13.923 "write": true, 00:22:13.923 "unmap": true, 00:22:13.923 "flush": true, 00:22:13.923 "reset": true, 00:22:13.923 "nvme_admin": false, 00:22:13.923 "nvme_io": false, 00:22:13.923 "nvme_io_md": false, 00:22:13.923 "write_zeroes": true, 00:22:13.923 "zcopy": true, 00:22:13.923 "get_zone_info": false, 00:22:13.923 "zone_management": false, 00:22:13.923 "zone_append": false, 00:22:13.923 "compare": false, 00:22:13.923 "compare_and_write": false, 00:22:13.923 "abort": true, 00:22:13.923 "seek_hole": false, 00:22:13.923 "seek_data": false, 00:22:13.923 "copy": true, 00:22:13.923 "nvme_iov_md": 
false 00:22:13.923 }, 00:22:13.923 "memory_domains": [ 00:22:13.923 { 00:22:13.923 "dma_device_id": "system", 00:22:13.923 "dma_device_type": 1 00:22:13.923 }, 00:22:13.923 { 00:22:13.923 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:13.923 "dma_device_type": 2 00:22:13.923 } 00:22:13.923 ], 00:22:13.923 "driver_specific": {} 00:22:13.923 }' 00:22:13.923 22:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:14.180 22:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:14.180 22:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:14.180 22:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:14.180 22:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:14.180 22:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:14.180 22:50:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:14.180 22:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:14.180 22:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:14.180 22:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:14.438 22:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:14.439 22:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:14.439 22:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:14.439 22:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:14.439 22:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:22:14.696 22:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:14.696 "name": "BaseBdev4", 00:22:14.696 "aliases": [ 00:22:14.696 "fd05c6b0-bb8b-4ae3-bcf2-da9ec4a5284e" 00:22:14.696 ], 00:22:14.696 "product_name": "Malloc disk", 00:22:14.696 "block_size": 512, 00:22:14.696 "num_blocks": 65536, 00:22:14.696 "uuid": "fd05c6b0-bb8b-4ae3-bcf2-da9ec4a5284e", 00:22:14.696 "assigned_rate_limits": { 00:22:14.696 "rw_ios_per_sec": 0, 00:22:14.696 "rw_mbytes_per_sec": 0, 00:22:14.696 "r_mbytes_per_sec": 0, 00:22:14.696 "w_mbytes_per_sec": 0 00:22:14.696 }, 00:22:14.696 "claimed": true, 00:22:14.696 "claim_type": "exclusive_write", 00:22:14.696 "zoned": false, 00:22:14.696 "supported_io_types": { 00:22:14.696 "read": true, 00:22:14.696 "write": true, 00:22:14.696 "unmap": true, 00:22:14.696 "flush": true, 00:22:14.696 "reset": true, 00:22:14.696 "nvme_admin": false, 00:22:14.696 "nvme_io": false, 00:22:14.696 "nvme_io_md": false, 00:22:14.696 "write_zeroes": true, 00:22:14.696 "zcopy": true, 00:22:14.696 "get_zone_info": false, 00:22:14.696 "zone_management": false, 00:22:14.696 "zone_append": false, 00:22:14.696 "compare": false, 00:22:14.696 "compare_and_write": false, 00:22:14.696 "abort": true, 00:22:14.696 "seek_hole": false, 00:22:14.696 "seek_data": false, 00:22:14.696 "copy": true, 00:22:14.696 "nvme_iov_md": false 00:22:14.696 }, 00:22:14.696 "memory_domains": [ 00:22:14.696 { 00:22:14.696 "dma_device_id": "system", 00:22:14.696 "dma_device_type": 1 00:22:14.696 }, 00:22:14.696 { 00:22:14.696 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:14.696 "dma_device_type": 2 00:22:14.696 } 00:22:14.696 ], 00:22:14.696 "driver_specific": {} 00:22:14.696 }' 00:22:14.696 22:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:14.696 22:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:14.696 22:50:59 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:14.696 22:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:14.696 22:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:14.696 22:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:14.696 22:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:14.954 22:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:14.954 22:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:14.954 22:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:14.954 22:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:14.954 22:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:14.954 22:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:15.213 [2024-07-15 22:50:59.891406] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:15.213 [2024-07-15 22:50:59.891430] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:15.213 [2024-07-15 22:50:59.891482] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:15.213 [2024-07-15 22:50:59.891768] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:15.213 [2024-07-15 22:50:59.891781] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2562610 name Existed_Raid, state offline 00:22:15.213 22:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2791317 00:22:15.213 22:50:59 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2791317 ']' 00:22:15.213 22:50:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2791317 00:22:15.213 22:50:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:22:15.213 22:50:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:15.213 22:50:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2791317 00:22:15.213 22:50:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:15.213 22:50:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:15.213 22:50:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2791317' 00:22:15.213 killing process with pid 2791317 00:22:15.213 22:50:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2791317 00:22:15.213 [2024-07-15 22:50:59.965346] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:15.213 22:50:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2791317 00:22:15.213 [2024-07-15 22:51:00.002773] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:22:15.473 00:22:15.473 real 0m32.894s 00:22:15.473 user 1m0.287s 00:22:15.473 sys 0m5.945s 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:15.473 ************************************ 00:22:15.473 END TEST raid_state_function_test 00:22:15.473 ************************************ 00:22:15.473 22:51:00 bdev_raid -- 
common/autotest_common.sh@1142 -- # return 0 00:22:15.473 22:51:00 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:22:15.473 22:51:00 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:15.473 22:51:00 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:15.473 22:51:00 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:15.473 ************************************ 00:22:15.473 START TEST raid_state_function_test_sb 00:22:15.473 ************************************ 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 true 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # 
(( i <= num_base_bdevs )) 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2796255 00:22:15.473 22:51:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2796255' 00:22:15.473 Process raid pid: 2796255 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2796255 /var/tmp/spdk-raid.sock 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2796255 ']' 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:15.473 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:15.473 22:51:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:15.732 [2024-07-15 22:51:00.416718] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:22:15.732 [2024-07-15 22:51:00.416861] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:22:15.732 [2024-07-15 22:51:00.617067] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:15.989 [2024-07-15 22:51:00.724819] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:15.989 [2024-07-15 22:51:00.792941] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:15.989 [2024-07-15 22:51:00.792978] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:16.578 22:51:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:16.578 22:51:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:22:16.578 22:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:17.143 [2024-07-15 22:51:01.848368] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:17.143 [2024-07-15 22:51:01.848410] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:17.143 [2024-07-15 22:51:01.848421] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:17.143 [2024-07-15 22:51:01.848434] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:17.143 [2024-07-15 22:51:01.848443] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:17.143 [2024-07-15 22:51:01.848454] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
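The `verify_raid_bdev_state Existed_Raid configuring raid1 0 4` call above extracts the raid bdev record from `bdev_raid_get_bdevs all` and compares its state, level, strip size, and operational bdev count against the expected values. A rough Python analogue of that comparison, against a hypothetical, trimmed record in the shape shown in the log:

```python
import json

# Hypothetical, trimmed record in the shape produced by
# `rpc.py bdev_raid_get_bdevs all | jq '.[] | select(.name == "Existed_Raid")'`.
raid_json = """
{
  "name": "Existed_Raid",
  "state": "configuring",
  "raid_level": "raid1",
  "strip_size_kb": 0,
  "num_base_bdevs": 4,
  "num_base_bdevs_discovered": 0,
  "num_base_bdevs_operational": 4
}
"""

def verify_raid_state(info, expected_state, level, strip_size_kb, operational):
    # Sketch of what verify_raid_bdev_state asserts: each extracted field
    # must match the expected argument passed by the test script.
    return (info["state"] == expected_state
            and info["raid_level"] == level
            and info["strip_size_kb"] == strip_size_kb
            and info["num_base_bdevs_operational"] == operational)

info = json.loads(raid_json)
print(verify_raid_state(info, "configuring", "raid1", 0, 4))  # -> True
```

The field names come from the JSON dumps in this log; the helper itself is an assumption, not the shell function's actual implementation.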
00:22:17.143 [2024-07-15 22:51:01.848463] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:17.143 [2024-07-15 22:51:01.848475] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:17.143 22:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:17.143 22:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:17.143 22:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:17.143 22:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:17.143 22:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:17.143 22:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:17.143 22:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:17.143 22:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:17.143 22:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:17.143 22:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:17.143 22:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:17.143 22:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:17.401 22:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:17.401 "name": "Existed_Raid", 00:22:17.401 "uuid": "c31f7ce0-2def-4397-848b-9e2c14811835", 
00:22:17.401 "strip_size_kb": 0, 00:22:17.401 "state": "configuring", 00:22:17.401 "raid_level": "raid1", 00:22:17.401 "superblock": true, 00:22:17.401 "num_base_bdevs": 4, 00:22:17.401 "num_base_bdevs_discovered": 0, 00:22:17.401 "num_base_bdevs_operational": 4, 00:22:17.401 "base_bdevs_list": [ 00:22:17.401 { 00:22:17.401 "name": "BaseBdev1", 00:22:17.401 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:17.401 "is_configured": false, 00:22:17.401 "data_offset": 0, 00:22:17.401 "data_size": 0 00:22:17.401 }, 00:22:17.401 { 00:22:17.401 "name": "BaseBdev2", 00:22:17.401 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:17.401 "is_configured": false, 00:22:17.401 "data_offset": 0, 00:22:17.401 "data_size": 0 00:22:17.401 }, 00:22:17.401 { 00:22:17.401 "name": "BaseBdev3", 00:22:17.401 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:17.401 "is_configured": false, 00:22:17.401 "data_offset": 0, 00:22:17.401 "data_size": 0 00:22:17.401 }, 00:22:17.401 { 00:22:17.401 "name": "BaseBdev4", 00:22:17.401 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:17.401 "is_configured": false, 00:22:17.401 "data_offset": 0, 00:22:17.401 "data_size": 0 00:22:17.401 } 00:22:17.401 ] 00:22:17.401 }' 00:22:17.401 22:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:17.401 22:51:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:17.975 22:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:18.232 [2024-07-15 22:51:02.943113] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:18.232 [2024-07-15 22:51:02.943139] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12dfaa0 name Existed_Raid, state configuring 00:22:18.232 22:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:18.490 [2024-07-15 22:51:03.191802] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:18.490 [2024-07-15 22:51:03.191829] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:18.490 [2024-07-15 22:51:03.191838] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:18.490 [2024-07-15 22:51:03.191849] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:18.490 [2024-07-15 22:51:03.191858] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:18.490 [2024-07-15 22:51:03.191870] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:18.490 [2024-07-15 22:51:03.191879] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:18.490 [2024-07-15 22:51:03.191890] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:18.490 22:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:18.747 [2024-07-15 22:51:03.446211] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:18.747 BaseBdev1 00:22:18.747 22:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:22:18.747 22:51:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:22:18.747 22:51:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:18.747 22:51:03 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:18.747 22:51:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:18.747 22:51:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:18.747 22:51:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:19.005 22:51:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:19.264 [ 00:22:19.264 { 00:22:19.264 "name": "BaseBdev1", 00:22:19.264 "aliases": [ 00:22:19.264 "0cc90e15-ec2f-4c78-9066-a9b59d6dd6f7" 00:22:19.264 ], 00:22:19.264 "product_name": "Malloc disk", 00:22:19.264 "block_size": 512, 00:22:19.264 "num_blocks": 65536, 00:22:19.264 "uuid": "0cc90e15-ec2f-4c78-9066-a9b59d6dd6f7", 00:22:19.264 "assigned_rate_limits": { 00:22:19.264 "rw_ios_per_sec": 0, 00:22:19.264 "rw_mbytes_per_sec": 0, 00:22:19.264 "r_mbytes_per_sec": 0, 00:22:19.264 "w_mbytes_per_sec": 0 00:22:19.264 }, 00:22:19.264 "claimed": true, 00:22:19.264 "claim_type": "exclusive_write", 00:22:19.264 "zoned": false, 00:22:19.264 "supported_io_types": { 00:22:19.264 "read": true, 00:22:19.264 "write": true, 00:22:19.264 "unmap": true, 00:22:19.264 "flush": true, 00:22:19.264 "reset": true, 00:22:19.264 "nvme_admin": false, 00:22:19.264 "nvme_io": false, 00:22:19.264 "nvme_io_md": false, 00:22:19.264 "write_zeroes": true, 00:22:19.264 "zcopy": true, 00:22:19.264 "get_zone_info": false, 00:22:19.264 "zone_management": false, 00:22:19.264 "zone_append": false, 00:22:19.264 "compare": false, 00:22:19.264 "compare_and_write": false, 00:22:19.264 "abort": true, 00:22:19.264 "seek_hole": false, 00:22:19.264 "seek_data": false, 
00:22:19.264 "copy": true, 00:22:19.264 "nvme_iov_md": false 00:22:19.264 }, 00:22:19.264 "memory_domains": [ 00:22:19.264 { 00:22:19.264 "dma_device_id": "system", 00:22:19.264 "dma_device_type": 1 00:22:19.264 }, 00:22:19.264 { 00:22:19.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:19.264 "dma_device_type": 2 00:22:19.264 } 00:22:19.264 ], 00:22:19.264 "driver_specific": {} 00:22:19.264 } 00:22:19.264 ] 00:22:19.264 22:51:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:19.264 22:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:19.264 22:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:19.264 22:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:19.264 22:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:19.264 22:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:19.264 22:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:19.264 22:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:19.264 22:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:19.264 22:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:19.264 22:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:19.264 22:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:19.264 22:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:19.522 22:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:19.522 "name": "Existed_Raid", 00:22:19.522 "uuid": "9ecdaad5-62a4-483d-a720-93384ff72c67", 00:22:19.522 "strip_size_kb": 0, 00:22:19.522 "state": "configuring", 00:22:19.522 "raid_level": "raid1", 00:22:19.522 "superblock": true, 00:22:19.522 "num_base_bdevs": 4, 00:22:19.522 "num_base_bdevs_discovered": 1, 00:22:19.522 "num_base_bdevs_operational": 4, 00:22:19.522 "base_bdevs_list": [ 00:22:19.522 { 00:22:19.522 "name": "BaseBdev1", 00:22:19.522 "uuid": "0cc90e15-ec2f-4c78-9066-a9b59d6dd6f7", 00:22:19.522 "is_configured": true, 00:22:19.522 "data_offset": 2048, 00:22:19.522 "data_size": 63488 00:22:19.522 }, 00:22:19.522 { 00:22:19.522 "name": "BaseBdev2", 00:22:19.522 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:19.522 "is_configured": false, 00:22:19.522 "data_offset": 0, 00:22:19.522 "data_size": 0 00:22:19.522 }, 00:22:19.522 { 00:22:19.522 "name": "BaseBdev3", 00:22:19.522 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:19.522 "is_configured": false, 00:22:19.522 "data_offset": 0, 00:22:19.522 "data_size": 0 00:22:19.522 }, 00:22:19.522 { 00:22:19.522 "name": "BaseBdev4", 00:22:19.522 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:19.522 "is_configured": false, 00:22:19.522 "data_offset": 0, 00:22:19.522 "data_size": 0 00:22:19.522 } 00:22:19.522 ] 00:22:19.522 }' 00:22:19.522 22:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:19.522 22:51:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:20.090 22:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:20.348 [2024-07-15 22:51:05.070513] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: 
Existed_Raid 00:22:20.348 [2024-07-15 22:51:05.070557] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12df310 name Existed_Raid, state configuring 00:22:20.348 22:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:20.607 [2024-07-15 22:51:05.323218] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:20.607 [2024-07-15 22:51:05.324762] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:20.607 [2024-07-15 22:51:05.324795] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:20.607 [2024-07-15 22:51:05.324806] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:20.607 [2024-07-15 22:51:05.324818] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:20.607 [2024-07-15 22:51:05.324827] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:20.607 [2024-07-15 22:51:05.324838] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:20.607 22:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:22:20.607 22:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:20.607 22:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:20.607 22:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:20.607 22:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:20.607 22:51:05 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:20.607 22:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:20.607 22:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:20.607 22:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:20.607 22:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:20.607 22:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:20.607 22:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:20.607 22:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.607 22:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:20.881 22:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:20.881 "name": "Existed_Raid", 00:22:20.881 "uuid": "fd2af692-01ff-412e-8f2d-f615eceafe14", 00:22:20.881 "strip_size_kb": 0, 00:22:20.881 "state": "configuring", 00:22:20.881 "raid_level": "raid1", 00:22:20.881 "superblock": true, 00:22:20.881 "num_base_bdevs": 4, 00:22:20.881 "num_base_bdevs_discovered": 1, 00:22:20.881 "num_base_bdevs_operational": 4, 00:22:20.881 "base_bdevs_list": [ 00:22:20.881 { 00:22:20.881 "name": "BaseBdev1", 00:22:20.881 "uuid": "0cc90e15-ec2f-4c78-9066-a9b59d6dd6f7", 00:22:20.881 "is_configured": true, 00:22:20.881 "data_offset": 2048, 00:22:20.881 "data_size": 63488 00:22:20.881 }, 00:22:20.881 { 00:22:20.881 "name": "BaseBdev2", 00:22:20.881 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:20.881 "is_configured": false, 
00:22:20.881 "data_offset": 0, 00:22:20.881 "data_size": 0 00:22:20.881 }, 00:22:20.881 { 00:22:20.881 "name": "BaseBdev3", 00:22:20.881 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:20.881 "is_configured": false, 00:22:20.881 "data_offset": 0, 00:22:20.881 "data_size": 0 00:22:20.881 }, 00:22:20.881 { 00:22:20.881 "name": "BaseBdev4", 00:22:20.881 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:20.881 "is_configured": false, 00:22:20.881 "data_offset": 0, 00:22:20.881 "data_size": 0 00:22:20.881 } 00:22:20.881 ] 00:22:20.881 }' 00:22:20.881 22:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:20.881 22:51:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:21.448 22:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:21.706 [2024-07-15 22:51:06.421496] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:21.706 BaseBdev2 00:22:21.706 22:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:22:21.706 22:51:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:22:21.706 22:51:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:21.706 22:51:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:21.706 22:51:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:21.706 22:51:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:21.706 22:51:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:22:21.965 22:51:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:22.224 [ 00:22:22.224 { 00:22:22.224 "name": "BaseBdev2", 00:22:22.224 "aliases": [ 00:22:22.224 "422597d0-89f6-4de9-8c62-60b86f172a22" 00:22:22.224 ], 00:22:22.224 "product_name": "Malloc disk", 00:22:22.224 "block_size": 512, 00:22:22.224 "num_blocks": 65536, 00:22:22.224 "uuid": "422597d0-89f6-4de9-8c62-60b86f172a22", 00:22:22.224 "assigned_rate_limits": { 00:22:22.224 "rw_ios_per_sec": 0, 00:22:22.224 "rw_mbytes_per_sec": 0, 00:22:22.224 "r_mbytes_per_sec": 0, 00:22:22.224 "w_mbytes_per_sec": 0 00:22:22.224 }, 00:22:22.224 "claimed": true, 00:22:22.224 "claim_type": "exclusive_write", 00:22:22.224 "zoned": false, 00:22:22.224 "supported_io_types": { 00:22:22.224 "read": true, 00:22:22.224 "write": true, 00:22:22.224 "unmap": true, 00:22:22.224 "flush": true, 00:22:22.224 "reset": true, 00:22:22.224 "nvme_admin": false, 00:22:22.224 "nvme_io": false, 00:22:22.224 "nvme_io_md": false, 00:22:22.224 "write_zeroes": true, 00:22:22.224 "zcopy": true, 00:22:22.224 "get_zone_info": false, 00:22:22.224 "zone_management": false, 00:22:22.224 "zone_append": false, 00:22:22.224 "compare": false, 00:22:22.224 "compare_and_write": false, 00:22:22.224 "abort": true, 00:22:22.224 "seek_hole": false, 00:22:22.224 "seek_data": false, 00:22:22.224 "copy": true, 00:22:22.224 "nvme_iov_md": false 00:22:22.224 }, 00:22:22.224 "memory_domains": [ 00:22:22.224 { 00:22:22.224 "dma_device_id": "system", 00:22:22.224 "dma_device_type": 1 00:22:22.224 }, 00:22:22.224 { 00:22:22.224 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:22.224 "dma_device_type": 2 00:22:22.224 } 00:22:22.224 ], 00:22:22.224 "driver_specific": {} 00:22:22.224 } 00:22:22.224 ] 00:22:22.224 22:51:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 
-- # return 0 00:22:22.224 22:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:22.224 22:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:22.224 22:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:22.224 22:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:22.224 22:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:22.224 22:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:22.224 22:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:22.224 22:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:22.224 22:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:22.224 22:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:22.224 22:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:22.224 22:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:22.224 22:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:22.225 22:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:22.483 22:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:22.483 "name": "Existed_Raid", 00:22:22.483 "uuid": "fd2af692-01ff-412e-8f2d-f615eceafe14", 00:22:22.483 "strip_size_kb": 0, 
00:22:22.483 "state": "configuring", 00:22:22.483 "raid_level": "raid1", 00:22:22.483 "superblock": true, 00:22:22.483 "num_base_bdevs": 4, 00:22:22.483 "num_base_bdevs_discovered": 2, 00:22:22.483 "num_base_bdevs_operational": 4, 00:22:22.483 "base_bdevs_list": [ 00:22:22.483 { 00:22:22.483 "name": "BaseBdev1", 00:22:22.483 "uuid": "0cc90e15-ec2f-4c78-9066-a9b59d6dd6f7", 00:22:22.483 "is_configured": true, 00:22:22.483 "data_offset": 2048, 00:22:22.483 "data_size": 63488 00:22:22.483 }, 00:22:22.483 { 00:22:22.483 "name": "BaseBdev2", 00:22:22.483 "uuid": "422597d0-89f6-4de9-8c62-60b86f172a22", 00:22:22.483 "is_configured": true, 00:22:22.483 "data_offset": 2048, 00:22:22.483 "data_size": 63488 00:22:22.483 }, 00:22:22.483 { 00:22:22.483 "name": "BaseBdev3", 00:22:22.483 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:22.483 "is_configured": false, 00:22:22.483 "data_offset": 0, 00:22:22.483 "data_size": 0 00:22:22.483 }, 00:22:22.483 { 00:22:22.483 "name": "BaseBdev4", 00:22:22.483 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:22.483 "is_configured": false, 00:22:22.483 "data_offset": 0, 00:22:22.483 "data_size": 0 00:22:22.483 } 00:22:22.483 ] 00:22:22.483 }' 00:22:22.483 22:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:22.483 22:51:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:23.051 22:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:22:23.051 [2024-07-15 22:51:07.956904] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:23.051 BaseBdev3 00:22:23.309 22:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:22:23.309 22:51:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local 
bdev_name=BaseBdev3 00:22:23.309 22:51:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:23.309 22:51:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:23.309 22:51:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:23.309 22:51:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:23.309 22:51:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:23.309 22:51:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:22:23.567 [ 00:22:23.567 { 00:22:23.567 "name": "BaseBdev3", 00:22:23.567 "aliases": [ 00:22:23.567 "f633012e-b219-42b3-adc2-036c777717a7" 00:22:23.567 ], 00:22:23.567 "product_name": "Malloc disk", 00:22:23.567 "block_size": 512, 00:22:23.567 "num_blocks": 65536, 00:22:23.567 "uuid": "f633012e-b219-42b3-adc2-036c777717a7", 00:22:23.567 "assigned_rate_limits": { 00:22:23.567 "rw_ios_per_sec": 0, 00:22:23.567 "rw_mbytes_per_sec": 0, 00:22:23.567 "r_mbytes_per_sec": 0, 00:22:23.567 "w_mbytes_per_sec": 0 00:22:23.567 }, 00:22:23.567 "claimed": true, 00:22:23.567 "claim_type": "exclusive_write", 00:22:23.567 "zoned": false, 00:22:23.567 "supported_io_types": { 00:22:23.567 "read": true, 00:22:23.567 "write": true, 00:22:23.567 "unmap": true, 00:22:23.567 "flush": true, 00:22:23.567 "reset": true, 00:22:23.567 "nvme_admin": false, 00:22:23.567 "nvme_io": false, 00:22:23.567 "nvme_io_md": false, 00:22:23.567 "write_zeroes": true, 00:22:23.567 "zcopy": true, 00:22:23.567 "get_zone_info": false, 00:22:23.567 "zone_management": false, 00:22:23.567 "zone_append": false, 00:22:23.567 
"compare": false, 00:22:23.567 "compare_and_write": false, 00:22:23.567 "abort": true, 00:22:23.567 "seek_hole": false, 00:22:23.567 "seek_data": false, 00:22:23.567 "copy": true, 00:22:23.567 "nvme_iov_md": false 00:22:23.567 }, 00:22:23.567 "memory_domains": [ 00:22:23.567 { 00:22:23.567 "dma_device_id": "system", 00:22:23.567 "dma_device_type": 1 00:22:23.567 }, 00:22:23.567 { 00:22:23.567 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.567 "dma_device_type": 2 00:22:23.567 } 00:22:23.567 ], 00:22:23.567 "driver_specific": {} 00:22:23.567 } 00:22:23.567 ] 00:22:23.567 22:51:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:23.567 22:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:23.567 22:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:23.567 22:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:23.567 22:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:23.567 22:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:23.567 22:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:23.567 22:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:23.567 22:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:23.567 22:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:23.567 22:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:23.567 22:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:23.567 22:51:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:23.567 22:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:23.567 22:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:23.825 22:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:23.825 "name": "Existed_Raid", 00:22:23.825 "uuid": "fd2af692-01ff-412e-8f2d-f615eceafe14", 00:22:23.825 "strip_size_kb": 0, 00:22:23.825 "state": "configuring", 00:22:23.825 "raid_level": "raid1", 00:22:23.825 "superblock": true, 00:22:23.825 "num_base_bdevs": 4, 00:22:23.825 "num_base_bdevs_discovered": 3, 00:22:23.825 "num_base_bdevs_operational": 4, 00:22:23.825 "base_bdevs_list": [ 00:22:23.825 { 00:22:23.825 "name": "BaseBdev1", 00:22:23.825 "uuid": "0cc90e15-ec2f-4c78-9066-a9b59d6dd6f7", 00:22:23.825 "is_configured": true, 00:22:23.825 "data_offset": 2048, 00:22:23.825 "data_size": 63488 00:22:23.825 }, 00:22:23.825 { 00:22:23.825 "name": "BaseBdev2", 00:22:23.825 "uuid": "422597d0-89f6-4de9-8c62-60b86f172a22", 00:22:23.825 "is_configured": true, 00:22:23.825 "data_offset": 2048, 00:22:23.825 "data_size": 63488 00:22:23.825 }, 00:22:23.825 { 00:22:23.825 "name": "BaseBdev3", 00:22:23.825 "uuid": "f633012e-b219-42b3-adc2-036c777717a7", 00:22:23.825 "is_configured": true, 00:22:23.825 "data_offset": 2048, 00:22:23.825 "data_size": 63488 00:22:23.825 }, 00:22:23.825 { 00:22:23.825 "name": "BaseBdev4", 00:22:23.825 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:23.825 "is_configured": false, 00:22:23.825 "data_offset": 0, 00:22:23.825 "data_size": 0 00:22:23.825 } 00:22:23.825 ] 00:22:23.825 }' 00:22:23.825 22:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:23.825 22:51:08 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:24.392 22:51:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:24.651 [2024-07-15 22:51:09.315912] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:24.651 [2024-07-15 22:51:09.316094] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12e0350 00:22:24.651 [2024-07-15 22:51:09.316108] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:24.651 [2024-07-15 22:51:09.316282] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12e0020 00:22:24.651 [2024-07-15 22:51:09.316407] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12e0350 00:22:24.651 [2024-07-15 22:51:09.316417] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x12e0350 00:22:24.651 [2024-07-15 22:51:09.316510] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:24.651 BaseBdev4 00:22:24.651 22:51:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:22:24.651 22:51:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:22:24.651 22:51:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:24.651 22:51:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:24.651 22:51:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:24.651 22:51:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:24.651 22:51:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:24.908 22:51:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:24.908 [ 00:22:24.908 { 00:22:24.908 "name": "BaseBdev4", 00:22:24.908 "aliases": [ 00:22:24.908 "ceee1cb7-56a4-4c4e-8e1a-03246f73fb75" 00:22:24.908 ], 00:22:24.908 "product_name": "Malloc disk", 00:22:24.908 "block_size": 512, 00:22:24.908 "num_blocks": 65536, 00:22:24.908 "uuid": "ceee1cb7-56a4-4c4e-8e1a-03246f73fb75", 00:22:24.908 "assigned_rate_limits": { 00:22:24.908 "rw_ios_per_sec": 0, 00:22:24.908 "rw_mbytes_per_sec": 0, 00:22:24.908 "r_mbytes_per_sec": 0, 00:22:24.908 "w_mbytes_per_sec": 0 00:22:24.908 }, 00:22:24.908 "claimed": true, 00:22:24.908 "claim_type": "exclusive_write", 00:22:24.908 "zoned": false, 00:22:24.908 "supported_io_types": { 00:22:24.908 "read": true, 00:22:24.908 "write": true, 00:22:24.908 "unmap": true, 00:22:24.908 "flush": true, 00:22:24.908 "reset": true, 00:22:24.908 "nvme_admin": false, 00:22:24.908 "nvme_io": false, 00:22:24.908 "nvme_io_md": false, 00:22:24.908 "write_zeroes": true, 00:22:24.908 "zcopy": true, 00:22:24.909 "get_zone_info": false, 00:22:24.909 "zone_management": false, 00:22:24.909 "zone_append": false, 00:22:24.909 "compare": false, 00:22:24.909 "compare_and_write": false, 00:22:24.909 "abort": true, 00:22:24.909 "seek_hole": false, 00:22:24.909 "seek_data": false, 00:22:24.909 "copy": true, 00:22:24.909 "nvme_iov_md": false 00:22:24.909 }, 00:22:24.909 "memory_domains": [ 00:22:24.909 { 00:22:24.909 "dma_device_id": "system", 00:22:24.909 "dma_device_type": 1 00:22:24.909 }, 00:22:24.909 { 00:22:24.909 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:24.909 "dma_device_type": 2 00:22:24.909 } 00:22:24.909 ], 00:22:24.909 "driver_specific": {} 00:22:24.909 } 00:22:24.909 ] 
00:22:24.909 22:51:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:24.909 22:51:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:24.909 22:51:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:24.909 22:51:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:22:24.909 22:51:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:24.909 22:51:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:24.909 22:51:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:24.909 22:51:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:24.909 22:51:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:24.909 22:51:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:24.909 22:51:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:24.909 22:51:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:24.909 22:51:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:24.909 22:51:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:24.909 22:51:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:25.167 22:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:25.167 "name": "Existed_Raid", 00:22:25.167 
"uuid": "fd2af692-01ff-412e-8f2d-f615eceafe14", 00:22:25.167 "strip_size_kb": 0, 00:22:25.167 "state": "online", 00:22:25.167 "raid_level": "raid1", 00:22:25.167 "superblock": true, 00:22:25.167 "num_base_bdevs": 4, 00:22:25.167 "num_base_bdevs_discovered": 4, 00:22:25.167 "num_base_bdevs_operational": 4, 00:22:25.167 "base_bdevs_list": [ 00:22:25.167 { 00:22:25.167 "name": "BaseBdev1", 00:22:25.167 "uuid": "0cc90e15-ec2f-4c78-9066-a9b59d6dd6f7", 00:22:25.167 "is_configured": true, 00:22:25.167 "data_offset": 2048, 00:22:25.167 "data_size": 63488 00:22:25.167 }, 00:22:25.167 { 00:22:25.167 "name": "BaseBdev2", 00:22:25.167 "uuid": "422597d0-89f6-4de9-8c62-60b86f172a22", 00:22:25.167 "is_configured": true, 00:22:25.167 "data_offset": 2048, 00:22:25.167 "data_size": 63488 00:22:25.167 }, 00:22:25.167 { 00:22:25.167 "name": "BaseBdev3", 00:22:25.167 "uuid": "f633012e-b219-42b3-adc2-036c777717a7", 00:22:25.167 "is_configured": true, 00:22:25.167 "data_offset": 2048, 00:22:25.167 "data_size": 63488 00:22:25.167 }, 00:22:25.167 { 00:22:25.167 "name": "BaseBdev4", 00:22:25.167 "uuid": "ceee1cb7-56a4-4c4e-8e1a-03246f73fb75", 00:22:25.167 "is_configured": true, 00:22:25.167 "data_offset": 2048, 00:22:25.167 "data_size": 63488 00:22:25.167 } 00:22:25.167 ] 00:22:25.167 }' 00:22:25.167 22:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:25.167 22:51:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:25.734 22:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:22:25.734 22:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:25.734 22:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:25.734 22:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:25.734 22:51:10 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:25.734 22:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:22:25.734 22:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:25.734 22:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:25.991 [2024-07-15 22:51:10.852407] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:25.991 22:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:25.991 "name": "Existed_Raid", 00:22:25.991 "aliases": [ 00:22:25.991 "fd2af692-01ff-412e-8f2d-f615eceafe14" 00:22:25.991 ], 00:22:25.991 "product_name": "Raid Volume", 00:22:25.991 "block_size": 512, 00:22:25.991 "num_blocks": 63488, 00:22:25.991 "uuid": "fd2af692-01ff-412e-8f2d-f615eceafe14", 00:22:25.991 "assigned_rate_limits": { 00:22:25.991 "rw_ios_per_sec": 0, 00:22:25.991 "rw_mbytes_per_sec": 0, 00:22:25.991 "r_mbytes_per_sec": 0, 00:22:25.991 "w_mbytes_per_sec": 0 00:22:25.991 }, 00:22:25.991 "claimed": false, 00:22:25.991 "zoned": false, 00:22:25.991 "supported_io_types": { 00:22:25.991 "read": true, 00:22:25.991 "write": true, 00:22:25.991 "unmap": false, 00:22:25.991 "flush": false, 00:22:25.991 "reset": true, 00:22:25.991 "nvme_admin": false, 00:22:25.991 "nvme_io": false, 00:22:25.991 "nvme_io_md": false, 00:22:25.991 "write_zeroes": true, 00:22:25.991 "zcopy": false, 00:22:25.991 "get_zone_info": false, 00:22:25.991 "zone_management": false, 00:22:25.991 "zone_append": false, 00:22:25.991 "compare": false, 00:22:25.991 "compare_and_write": false, 00:22:25.991 "abort": false, 00:22:25.992 "seek_hole": false, 00:22:25.992 "seek_data": false, 00:22:25.992 "copy": false, 00:22:25.992 "nvme_iov_md": false 00:22:25.992 }, 00:22:25.992 
"memory_domains": [ 00:22:25.992 { 00:22:25.992 "dma_device_id": "system", 00:22:25.992 "dma_device_type": 1 00:22:25.992 }, 00:22:25.992 { 00:22:25.992 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:25.992 "dma_device_type": 2 00:22:25.992 }, 00:22:25.992 { 00:22:25.992 "dma_device_id": "system", 00:22:25.992 "dma_device_type": 1 00:22:25.992 }, 00:22:25.992 { 00:22:25.992 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:25.992 "dma_device_type": 2 00:22:25.992 }, 00:22:25.992 { 00:22:25.992 "dma_device_id": "system", 00:22:25.992 "dma_device_type": 1 00:22:25.992 }, 00:22:25.992 { 00:22:25.992 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:25.992 "dma_device_type": 2 00:22:25.992 }, 00:22:25.992 { 00:22:25.992 "dma_device_id": "system", 00:22:25.992 "dma_device_type": 1 00:22:25.992 }, 00:22:25.992 { 00:22:25.992 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:25.992 "dma_device_type": 2 00:22:25.992 } 00:22:25.992 ], 00:22:25.992 "driver_specific": { 00:22:25.992 "raid": { 00:22:25.992 "uuid": "fd2af692-01ff-412e-8f2d-f615eceafe14", 00:22:25.992 "strip_size_kb": 0, 00:22:25.992 "state": "online", 00:22:25.992 "raid_level": "raid1", 00:22:25.992 "superblock": true, 00:22:25.992 "num_base_bdevs": 4, 00:22:25.992 "num_base_bdevs_discovered": 4, 00:22:25.992 "num_base_bdevs_operational": 4, 00:22:25.992 "base_bdevs_list": [ 00:22:25.992 { 00:22:25.992 "name": "BaseBdev1", 00:22:25.992 "uuid": "0cc90e15-ec2f-4c78-9066-a9b59d6dd6f7", 00:22:25.992 "is_configured": true, 00:22:25.992 "data_offset": 2048, 00:22:25.992 "data_size": 63488 00:22:25.992 }, 00:22:25.992 { 00:22:25.992 "name": "BaseBdev2", 00:22:25.992 "uuid": "422597d0-89f6-4de9-8c62-60b86f172a22", 00:22:25.992 "is_configured": true, 00:22:25.992 "data_offset": 2048, 00:22:25.992 "data_size": 63488 00:22:25.992 }, 00:22:25.992 { 00:22:25.992 "name": "BaseBdev3", 00:22:25.992 "uuid": "f633012e-b219-42b3-adc2-036c777717a7", 00:22:25.992 "is_configured": true, 00:22:25.992 "data_offset": 2048, 00:22:25.992 
"data_size": 63488 00:22:25.992 }, 00:22:25.992 { 00:22:25.992 "name": "BaseBdev4", 00:22:25.992 "uuid": "ceee1cb7-56a4-4c4e-8e1a-03246f73fb75", 00:22:25.992 "is_configured": true, 00:22:25.992 "data_offset": 2048, 00:22:25.992 "data_size": 63488 00:22:25.992 } 00:22:25.992 ] 00:22:25.992 } 00:22:25.992 } 00:22:25.992 }' 00:22:25.992 22:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:26.249 22:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:22:26.249 BaseBdev2 00:22:26.249 BaseBdev3 00:22:26.249 BaseBdev4' 00:22:26.249 22:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:26.249 22:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:22:26.249 22:51:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:26.249 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:26.249 "name": "BaseBdev1", 00:22:26.249 "aliases": [ 00:22:26.249 "0cc90e15-ec2f-4c78-9066-a9b59d6dd6f7" 00:22:26.249 ], 00:22:26.249 "product_name": "Malloc disk", 00:22:26.249 "block_size": 512, 00:22:26.249 "num_blocks": 65536, 00:22:26.249 "uuid": "0cc90e15-ec2f-4c78-9066-a9b59d6dd6f7", 00:22:26.249 "assigned_rate_limits": { 00:22:26.249 "rw_ios_per_sec": 0, 00:22:26.249 "rw_mbytes_per_sec": 0, 00:22:26.249 "r_mbytes_per_sec": 0, 00:22:26.249 "w_mbytes_per_sec": 0 00:22:26.249 }, 00:22:26.249 "claimed": true, 00:22:26.249 "claim_type": "exclusive_write", 00:22:26.249 "zoned": false, 00:22:26.249 "supported_io_types": { 00:22:26.249 "read": true, 00:22:26.249 "write": true, 00:22:26.249 "unmap": true, 00:22:26.249 "flush": true, 00:22:26.249 "reset": true, 
00:22:26.249 "nvme_admin": false, 00:22:26.249 "nvme_io": false, 00:22:26.249 "nvme_io_md": false, 00:22:26.249 "write_zeroes": true, 00:22:26.249 "zcopy": true, 00:22:26.249 "get_zone_info": false, 00:22:26.249 "zone_management": false, 00:22:26.249 "zone_append": false, 00:22:26.249 "compare": false, 00:22:26.249 "compare_and_write": false, 00:22:26.249 "abort": true, 00:22:26.249 "seek_hole": false, 00:22:26.249 "seek_data": false, 00:22:26.249 "copy": true, 00:22:26.249 "nvme_iov_md": false 00:22:26.249 }, 00:22:26.249 "memory_domains": [ 00:22:26.249 { 00:22:26.249 "dma_device_id": "system", 00:22:26.249 "dma_device_type": 1 00:22:26.249 }, 00:22:26.249 { 00:22:26.249 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:26.249 "dma_device_type": 2 00:22:26.250 } 00:22:26.250 ], 00:22:26.250 "driver_specific": {} 00:22:26.250 }' 00:22:26.250 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:26.510 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:26.510 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:26.510 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:26.510 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:26.510 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:26.510 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:26.510 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:26.813 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:26.813 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:26.813 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:22:26.813 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:26.813 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:26.813 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:26.813 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:27.073 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:27.073 "name": "BaseBdev2", 00:22:27.073 "aliases": [ 00:22:27.073 "422597d0-89f6-4de9-8c62-60b86f172a22" 00:22:27.073 ], 00:22:27.073 "product_name": "Malloc disk", 00:22:27.073 "block_size": 512, 00:22:27.073 "num_blocks": 65536, 00:22:27.073 "uuid": "422597d0-89f6-4de9-8c62-60b86f172a22", 00:22:27.073 "assigned_rate_limits": { 00:22:27.073 "rw_ios_per_sec": 0, 00:22:27.073 "rw_mbytes_per_sec": 0, 00:22:27.073 "r_mbytes_per_sec": 0, 00:22:27.073 "w_mbytes_per_sec": 0 00:22:27.073 }, 00:22:27.073 "claimed": true, 00:22:27.073 "claim_type": "exclusive_write", 00:22:27.073 "zoned": false, 00:22:27.073 "supported_io_types": { 00:22:27.073 "read": true, 00:22:27.073 "write": true, 00:22:27.073 "unmap": true, 00:22:27.073 "flush": true, 00:22:27.073 "reset": true, 00:22:27.073 "nvme_admin": false, 00:22:27.073 "nvme_io": false, 00:22:27.073 "nvme_io_md": false, 00:22:27.073 "write_zeroes": true, 00:22:27.073 "zcopy": true, 00:22:27.073 "get_zone_info": false, 00:22:27.073 "zone_management": false, 00:22:27.073 "zone_append": false, 00:22:27.073 "compare": false, 00:22:27.073 "compare_and_write": false, 00:22:27.073 "abort": true, 00:22:27.073 "seek_hole": false, 00:22:27.073 "seek_data": false, 00:22:27.073 "copy": true, 00:22:27.073 "nvme_iov_md": false 00:22:27.073 }, 00:22:27.073 "memory_domains": [ 00:22:27.073 { 
00:22:27.073 "dma_device_id": "system", 00:22:27.073 "dma_device_type": 1 00:22:27.073 }, 00:22:27.073 { 00:22:27.073 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:27.073 "dma_device_type": 2 00:22:27.073 } 00:22:27.073 ], 00:22:27.073 "driver_specific": {} 00:22:27.073 }' 00:22:27.073 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:27.073 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:27.073 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:27.073 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:27.073 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:27.073 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:27.073 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:27.073 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:27.073 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:27.073 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:27.332 22:51:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:27.332 22:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:27.332 22:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:27.332 22:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:27.332 22:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:27.332 22:51:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:27.332 "name": "BaseBdev3", 00:22:27.332 "aliases": [ 00:22:27.332 "f633012e-b219-42b3-adc2-036c777717a7" 00:22:27.332 ], 00:22:27.332 "product_name": "Malloc disk", 00:22:27.332 "block_size": 512, 00:22:27.332 "num_blocks": 65536, 00:22:27.332 "uuid": "f633012e-b219-42b3-adc2-036c777717a7", 00:22:27.332 "assigned_rate_limits": { 00:22:27.332 "rw_ios_per_sec": 0, 00:22:27.332 "rw_mbytes_per_sec": 0, 00:22:27.332 "r_mbytes_per_sec": 0, 00:22:27.332 "w_mbytes_per_sec": 0 00:22:27.332 }, 00:22:27.332 "claimed": true, 00:22:27.332 "claim_type": "exclusive_write", 00:22:27.332 "zoned": false, 00:22:27.332 "supported_io_types": { 00:22:27.332 "read": true, 00:22:27.332 "write": true, 00:22:27.332 "unmap": true, 00:22:27.332 "flush": true, 00:22:27.332 "reset": true, 00:22:27.332 "nvme_admin": false, 00:22:27.332 "nvme_io": false, 00:22:27.332 "nvme_io_md": false, 00:22:27.332 "write_zeroes": true, 00:22:27.332 "zcopy": true, 00:22:27.332 "get_zone_info": false, 00:22:27.332 "zone_management": false, 00:22:27.332 "zone_append": false, 00:22:27.332 "compare": false, 00:22:27.332 "compare_and_write": false, 00:22:27.332 "abort": true, 00:22:27.332 "seek_hole": false, 00:22:27.332 "seek_data": false, 00:22:27.332 "copy": true, 00:22:27.332 "nvme_iov_md": false 00:22:27.332 }, 00:22:27.332 "memory_domains": [ 00:22:27.332 { 00:22:27.332 "dma_device_id": "system", 00:22:27.332 "dma_device_type": 1 00:22:27.332 }, 00:22:27.332 { 00:22:27.332 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:27.332 "dma_device_type": 2 00:22:27.332 } 00:22:27.332 ], 00:22:27.332 "driver_specific": {} 00:22:27.332 }' 00:22:27.332 22:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:27.591 22:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:27.591 22:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:22:27.591 22:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:27.591 22:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:27.591 22:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:27.591 22:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:27.591 22:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:27.849 22:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:27.849 22:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:27.849 22:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:27.849 22:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:27.849 22:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:27.849 22:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:27.849 22:51:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:28.415 22:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:28.415 "name": "BaseBdev4", 00:22:28.415 "aliases": [ 00:22:28.415 "ceee1cb7-56a4-4c4e-8e1a-03246f73fb75" 00:22:28.415 ], 00:22:28.415 "product_name": "Malloc disk", 00:22:28.415 "block_size": 512, 00:22:28.415 "num_blocks": 65536, 00:22:28.415 "uuid": "ceee1cb7-56a4-4c4e-8e1a-03246f73fb75", 00:22:28.415 "assigned_rate_limits": { 00:22:28.415 "rw_ios_per_sec": 0, 00:22:28.415 "rw_mbytes_per_sec": 0, 00:22:28.415 "r_mbytes_per_sec": 0, 00:22:28.415 "w_mbytes_per_sec": 0 
00:22:28.415 }, 00:22:28.415 "claimed": true, 00:22:28.415 "claim_type": "exclusive_write", 00:22:28.415 "zoned": false, 00:22:28.415 "supported_io_types": { 00:22:28.415 "read": true, 00:22:28.415 "write": true, 00:22:28.415 "unmap": true, 00:22:28.415 "flush": true, 00:22:28.415 "reset": true, 00:22:28.415 "nvme_admin": false, 00:22:28.415 "nvme_io": false, 00:22:28.415 "nvme_io_md": false, 00:22:28.415 "write_zeroes": true, 00:22:28.415 "zcopy": true, 00:22:28.415 "get_zone_info": false, 00:22:28.415 "zone_management": false, 00:22:28.415 "zone_append": false, 00:22:28.415 "compare": false, 00:22:28.415 "compare_and_write": false, 00:22:28.415 "abort": true, 00:22:28.415 "seek_hole": false, 00:22:28.415 "seek_data": false, 00:22:28.415 "copy": true, 00:22:28.415 "nvme_iov_md": false 00:22:28.415 }, 00:22:28.415 "memory_domains": [ 00:22:28.415 { 00:22:28.415 "dma_device_id": "system", 00:22:28.415 "dma_device_type": 1 00:22:28.415 }, 00:22:28.415 { 00:22:28.415 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:28.415 "dma_device_type": 2 00:22:28.415 } 00:22:28.415 ], 00:22:28.415 "driver_specific": {} 00:22:28.415 }' 00:22:28.415 22:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:28.415 22:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:28.415 22:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:28.415 22:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:28.673 22:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:28.673 22:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:28.673 22:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:28.673 22:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:28.673 
22:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:28.673 22:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:28.931 22:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:28.931 22:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:28.931 22:51:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:29.498 [2024-07-15 22:51:14.128809] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:29.498 22:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:22:29.498 22:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:22:29.498 22:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:29.498 22:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:22:29.498 22:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:22:29.498 22:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:22:29.498 22:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:29.498 22:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:29.498 22:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:29.498 22:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:29.498 22:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:22:29.498 22:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:29.498 22:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:29.498 22:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:29.498 22:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:29.498 22:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:29.498 22:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:30.066 22:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:30.066 "name": "Existed_Raid", 00:22:30.066 "uuid": "fd2af692-01ff-412e-8f2d-f615eceafe14", 00:22:30.066 "strip_size_kb": 0, 00:22:30.066 "state": "online", 00:22:30.066 "raid_level": "raid1", 00:22:30.066 "superblock": true, 00:22:30.066 "num_base_bdevs": 4, 00:22:30.066 "num_base_bdevs_discovered": 3, 00:22:30.066 "num_base_bdevs_operational": 3, 00:22:30.066 "base_bdevs_list": [ 00:22:30.066 { 00:22:30.066 "name": null, 00:22:30.066 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:30.066 "is_configured": false, 00:22:30.066 "data_offset": 2048, 00:22:30.066 "data_size": 63488 00:22:30.066 }, 00:22:30.066 { 00:22:30.066 "name": "BaseBdev2", 00:22:30.066 "uuid": "422597d0-89f6-4de9-8c62-60b86f172a22", 00:22:30.066 "is_configured": true, 00:22:30.066 "data_offset": 2048, 00:22:30.066 "data_size": 63488 00:22:30.066 }, 00:22:30.066 { 00:22:30.066 "name": "BaseBdev3", 00:22:30.066 "uuid": "f633012e-b219-42b3-adc2-036c777717a7", 00:22:30.066 "is_configured": true, 00:22:30.066 "data_offset": 2048, 00:22:30.066 "data_size": 63488 00:22:30.066 }, 00:22:30.066 { 00:22:30.066 "name": 
"BaseBdev4", 00:22:30.066 "uuid": "ceee1cb7-56a4-4c4e-8e1a-03246f73fb75", 00:22:30.066 "is_configured": true, 00:22:30.066 "data_offset": 2048, 00:22:30.066 "data_size": 63488 00:22:30.066 } 00:22:30.066 ] 00:22:30.066 }' 00:22:30.066 22:51:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:30.066 22:51:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:30.660 22:51:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:22:30.660 22:51:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:30.660 22:51:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:30.660 22:51:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:31.228 22:51:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:31.228 22:51:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:31.228 22:51:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:22:31.228 [2024-07-15 22:51:16.051900] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:31.228 22:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:31.228 22:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:31.228 22:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:31.228 22:51:16 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:31.795 22:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:31.795 22:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:31.795 22:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:22:32.078 [2024-07-15 22:51:16.830510] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:32.078 22:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:32.078 22:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:32.078 22:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:32.078 22:51:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:32.337 22:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:32.337 22:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:32.337 22:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:22:32.596 [2024-07-15 22:51:17.324010] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:22:32.596 [2024-07-15 22:51:17.324095] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:32.596 [2024-07-15 22:51:17.334797] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:32.596 [2024-07-15 22:51:17.334832] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:32.596 [2024-07-15 22:51:17.334844] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12e0350 name Existed_Raid, state offline 00:22:32.596 22:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:32.596 22:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:32.596 22:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:32.596 22:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:22:32.854 22:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:22:32.854 22:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:22:32.854 22:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:22:32.854 22:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:22:32.854 22:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:32.854 22:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:33.113 BaseBdev2 00:22:33.113 22:51:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:22:33.113 22:51:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:22:33.113 22:51:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:33.113 22:51:17 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:22:33.113 22:51:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:33.113 22:51:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:33.113 22:51:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:33.680 22:51:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:33.938 [ 00:22:33.938 { 00:22:33.938 "name": "BaseBdev2", 00:22:33.938 "aliases": [ 00:22:33.938 "aac71433-9184-4c80-98a5-7160006b14f9" 00:22:33.938 ], 00:22:33.938 "product_name": "Malloc disk", 00:22:33.938 "block_size": 512, 00:22:33.938 "num_blocks": 65536, 00:22:33.938 "uuid": "aac71433-9184-4c80-98a5-7160006b14f9", 00:22:33.938 "assigned_rate_limits": { 00:22:33.938 "rw_ios_per_sec": 0, 00:22:33.938 "rw_mbytes_per_sec": 0, 00:22:33.938 "r_mbytes_per_sec": 0, 00:22:33.938 "w_mbytes_per_sec": 0 00:22:33.938 }, 00:22:33.938 "claimed": false, 00:22:33.938 "zoned": false, 00:22:33.938 "supported_io_types": { 00:22:33.938 "read": true, 00:22:33.938 "write": true, 00:22:33.938 "unmap": true, 00:22:33.938 "flush": true, 00:22:33.938 "reset": true, 00:22:33.938 "nvme_admin": false, 00:22:33.938 "nvme_io": false, 00:22:33.938 "nvme_io_md": false, 00:22:33.938 "write_zeroes": true, 00:22:33.938 "zcopy": true, 00:22:33.938 "get_zone_info": false, 00:22:33.938 "zone_management": false, 00:22:33.938 "zone_append": false, 00:22:33.938 "compare": false, 00:22:33.938 "compare_and_write": false, 00:22:33.938 "abort": true, 00:22:33.938 "seek_hole": false, 00:22:33.938 "seek_data": false, 00:22:33.938 "copy": true, 00:22:33.938 "nvme_iov_md": false 00:22:33.938 }, 00:22:33.938 
"memory_domains": [ 00:22:33.938 { 00:22:33.938 "dma_device_id": "system", 00:22:33.938 "dma_device_type": 1 00:22:33.938 }, 00:22:33.938 { 00:22:33.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:33.938 "dma_device_type": 2 00:22:33.938 } 00:22:33.938 ], 00:22:33.938 "driver_specific": {} 00:22:33.938 } 00:22:33.938 ] 00:22:33.938 22:51:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:33.938 22:51:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:33.938 22:51:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:33.938 22:51:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:22:34.196 BaseBdev3 00:22:34.453 22:51:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:22:34.453 22:51:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:22:34.453 22:51:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:34.453 22:51:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:34.453 22:51:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:34.453 22:51:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:34.453 22:51:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:34.711 22:51:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 -t 2000 00:22:34.968 [ 00:22:34.968 { 00:22:34.968 "name": "BaseBdev3", 00:22:34.968 "aliases": [ 00:22:34.968 "367941a4-7527-44e8-b10b-4d0c801ea6d2" 00:22:34.968 ], 00:22:34.968 "product_name": "Malloc disk", 00:22:34.968 "block_size": 512, 00:22:34.968 "num_blocks": 65536, 00:22:34.968 "uuid": "367941a4-7527-44e8-b10b-4d0c801ea6d2", 00:22:34.968 "assigned_rate_limits": { 00:22:34.968 "rw_ios_per_sec": 0, 00:22:34.968 "rw_mbytes_per_sec": 0, 00:22:34.968 "r_mbytes_per_sec": 0, 00:22:34.968 "w_mbytes_per_sec": 0 00:22:34.968 }, 00:22:34.968 "claimed": false, 00:22:34.968 "zoned": false, 00:22:34.968 "supported_io_types": { 00:22:34.968 "read": true, 00:22:34.968 "write": true, 00:22:34.968 "unmap": true, 00:22:34.968 "flush": true, 00:22:34.968 "reset": true, 00:22:34.968 "nvme_admin": false, 00:22:34.968 "nvme_io": false, 00:22:34.968 "nvme_io_md": false, 00:22:34.968 "write_zeroes": true, 00:22:34.968 "zcopy": true, 00:22:34.968 "get_zone_info": false, 00:22:34.968 "zone_management": false, 00:22:34.969 "zone_append": false, 00:22:34.969 "compare": false, 00:22:34.969 "compare_and_write": false, 00:22:34.969 "abort": true, 00:22:34.969 "seek_hole": false, 00:22:34.969 "seek_data": false, 00:22:34.969 "copy": true, 00:22:34.969 "nvme_iov_md": false 00:22:34.969 }, 00:22:34.969 "memory_domains": [ 00:22:34.969 { 00:22:34.969 "dma_device_id": "system", 00:22:34.969 "dma_device_type": 1 00:22:34.969 }, 00:22:34.969 { 00:22:34.969 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:34.969 "dma_device_type": 2 00:22:34.969 } 00:22:34.969 ], 00:22:34.969 "driver_specific": {} 00:22:34.969 } 00:22:34.969 ] 00:22:35.227 22:51:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:35.227 22:51:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:35.227 22:51:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:35.227 22:51:19 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:35.227 BaseBdev4 00:22:35.227 22:51:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:22:35.227 22:51:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:22:35.227 22:51:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:35.227 22:51:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:35.227 22:51:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:35.227 22:51:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:35.227 22:51:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:35.484 22:51:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:36.051 [ 00:22:36.051 { 00:22:36.051 "name": "BaseBdev4", 00:22:36.051 "aliases": [ 00:22:36.051 "1990d208-dd8e-4ad5-a7c9-5ba5050d77ac" 00:22:36.051 ], 00:22:36.051 "product_name": "Malloc disk", 00:22:36.051 "block_size": 512, 00:22:36.051 "num_blocks": 65536, 00:22:36.051 "uuid": "1990d208-dd8e-4ad5-a7c9-5ba5050d77ac", 00:22:36.051 "assigned_rate_limits": { 00:22:36.051 "rw_ios_per_sec": 0, 00:22:36.051 "rw_mbytes_per_sec": 0, 00:22:36.051 "r_mbytes_per_sec": 0, 00:22:36.051 "w_mbytes_per_sec": 0 00:22:36.051 }, 00:22:36.051 "claimed": false, 00:22:36.051 "zoned": false, 00:22:36.051 "supported_io_types": { 00:22:36.051 "read": true, 
00:22:36.051 "write": true, 00:22:36.051 "unmap": true, 00:22:36.051 "flush": true, 00:22:36.051 "reset": true, 00:22:36.051 "nvme_admin": false, 00:22:36.051 "nvme_io": false, 00:22:36.051 "nvme_io_md": false, 00:22:36.051 "write_zeroes": true, 00:22:36.051 "zcopy": true, 00:22:36.051 "get_zone_info": false, 00:22:36.051 "zone_management": false, 00:22:36.051 "zone_append": false, 00:22:36.051 "compare": false, 00:22:36.051 "compare_and_write": false, 00:22:36.051 "abort": true, 00:22:36.051 "seek_hole": false, 00:22:36.051 "seek_data": false, 00:22:36.051 "copy": true, 00:22:36.051 "nvme_iov_md": false 00:22:36.051 }, 00:22:36.051 "memory_domains": [ 00:22:36.051 { 00:22:36.051 "dma_device_id": "system", 00:22:36.051 "dma_device_type": 1 00:22:36.051 }, 00:22:36.051 { 00:22:36.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:36.051 "dma_device_type": 2 00:22:36.051 } 00:22:36.051 ], 00:22:36.051 "driver_specific": {} 00:22:36.051 } 00:22:36.051 ] 00:22:36.051 22:51:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:36.051 22:51:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:36.051 22:51:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:36.051 22:51:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:36.309 [2024-07-15 22:51:21.122204] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:36.309 [2024-07-15 22:51:21.122245] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:36.309 [2024-07-15 22:51:21.122263] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:36.309 [2024-07-15 22:51:21.123591] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:36.309 [2024-07-15 22:51:21.123634] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:36.309 22:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:36.309 22:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:36.309 22:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:36.309 22:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:36.309 22:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:36.309 22:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:36.309 22:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:36.309 22:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:36.309 22:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:36.309 22:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:36.309 22:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:36.309 22:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:36.567 22:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:36.567 "name": "Existed_Raid", 00:22:36.567 "uuid": "042b5e31-0ff1-4ce0-9233-28007dc6970a", 00:22:36.567 "strip_size_kb": 0, 00:22:36.567 "state": 
"configuring", 00:22:36.567 "raid_level": "raid1", 00:22:36.567 "superblock": true, 00:22:36.567 "num_base_bdevs": 4, 00:22:36.567 "num_base_bdevs_discovered": 3, 00:22:36.567 "num_base_bdevs_operational": 4, 00:22:36.567 "base_bdevs_list": [ 00:22:36.567 { 00:22:36.567 "name": "BaseBdev1", 00:22:36.567 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:36.567 "is_configured": false, 00:22:36.567 "data_offset": 0, 00:22:36.567 "data_size": 0 00:22:36.567 }, 00:22:36.567 { 00:22:36.567 "name": "BaseBdev2", 00:22:36.567 "uuid": "aac71433-9184-4c80-98a5-7160006b14f9", 00:22:36.567 "is_configured": true, 00:22:36.567 "data_offset": 2048, 00:22:36.567 "data_size": 63488 00:22:36.567 }, 00:22:36.567 { 00:22:36.567 "name": "BaseBdev3", 00:22:36.567 "uuid": "367941a4-7527-44e8-b10b-4d0c801ea6d2", 00:22:36.567 "is_configured": true, 00:22:36.567 "data_offset": 2048, 00:22:36.567 "data_size": 63488 00:22:36.567 }, 00:22:36.567 { 00:22:36.567 "name": "BaseBdev4", 00:22:36.567 "uuid": "1990d208-dd8e-4ad5-a7c9-5ba5050d77ac", 00:22:36.567 "is_configured": true, 00:22:36.567 "data_offset": 2048, 00:22:36.567 "data_size": 63488 00:22:36.567 } 00:22:36.567 ] 00:22:36.567 }' 00:22:36.567 22:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:36.567 22:51:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:37.164 22:51:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:37.421 [2024-07-15 22:51:22.160934] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:37.421 22:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:37.421 22:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:37.421 
22:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:37.421 22:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:37.421 22:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:37.421 22:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:37.421 22:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:37.421 22:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:37.421 22:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:37.421 22:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:37.421 22:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:37.421 22:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:37.679 22:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:37.679 "name": "Existed_Raid", 00:22:37.679 "uuid": "042b5e31-0ff1-4ce0-9233-28007dc6970a", 00:22:37.679 "strip_size_kb": 0, 00:22:37.679 "state": "configuring", 00:22:37.679 "raid_level": "raid1", 00:22:37.679 "superblock": true, 00:22:37.679 "num_base_bdevs": 4, 00:22:37.679 "num_base_bdevs_discovered": 2, 00:22:37.679 "num_base_bdevs_operational": 4, 00:22:37.679 "base_bdevs_list": [ 00:22:37.679 { 00:22:37.679 "name": "BaseBdev1", 00:22:37.679 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:37.679 "is_configured": false, 00:22:37.679 "data_offset": 0, 00:22:37.679 "data_size": 0 00:22:37.679 }, 00:22:37.679 { 00:22:37.679 
"name": null, 00:22:37.679 "uuid": "aac71433-9184-4c80-98a5-7160006b14f9", 00:22:37.679 "is_configured": false, 00:22:37.679 "data_offset": 2048, 00:22:37.679 "data_size": 63488 00:22:37.679 }, 00:22:37.679 { 00:22:37.679 "name": "BaseBdev3", 00:22:37.679 "uuid": "367941a4-7527-44e8-b10b-4d0c801ea6d2", 00:22:37.679 "is_configured": true, 00:22:37.679 "data_offset": 2048, 00:22:37.679 "data_size": 63488 00:22:37.679 }, 00:22:37.679 { 00:22:37.679 "name": "BaseBdev4", 00:22:37.679 "uuid": "1990d208-dd8e-4ad5-a7c9-5ba5050d77ac", 00:22:37.679 "is_configured": true, 00:22:37.679 "data_offset": 2048, 00:22:37.679 "data_size": 63488 00:22:37.679 } 00:22:37.679 ] 00:22:37.679 }' 00:22:37.679 22:51:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:37.679 22:51:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:38.245 22:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:38.245 22:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:38.502 22:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:22:38.502 22:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:38.759 [2024-07-15 22:51:23.524018] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:38.759 BaseBdev1 00:22:38.759 22:51:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:22:38.759 22:51:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:22:38.759 22:51:23 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:38.759 22:51:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:38.759 22:51:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:38.759 22:51:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:38.759 22:51:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:39.017 22:51:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:39.276 [ 00:22:39.276 { 00:22:39.276 "name": "BaseBdev1", 00:22:39.276 "aliases": [ 00:22:39.276 "efa1c275-2c2c-4205-9898-9fa72d043915" 00:22:39.276 ], 00:22:39.276 "product_name": "Malloc disk", 00:22:39.276 "block_size": 512, 00:22:39.276 "num_blocks": 65536, 00:22:39.276 "uuid": "efa1c275-2c2c-4205-9898-9fa72d043915", 00:22:39.276 "assigned_rate_limits": { 00:22:39.276 "rw_ios_per_sec": 0, 00:22:39.276 "rw_mbytes_per_sec": 0, 00:22:39.276 "r_mbytes_per_sec": 0, 00:22:39.276 "w_mbytes_per_sec": 0 00:22:39.276 }, 00:22:39.276 "claimed": true, 00:22:39.276 "claim_type": "exclusive_write", 00:22:39.276 "zoned": false, 00:22:39.276 "supported_io_types": { 00:22:39.276 "read": true, 00:22:39.276 "write": true, 00:22:39.276 "unmap": true, 00:22:39.276 "flush": true, 00:22:39.276 "reset": true, 00:22:39.276 "nvme_admin": false, 00:22:39.276 "nvme_io": false, 00:22:39.276 "nvme_io_md": false, 00:22:39.276 "write_zeroes": true, 00:22:39.276 "zcopy": true, 00:22:39.276 "get_zone_info": false, 00:22:39.276 "zone_management": false, 00:22:39.276 "zone_append": false, 00:22:39.276 "compare": false, 00:22:39.276 
"compare_and_write": false, 00:22:39.276 "abort": true, 00:22:39.276 "seek_hole": false, 00:22:39.276 "seek_data": false, 00:22:39.276 "copy": true, 00:22:39.276 "nvme_iov_md": false 00:22:39.276 }, 00:22:39.276 "memory_domains": [ 00:22:39.276 { 00:22:39.276 "dma_device_id": "system", 00:22:39.276 "dma_device_type": 1 00:22:39.276 }, 00:22:39.276 { 00:22:39.276 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:39.276 "dma_device_type": 2 00:22:39.276 } 00:22:39.276 ], 00:22:39.276 "driver_specific": {} 00:22:39.276 } 00:22:39.276 ] 00:22:39.276 22:51:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:39.276 22:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:39.276 22:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:39.276 22:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:39.276 22:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:39.276 22:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:39.276 22:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:39.276 22:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:39.276 22:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:39.276 22:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:39.276 22:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:39.276 22:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.276 22:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:39.534 22:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:39.534 "name": "Existed_Raid", 00:22:39.534 "uuid": "042b5e31-0ff1-4ce0-9233-28007dc6970a", 00:22:39.534 "strip_size_kb": 0, 00:22:39.534 "state": "configuring", 00:22:39.534 "raid_level": "raid1", 00:22:39.534 "superblock": true, 00:22:39.534 "num_base_bdevs": 4, 00:22:39.534 "num_base_bdevs_discovered": 3, 00:22:39.535 "num_base_bdevs_operational": 4, 00:22:39.535 "base_bdevs_list": [ 00:22:39.535 { 00:22:39.535 "name": "BaseBdev1", 00:22:39.535 "uuid": "efa1c275-2c2c-4205-9898-9fa72d043915", 00:22:39.535 "is_configured": true, 00:22:39.535 "data_offset": 2048, 00:22:39.535 "data_size": 63488 00:22:39.535 }, 00:22:39.535 { 00:22:39.535 "name": null, 00:22:39.535 "uuid": "aac71433-9184-4c80-98a5-7160006b14f9", 00:22:39.535 "is_configured": false, 00:22:39.535 "data_offset": 2048, 00:22:39.535 "data_size": 63488 00:22:39.535 }, 00:22:39.535 { 00:22:39.535 "name": "BaseBdev3", 00:22:39.535 "uuid": "367941a4-7527-44e8-b10b-4d0c801ea6d2", 00:22:39.535 "is_configured": true, 00:22:39.535 "data_offset": 2048, 00:22:39.535 "data_size": 63488 00:22:39.535 }, 00:22:39.535 { 00:22:39.535 "name": "BaseBdev4", 00:22:39.535 "uuid": "1990d208-dd8e-4ad5-a7c9-5ba5050d77ac", 00:22:39.535 "is_configured": true, 00:22:39.535 "data_offset": 2048, 00:22:39.535 "data_size": 63488 00:22:39.535 } 00:22:39.535 ] 00:22:39.535 }' 00:22:39.535 22:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:39.535 22:51:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:40.111 22:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:40.111 22:51:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:40.111 22:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:22:40.111 22:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:22:40.680 [2024-07-15 22:51:25.493261] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:40.680 22:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:40.680 22:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:40.680 22:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:40.680 22:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:40.680 22:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:40.680 22:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:40.680 22:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:40.680 22:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:40.680 22:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:40.680 22:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:40.680 22:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:40.680 22:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:40.939 22:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:40.939 "name": "Existed_Raid", 00:22:40.939 "uuid": "042b5e31-0ff1-4ce0-9233-28007dc6970a", 00:22:40.939 "strip_size_kb": 0, 00:22:40.939 "state": "configuring", 00:22:40.939 "raid_level": "raid1", 00:22:40.939 "superblock": true, 00:22:40.939 "num_base_bdevs": 4, 00:22:40.939 "num_base_bdevs_discovered": 2, 00:22:40.939 "num_base_bdevs_operational": 4, 00:22:40.939 "base_bdevs_list": [ 00:22:40.939 { 00:22:40.939 "name": "BaseBdev1", 00:22:40.939 "uuid": "efa1c275-2c2c-4205-9898-9fa72d043915", 00:22:40.939 "is_configured": true, 00:22:40.939 "data_offset": 2048, 00:22:40.939 "data_size": 63488 00:22:40.939 }, 00:22:40.939 { 00:22:40.939 "name": null, 00:22:40.939 "uuid": "aac71433-9184-4c80-98a5-7160006b14f9", 00:22:40.939 "is_configured": false, 00:22:40.939 "data_offset": 2048, 00:22:40.939 "data_size": 63488 00:22:40.939 }, 00:22:40.939 { 00:22:40.939 "name": null, 00:22:40.939 "uuid": "367941a4-7527-44e8-b10b-4d0c801ea6d2", 00:22:40.939 "is_configured": false, 00:22:40.939 "data_offset": 2048, 00:22:40.939 "data_size": 63488 00:22:40.939 }, 00:22:40.939 { 00:22:40.939 "name": "BaseBdev4", 00:22:40.939 "uuid": "1990d208-dd8e-4ad5-a7c9-5ba5050d77ac", 00:22:40.939 "is_configured": true, 00:22:40.939 "data_offset": 2048, 00:22:40.939 "data_size": 63488 00:22:40.939 } 00:22:40.939 ] 00:22:40.939 }' 00:22:40.939 22:51:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:40.939 22:51:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:41.599 22:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:41.599 22:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:41.857 22:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:22:41.857 22:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:22:42.116 [2024-07-15 22:51:26.856906] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:42.116 22:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:42.116 22:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:42.116 22:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:42.116 22:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:42.116 22:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:42.116 22:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:42.116 22:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:42.116 22:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:42.116 22:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:42.116 22:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:42.116 22:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:42.116 22:51:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:42.374 22:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:42.374 "name": "Existed_Raid", 00:22:42.374 "uuid": "042b5e31-0ff1-4ce0-9233-28007dc6970a", 00:22:42.374 "strip_size_kb": 0, 00:22:42.374 "state": "configuring", 00:22:42.374 "raid_level": "raid1", 00:22:42.374 "superblock": true, 00:22:42.374 "num_base_bdevs": 4, 00:22:42.374 "num_base_bdevs_discovered": 3, 00:22:42.374 "num_base_bdevs_operational": 4, 00:22:42.374 "base_bdevs_list": [ 00:22:42.374 { 00:22:42.374 "name": "BaseBdev1", 00:22:42.374 "uuid": "efa1c275-2c2c-4205-9898-9fa72d043915", 00:22:42.374 "is_configured": true, 00:22:42.374 "data_offset": 2048, 00:22:42.374 "data_size": 63488 00:22:42.374 }, 00:22:42.374 { 00:22:42.374 "name": null, 00:22:42.374 "uuid": "aac71433-9184-4c80-98a5-7160006b14f9", 00:22:42.374 "is_configured": false, 00:22:42.374 "data_offset": 2048, 00:22:42.374 "data_size": 63488 00:22:42.374 }, 00:22:42.374 { 00:22:42.374 "name": "BaseBdev3", 00:22:42.374 "uuid": "367941a4-7527-44e8-b10b-4d0c801ea6d2", 00:22:42.374 "is_configured": true, 00:22:42.374 "data_offset": 2048, 00:22:42.374 "data_size": 63488 00:22:42.374 }, 00:22:42.374 { 00:22:42.374 "name": "BaseBdev4", 00:22:42.374 "uuid": "1990d208-dd8e-4ad5-a7c9-5ba5050d77ac", 00:22:42.374 "is_configured": true, 00:22:42.374 "data_offset": 2048, 00:22:42.374 "data_size": 63488 00:22:42.374 } 00:22:42.374 ] 00:22:42.374 }' 00:22:42.374 22:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:42.374 22:51:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:42.939 22:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:42.940 22:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:43.224 22:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:22:43.224 22:51:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:43.482 [2024-07-15 22:51:28.136313] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:43.482 22:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:43.482 22:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:43.482 22:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:43.482 22:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:43.482 22:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:43.482 22:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:43.482 22:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:43.482 22:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:43.482 22:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:43.482 22:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:43.482 22:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.482 22:51:28 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:43.482 22:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:43.482 "name": "Existed_Raid", 00:22:43.482 "uuid": "042b5e31-0ff1-4ce0-9233-28007dc6970a", 00:22:43.482 "strip_size_kb": 0, 00:22:43.482 "state": "configuring", 00:22:43.482 "raid_level": "raid1", 00:22:43.482 "superblock": true, 00:22:43.482 "num_base_bdevs": 4, 00:22:43.482 "num_base_bdevs_discovered": 2, 00:22:43.482 "num_base_bdevs_operational": 4, 00:22:43.482 "base_bdevs_list": [ 00:22:43.482 { 00:22:43.482 "name": null, 00:22:43.482 "uuid": "efa1c275-2c2c-4205-9898-9fa72d043915", 00:22:43.482 "is_configured": false, 00:22:43.482 "data_offset": 2048, 00:22:43.482 "data_size": 63488 00:22:43.482 }, 00:22:43.482 { 00:22:43.482 "name": null, 00:22:43.482 "uuid": "aac71433-9184-4c80-98a5-7160006b14f9", 00:22:43.482 "is_configured": false, 00:22:43.482 "data_offset": 2048, 00:22:43.482 "data_size": 63488 00:22:43.482 }, 00:22:43.482 { 00:22:43.482 "name": "BaseBdev3", 00:22:43.482 "uuid": "367941a4-7527-44e8-b10b-4d0c801ea6d2", 00:22:43.482 "is_configured": true, 00:22:43.482 "data_offset": 2048, 00:22:43.482 "data_size": 63488 00:22:43.482 }, 00:22:43.482 { 00:22:43.482 "name": "BaseBdev4", 00:22:43.482 "uuid": "1990d208-dd8e-4ad5-a7c9-5ba5050d77ac", 00:22:43.482 "is_configured": true, 00:22:43.482 "data_offset": 2048, 00:22:43.482 "data_size": 63488 00:22:43.482 } 00:22:43.482 ] 00:22:43.482 }' 00:22:43.482 22:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:43.482 22:51:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:44.046 22:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:44.046 22:51:28 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:44.303 22:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:22:44.303 22:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:22:44.575 [2024-07-15 22:51:29.370155] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:44.575 22:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:44.575 22:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:44.575 22:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:44.575 22:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:44.575 22:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:44.575 22:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:44.575 22:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:44.575 22:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:44.575 22:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:44.575 22:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:44.575 22:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:44.575 22:51:29 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:44.833 22:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:44.833 "name": "Existed_Raid", 00:22:44.833 "uuid": "042b5e31-0ff1-4ce0-9233-28007dc6970a", 00:22:44.833 "strip_size_kb": 0, 00:22:44.833 "state": "configuring", 00:22:44.833 "raid_level": "raid1", 00:22:44.833 "superblock": true, 00:22:44.833 "num_base_bdevs": 4, 00:22:44.833 "num_base_bdevs_discovered": 3, 00:22:44.833 "num_base_bdevs_operational": 4, 00:22:44.833 "base_bdevs_list": [ 00:22:44.833 { 00:22:44.833 "name": null, 00:22:44.833 "uuid": "efa1c275-2c2c-4205-9898-9fa72d043915", 00:22:44.833 "is_configured": false, 00:22:44.833 "data_offset": 2048, 00:22:44.833 "data_size": 63488 00:22:44.833 }, 00:22:44.833 { 00:22:44.833 "name": "BaseBdev2", 00:22:44.833 "uuid": "aac71433-9184-4c80-98a5-7160006b14f9", 00:22:44.833 "is_configured": true, 00:22:44.833 "data_offset": 2048, 00:22:44.833 "data_size": 63488 00:22:44.833 }, 00:22:44.833 { 00:22:44.833 "name": "BaseBdev3", 00:22:44.833 "uuid": "367941a4-7527-44e8-b10b-4d0c801ea6d2", 00:22:44.833 "is_configured": true, 00:22:44.833 "data_offset": 2048, 00:22:44.833 "data_size": 63488 00:22:44.833 }, 00:22:44.833 { 00:22:44.833 "name": "BaseBdev4", 00:22:44.833 "uuid": "1990d208-dd8e-4ad5-a7c9-5ba5050d77ac", 00:22:44.833 "is_configured": true, 00:22:44.833 "data_offset": 2048, 00:22:44.833 "data_size": 63488 00:22:44.833 } 00:22:44.833 ] 00:22:44.833 }' 00:22:44.833 22:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:44.833 22:51:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:45.398 22:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:45.398 22:51:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:45.656 22:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:22:45.656 22:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:45.656 22:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:22:45.914 22:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u efa1c275-2c2c-4205-9898-9fa72d043915 00:22:46.171 [2024-07-15 22:51:31.011001] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:22:46.172 [2024-07-15 22:51:31.011174] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12e2180 00:22:46.172 [2024-07-15 22:51:31.011188] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:46.172 [2024-07-15 22:51:31.011368] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12e2c20 00:22:46.172 [2024-07-15 22:51:31.011502] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12e2180 00:22:46.172 [2024-07-15 22:51:31.011513] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x12e2180 00:22:46.172 [2024-07-15 22:51:31.011611] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:46.172 NewBaseBdev 00:22:46.172 22:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:22:46.172 22:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:22:46.172 22:51:31 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:46.172 22:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:46.172 22:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:46.172 22:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:46.172 22:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:46.429 22:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:22:46.688 [ 00:22:46.688 { 00:22:46.688 "name": "NewBaseBdev", 00:22:46.688 "aliases": [ 00:22:46.688 "efa1c275-2c2c-4205-9898-9fa72d043915" 00:22:46.688 ], 00:22:46.688 "product_name": "Malloc disk", 00:22:46.688 "block_size": 512, 00:22:46.688 "num_blocks": 65536, 00:22:46.688 "uuid": "efa1c275-2c2c-4205-9898-9fa72d043915", 00:22:46.688 "assigned_rate_limits": { 00:22:46.688 "rw_ios_per_sec": 0, 00:22:46.688 "rw_mbytes_per_sec": 0, 00:22:46.688 "r_mbytes_per_sec": 0, 00:22:46.688 "w_mbytes_per_sec": 0 00:22:46.688 }, 00:22:46.688 "claimed": true, 00:22:46.688 "claim_type": "exclusive_write", 00:22:46.688 "zoned": false, 00:22:46.688 "supported_io_types": { 00:22:46.688 "read": true, 00:22:46.688 "write": true, 00:22:46.688 "unmap": true, 00:22:46.688 "flush": true, 00:22:46.688 "reset": true, 00:22:46.688 "nvme_admin": false, 00:22:46.688 "nvme_io": false, 00:22:46.688 "nvme_io_md": false, 00:22:46.688 "write_zeroes": true, 00:22:46.688 "zcopy": true, 00:22:46.688 "get_zone_info": false, 00:22:46.688 "zone_management": false, 00:22:46.688 "zone_append": false, 00:22:46.688 "compare": false, 00:22:46.688 
"compare_and_write": false, 00:22:46.688 "abort": true, 00:22:46.688 "seek_hole": false, 00:22:46.688 "seek_data": false, 00:22:46.688 "copy": true, 00:22:46.688 "nvme_iov_md": false 00:22:46.688 }, 00:22:46.688 "memory_domains": [ 00:22:46.688 { 00:22:46.688 "dma_device_id": "system", 00:22:46.688 "dma_device_type": 1 00:22:46.688 }, 00:22:46.688 { 00:22:46.688 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:46.688 "dma_device_type": 2 00:22:46.688 } 00:22:46.688 ], 00:22:46.688 "driver_specific": {} 00:22:46.688 } 00:22:46.688 ] 00:22:46.688 22:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:46.688 22:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:22:46.688 22:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:46.688 22:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:46.688 22:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:46.688 22:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:46.688 22:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:46.688 22:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:46.688 22:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:46.688 22:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:46.688 22:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:46.688 22:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:46.688 22:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:46.947 22:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:46.947 "name": "Existed_Raid", 00:22:46.947 "uuid": "042b5e31-0ff1-4ce0-9233-28007dc6970a", 00:22:46.947 "strip_size_kb": 0, 00:22:46.947 "state": "online", 00:22:46.947 "raid_level": "raid1", 00:22:46.947 "superblock": true, 00:22:46.947 "num_base_bdevs": 4, 00:22:46.947 "num_base_bdevs_discovered": 4, 00:22:46.947 "num_base_bdevs_operational": 4, 00:22:46.947 "base_bdevs_list": [ 00:22:46.947 { 00:22:46.947 "name": "NewBaseBdev", 00:22:46.947 "uuid": "efa1c275-2c2c-4205-9898-9fa72d043915", 00:22:46.947 "is_configured": true, 00:22:46.947 "data_offset": 2048, 00:22:46.947 "data_size": 63488 00:22:46.947 }, 00:22:46.947 { 00:22:46.947 "name": "BaseBdev2", 00:22:46.947 "uuid": "aac71433-9184-4c80-98a5-7160006b14f9", 00:22:46.947 "is_configured": true, 00:22:46.947 "data_offset": 2048, 00:22:46.947 "data_size": 63488 00:22:46.947 }, 00:22:46.947 { 00:22:46.947 "name": "BaseBdev3", 00:22:46.947 "uuid": "367941a4-7527-44e8-b10b-4d0c801ea6d2", 00:22:46.947 "is_configured": true, 00:22:46.947 "data_offset": 2048, 00:22:46.947 "data_size": 63488 00:22:46.947 }, 00:22:46.947 { 00:22:46.947 "name": "BaseBdev4", 00:22:46.947 "uuid": "1990d208-dd8e-4ad5-a7c9-5ba5050d77ac", 00:22:46.947 "is_configured": true, 00:22:46.947 "data_offset": 2048, 00:22:46.947 "data_size": 63488 00:22:46.947 } 00:22:46.947 ] 00:22:46.947 }' 00:22:46.947 22:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:46.947 22:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:47.513 22:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:22:47.513 22:51:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:47.513 22:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:47.513 22:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:47.513 22:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:47.513 22:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:22:47.513 22:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:47.513 22:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:47.770 [2024-07-15 22:51:32.599531] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:47.770 22:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:47.770 "name": "Existed_Raid", 00:22:47.770 "aliases": [ 00:22:47.770 "042b5e31-0ff1-4ce0-9233-28007dc6970a" 00:22:47.770 ], 00:22:47.770 "product_name": "Raid Volume", 00:22:47.770 "block_size": 512, 00:22:47.770 "num_blocks": 63488, 00:22:47.770 "uuid": "042b5e31-0ff1-4ce0-9233-28007dc6970a", 00:22:47.770 "assigned_rate_limits": { 00:22:47.770 "rw_ios_per_sec": 0, 00:22:47.770 "rw_mbytes_per_sec": 0, 00:22:47.770 "r_mbytes_per_sec": 0, 00:22:47.770 "w_mbytes_per_sec": 0 00:22:47.770 }, 00:22:47.770 "claimed": false, 00:22:47.770 "zoned": false, 00:22:47.770 "supported_io_types": { 00:22:47.770 "read": true, 00:22:47.770 "write": true, 00:22:47.770 "unmap": false, 00:22:47.770 "flush": false, 00:22:47.770 "reset": true, 00:22:47.770 "nvme_admin": false, 00:22:47.770 "nvme_io": false, 00:22:47.770 "nvme_io_md": false, 00:22:47.770 "write_zeroes": true, 00:22:47.770 "zcopy": false, 00:22:47.770 
"get_zone_info": false, 00:22:47.770 "zone_management": false, 00:22:47.770 "zone_append": false, 00:22:47.770 "compare": false, 00:22:47.770 "compare_and_write": false, 00:22:47.770 "abort": false, 00:22:47.770 "seek_hole": false, 00:22:47.770 "seek_data": false, 00:22:47.770 "copy": false, 00:22:47.770 "nvme_iov_md": false 00:22:47.770 }, 00:22:47.770 "memory_domains": [ 00:22:47.770 { 00:22:47.770 "dma_device_id": "system", 00:22:47.770 "dma_device_type": 1 00:22:47.770 }, 00:22:47.770 { 00:22:47.770 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:47.770 "dma_device_type": 2 00:22:47.770 }, 00:22:47.770 { 00:22:47.770 "dma_device_id": "system", 00:22:47.770 "dma_device_type": 1 00:22:47.770 }, 00:22:47.770 { 00:22:47.770 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:47.770 "dma_device_type": 2 00:22:47.770 }, 00:22:47.770 { 00:22:47.770 "dma_device_id": "system", 00:22:47.770 "dma_device_type": 1 00:22:47.770 }, 00:22:47.770 { 00:22:47.770 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:47.770 "dma_device_type": 2 00:22:47.770 }, 00:22:47.770 { 00:22:47.770 "dma_device_id": "system", 00:22:47.770 "dma_device_type": 1 00:22:47.770 }, 00:22:47.770 { 00:22:47.770 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:47.770 "dma_device_type": 2 00:22:47.770 } 00:22:47.770 ], 00:22:47.770 "driver_specific": { 00:22:47.770 "raid": { 00:22:47.770 "uuid": "042b5e31-0ff1-4ce0-9233-28007dc6970a", 00:22:47.770 "strip_size_kb": 0, 00:22:47.770 "state": "online", 00:22:47.770 "raid_level": "raid1", 00:22:47.770 "superblock": true, 00:22:47.770 "num_base_bdevs": 4, 00:22:47.770 "num_base_bdevs_discovered": 4, 00:22:47.770 "num_base_bdevs_operational": 4, 00:22:47.770 "base_bdevs_list": [ 00:22:47.770 { 00:22:47.770 "name": "NewBaseBdev", 00:22:47.770 "uuid": "efa1c275-2c2c-4205-9898-9fa72d043915", 00:22:47.770 "is_configured": true, 00:22:47.770 "data_offset": 2048, 00:22:47.770 "data_size": 63488 00:22:47.770 }, 00:22:47.770 { 00:22:47.770 "name": "BaseBdev2", 00:22:47.770 
"uuid": "aac71433-9184-4c80-98a5-7160006b14f9", 00:22:47.770 "is_configured": true, 00:22:47.770 "data_offset": 2048, 00:22:47.770 "data_size": 63488 00:22:47.770 }, 00:22:47.770 { 00:22:47.770 "name": "BaseBdev3", 00:22:47.770 "uuid": "367941a4-7527-44e8-b10b-4d0c801ea6d2", 00:22:47.770 "is_configured": true, 00:22:47.770 "data_offset": 2048, 00:22:47.770 "data_size": 63488 00:22:47.770 }, 00:22:47.770 { 00:22:47.770 "name": "BaseBdev4", 00:22:47.770 "uuid": "1990d208-dd8e-4ad5-a7c9-5ba5050d77ac", 00:22:47.770 "is_configured": true, 00:22:47.770 "data_offset": 2048, 00:22:47.770 "data_size": 63488 00:22:47.770 } 00:22:47.770 ] 00:22:47.770 } 00:22:47.770 } 00:22:47.770 }' 00:22:47.770 22:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:47.770 22:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:22:47.770 BaseBdev2 00:22:47.770 BaseBdev3 00:22:47.770 BaseBdev4' 00:22:47.771 22:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:47.771 22:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:22:47.771 22:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:48.028 22:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:48.028 "name": "NewBaseBdev", 00:22:48.028 "aliases": [ 00:22:48.028 "efa1c275-2c2c-4205-9898-9fa72d043915" 00:22:48.028 ], 00:22:48.028 "product_name": "Malloc disk", 00:22:48.028 "block_size": 512, 00:22:48.028 "num_blocks": 65536, 00:22:48.029 "uuid": "efa1c275-2c2c-4205-9898-9fa72d043915", 00:22:48.029 "assigned_rate_limits": { 00:22:48.029 "rw_ios_per_sec": 0, 00:22:48.029 "rw_mbytes_per_sec": 0, 
00:22:48.029 "r_mbytes_per_sec": 0, 00:22:48.029 "w_mbytes_per_sec": 0 00:22:48.029 }, 00:22:48.029 "claimed": true, 00:22:48.029 "claim_type": "exclusive_write", 00:22:48.029 "zoned": false, 00:22:48.029 "supported_io_types": { 00:22:48.029 "read": true, 00:22:48.029 "write": true, 00:22:48.029 "unmap": true, 00:22:48.029 "flush": true, 00:22:48.029 "reset": true, 00:22:48.029 "nvme_admin": false, 00:22:48.029 "nvme_io": false, 00:22:48.029 "nvme_io_md": false, 00:22:48.029 "write_zeroes": true, 00:22:48.029 "zcopy": true, 00:22:48.029 "get_zone_info": false, 00:22:48.029 "zone_management": false, 00:22:48.029 "zone_append": false, 00:22:48.029 "compare": false, 00:22:48.029 "compare_and_write": false, 00:22:48.029 "abort": true, 00:22:48.029 "seek_hole": false, 00:22:48.029 "seek_data": false, 00:22:48.029 "copy": true, 00:22:48.029 "nvme_iov_md": false 00:22:48.029 }, 00:22:48.029 "memory_domains": [ 00:22:48.029 { 00:22:48.029 "dma_device_id": "system", 00:22:48.029 "dma_device_type": 1 00:22:48.029 }, 00:22:48.029 { 00:22:48.029 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:48.029 "dma_device_type": 2 00:22:48.029 } 00:22:48.029 ], 00:22:48.029 "driver_specific": {} 00:22:48.029 }' 00:22:48.029 22:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:48.287 22:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:48.287 22:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:48.287 22:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:48.287 22:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:48.287 22:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:48.287 22:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:48.287 22:51:33 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:48.545 22:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:48.545 22:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:48.545 22:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:48.545 22:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:48.545 22:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:48.545 22:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:48.545 22:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:48.804 22:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:48.804 "name": "BaseBdev2", 00:22:48.804 "aliases": [ 00:22:48.804 "aac71433-9184-4c80-98a5-7160006b14f9" 00:22:48.804 ], 00:22:48.804 "product_name": "Malloc disk", 00:22:48.804 "block_size": 512, 00:22:48.804 "num_blocks": 65536, 00:22:48.804 "uuid": "aac71433-9184-4c80-98a5-7160006b14f9", 00:22:48.804 "assigned_rate_limits": { 00:22:48.804 "rw_ios_per_sec": 0, 00:22:48.804 "rw_mbytes_per_sec": 0, 00:22:48.804 "r_mbytes_per_sec": 0, 00:22:48.804 "w_mbytes_per_sec": 0 00:22:48.804 }, 00:22:48.804 "claimed": true, 00:22:48.804 "claim_type": "exclusive_write", 00:22:48.804 "zoned": false, 00:22:48.804 "supported_io_types": { 00:22:48.804 "read": true, 00:22:48.804 "write": true, 00:22:48.804 "unmap": true, 00:22:48.804 "flush": true, 00:22:48.804 "reset": true, 00:22:48.804 "nvme_admin": false, 00:22:48.804 "nvme_io": false, 00:22:48.804 "nvme_io_md": false, 00:22:48.804 "write_zeroes": true, 00:22:48.804 "zcopy": true, 00:22:48.804 
"get_zone_info": false, 00:22:48.804 "zone_management": false, 00:22:48.804 "zone_append": false, 00:22:48.804 "compare": false, 00:22:48.804 "compare_and_write": false, 00:22:48.804 "abort": true, 00:22:48.804 "seek_hole": false, 00:22:48.804 "seek_data": false, 00:22:48.804 "copy": true, 00:22:48.804 "nvme_iov_md": false 00:22:48.804 }, 00:22:48.804 "memory_domains": [ 00:22:48.804 { 00:22:48.804 "dma_device_id": "system", 00:22:48.804 "dma_device_type": 1 00:22:48.804 }, 00:22:48.804 { 00:22:48.804 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:48.804 "dma_device_type": 2 00:22:48.804 } 00:22:48.804 ], 00:22:48.804 "driver_specific": {} 00:22:48.804 }' 00:22:48.804 22:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:48.804 22:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:49.063 22:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:49.063 22:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:49.063 22:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:49.063 22:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:49.063 22:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:49.063 22:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:49.063 22:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:49.063 22:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:49.320 22:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:49.320 22:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:49.320 22:51:34 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:49.320 22:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:49.320 22:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:49.606 22:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:49.606 "name": "BaseBdev3", 00:22:49.606 "aliases": [ 00:22:49.606 "367941a4-7527-44e8-b10b-4d0c801ea6d2" 00:22:49.606 ], 00:22:49.606 "product_name": "Malloc disk", 00:22:49.606 "block_size": 512, 00:22:49.606 "num_blocks": 65536, 00:22:49.606 "uuid": "367941a4-7527-44e8-b10b-4d0c801ea6d2", 00:22:49.606 "assigned_rate_limits": { 00:22:49.606 "rw_ios_per_sec": 0, 00:22:49.606 "rw_mbytes_per_sec": 0, 00:22:49.606 "r_mbytes_per_sec": 0, 00:22:49.606 "w_mbytes_per_sec": 0 00:22:49.606 }, 00:22:49.606 "claimed": true, 00:22:49.606 "claim_type": "exclusive_write", 00:22:49.606 "zoned": false, 00:22:49.606 "supported_io_types": { 00:22:49.607 "read": true, 00:22:49.607 "write": true, 00:22:49.607 "unmap": true, 00:22:49.607 "flush": true, 00:22:49.607 "reset": true, 00:22:49.607 "nvme_admin": false, 00:22:49.607 "nvme_io": false, 00:22:49.607 "nvme_io_md": false, 00:22:49.607 "write_zeroes": true, 00:22:49.607 "zcopy": true, 00:22:49.607 "get_zone_info": false, 00:22:49.607 "zone_management": false, 00:22:49.607 "zone_append": false, 00:22:49.607 "compare": false, 00:22:49.607 "compare_and_write": false, 00:22:49.607 "abort": true, 00:22:49.607 "seek_hole": false, 00:22:49.607 "seek_data": false, 00:22:49.607 "copy": true, 00:22:49.607 "nvme_iov_md": false 00:22:49.607 }, 00:22:49.607 "memory_domains": [ 00:22:49.607 { 00:22:49.607 "dma_device_id": "system", 00:22:49.607 "dma_device_type": 1 00:22:49.607 }, 00:22:49.607 { 00:22:49.607 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:49.607 
"dma_device_type": 2 00:22:49.607 } 00:22:49.607 ], 00:22:49.607 "driver_specific": {} 00:22:49.607 }' 00:22:49.607 22:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:49.607 22:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:49.607 22:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:49.607 22:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:49.607 22:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:49.607 22:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:49.607 22:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:49.865 22:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:49.865 22:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:49.865 22:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:49.865 22:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:49.865 22:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:49.865 22:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:49.865 22:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:49.865 22:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:50.124 22:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:50.124 "name": "BaseBdev4", 00:22:50.124 "aliases": [ 00:22:50.124 
"1990d208-dd8e-4ad5-a7c9-5ba5050d77ac" 00:22:50.124 ], 00:22:50.124 "product_name": "Malloc disk", 00:22:50.124 "block_size": 512, 00:22:50.124 "num_blocks": 65536, 00:22:50.124 "uuid": "1990d208-dd8e-4ad5-a7c9-5ba5050d77ac", 00:22:50.124 "assigned_rate_limits": { 00:22:50.124 "rw_ios_per_sec": 0, 00:22:50.124 "rw_mbytes_per_sec": 0, 00:22:50.124 "r_mbytes_per_sec": 0, 00:22:50.124 "w_mbytes_per_sec": 0 00:22:50.124 }, 00:22:50.124 "claimed": true, 00:22:50.124 "claim_type": "exclusive_write", 00:22:50.124 "zoned": false, 00:22:50.124 "supported_io_types": { 00:22:50.124 "read": true, 00:22:50.124 "write": true, 00:22:50.124 "unmap": true, 00:22:50.124 "flush": true, 00:22:50.124 "reset": true, 00:22:50.124 "nvme_admin": false, 00:22:50.124 "nvme_io": false, 00:22:50.124 "nvme_io_md": false, 00:22:50.124 "write_zeroes": true, 00:22:50.124 "zcopy": true, 00:22:50.124 "get_zone_info": false, 00:22:50.124 "zone_management": false, 00:22:50.124 "zone_append": false, 00:22:50.124 "compare": false, 00:22:50.124 "compare_and_write": false, 00:22:50.124 "abort": true, 00:22:50.124 "seek_hole": false, 00:22:50.124 "seek_data": false, 00:22:50.124 "copy": true, 00:22:50.124 "nvme_iov_md": false 00:22:50.124 }, 00:22:50.124 "memory_domains": [ 00:22:50.124 { 00:22:50.124 "dma_device_id": "system", 00:22:50.124 "dma_device_type": 1 00:22:50.124 }, 00:22:50.124 { 00:22:50.124 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:50.124 "dma_device_type": 2 00:22:50.124 } 00:22:50.124 ], 00:22:50.124 "driver_specific": {} 00:22:50.124 }' 00:22:50.124 22:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:50.381 22:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:50.381 22:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:50.381 22:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:50.381 22:51:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:50.381 22:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:50.381 22:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:50.381 22:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:50.638 22:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:50.639 22:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:50.639 22:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:50.639 22:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:50.639 22:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:50.897 [2024-07-15 22:51:35.643307] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:50.897 [2024-07-15 22:51:35.643330] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:50.897 [2024-07-15 22:51:35.643378] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:50.897 [2024-07-15 22:51:35.643658] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:50.897 [2024-07-15 22:51:35.643671] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12e2180 name Existed_Raid, state offline 00:22:50.897 22:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2796255 00:22:50.897 22:51:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2796255 ']' 00:22:50.897 22:51:35 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@952 -- # kill -0 2796255 00:22:50.897 22:51:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:22:50.897 22:51:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:50.897 22:51:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2796255 00:22:50.897 22:51:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:50.897 22:51:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:50.897 22:51:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2796255' 00:22:50.897 killing process with pid 2796255 00:22:50.897 22:51:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2796255 00:22:50.897 [2024-07-15 22:51:35.716461] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:50.897 22:51:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2796255 00:22:50.897 [2024-07-15 22:51:35.754098] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:51.156 22:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:22:51.156 00:22:51.156 real 0m35.672s 00:22:51.156 user 1m5.482s 00:22:51.156 sys 0m6.334s 00:22:51.156 22:51:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:51.156 22:51:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:51.156 ************************************ 00:22:51.156 END TEST raid_state_function_test_sb 00:22:51.156 ************************************ 00:22:51.156 22:51:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:51.156 22:51:36 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test 
raid_superblock_test raid_superblock_test raid1 4 00:22:51.156 22:51:36 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:22:51.156 22:51:36 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:51.156 22:51:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:51.414 ************************************ 00:22:51.414 START TEST raid_superblock_test 00:22:51.414 ************************************ 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 4 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2802023 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2802023 /var/tmp/spdk-raid.sock 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2802023 ']' 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:51.414 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:51.414 22:51:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:51.414 [2024-07-15 22:51:36.122608] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:22:51.414 [2024-07-15 22:51:36.122668] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2802023 ] 00:22:51.414 [2024-07-15 22:51:36.238726] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:51.672 [2024-07-15 22:51:36.344707] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:51.672 [2024-07-15 22:51:36.403815] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:51.672 [2024-07-15 22:51:36.403848] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:52.238 22:51:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:52.238 22:51:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:22:52.238 22:51:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:22:52.238 22:51:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:52.238 22:51:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:22:52.238 22:51:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:22:52.238 22:51:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:22:52.238 22:51:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:52.238 22:51:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:52.238 22:51:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:52.238 22:51:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:22:52.496 malloc1 00:22:52.496 22:51:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:52.755 [2024-07-15 22:51:37.411412] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:52.755 [2024-07-15 22:51:37.411468] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:52.755 [2024-07-15 22:51:37.411489] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12ba570 00:22:52.755 [2024-07-15 22:51:37.411502] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:52.755 [2024-07-15 22:51:37.413280] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:52.755 [2024-07-15 22:51:37.413313] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:52.755 pt1 00:22:52.755 22:51:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:52.755 22:51:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:52.755 22:51:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:22:52.755 22:51:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:22:52.755 22:51:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:22:52.755 22:51:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:52.755 22:51:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:52.755 22:51:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:52.755 22:51:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:22:53.322 malloc2 00:22:53.322 22:51:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:53.322 [2024-07-15 22:51:38.182977] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:53.322 [2024-07-15 22:51:38.183033] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:53.322 [2024-07-15 22:51:38.183051] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12bb970 00:22:53.322 [2024-07-15 22:51:38.183065] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:53.322 [2024-07-15 22:51:38.184604] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:53.322 [2024-07-15 22:51:38.184633] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:53.322 pt2 00:22:53.322 22:51:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:53.322 22:51:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:53.322 22:51:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:22:53.322 22:51:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:22:53.322 22:51:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:22:53.322 22:51:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:53.322 22:51:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:53.322 22:51:38 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:53.322 22:51:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:22:53.580 malloc3 00:22:53.580 22:51:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:53.839 [2024-07-15 22:51:38.682905] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:53.839 [2024-07-15 22:51:38.682956] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:53.839 [2024-07-15 22:51:38.682974] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1452340 00:22:53.839 [2024-07-15 22:51:38.682987] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:53.839 [2024-07-15 22:51:38.684505] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:53.839 [2024-07-15 22:51:38.684533] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:53.839 pt3 00:22:53.839 22:51:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:53.839 22:51:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:53.839 22:51:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:22:53.839 22:51:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:22:53.839 22:51:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:22:53.839 22:51:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:53.839 
22:51:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:53.839 22:51:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:53.839 22:51:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:22:54.098 malloc4 00:22:54.098 22:51:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:54.356 [2024-07-15 22:51:39.206826] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:54.356 [2024-07-15 22:51:39.206875] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:54.356 [2024-07-15 22:51:39.206896] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1454c60 00:22:54.356 [2024-07-15 22:51:39.206910] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:54.356 [2024-07-15 22:51:39.208414] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:54.356 [2024-07-15 22:51:39.208442] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:54.356 pt4 00:22:54.356 22:51:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:54.356 22:51:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:54.356 22:51:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:22:54.614 [2024-07-15 22:51:39.455498] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:22:54.614 [2024-07-15 22:51:39.456731] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:54.614 [2024-07-15 22:51:39.456789] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:54.614 [2024-07-15 22:51:39.456836] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:54.614 [2024-07-15 22:51:39.457014] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12b2530 00:22:54.614 [2024-07-15 22:51:39.457026] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:54.614 [2024-07-15 22:51:39.457215] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12b0770 00:22:54.614 [2024-07-15 22:51:39.457374] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12b2530 00:22:54.614 [2024-07-15 22:51:39.457385] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12b2530 00:22:54.614 [2024-07-15 22:51:39.457483] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:54.614 22:51:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:54.614 22:51:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:54.614 22:51:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:54.614 22:51:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:54.614 22:51:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:54.614 22:51:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:54.614 22:51:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:54.614 22:51:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:22:54.614 22:51:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:54.614 22:51:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:54.614 22:51:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:54.614 22:51:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:54.927 22:51:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:54.927 "name": "raid_bdev1", 00:22:54.927 "uuid": "0816399a-baed-4658-91a2-3c0fc970c6a6", 00:22:54.927 "strip_size_kb": 0, 00:22:54.927 "state": "online", 00:22:54.927 "raid_level": "raid1", 00:22:54.927 "superblock": true, 00:22:54.927 "num_base_bdevs": 4, 00:22:54.927 "num_base_bdevs_discovered": 4, 00:22:54.927 "num_base_bdevs_operational": 4, 00:22:54.927 "base_bdevs_list": [ 00:22:54.927 { 00:22:54.927 "name": "pt1", 00:22:54.927 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:54.927 "is_configured": true, 00:22:54.927 "data_offset": 2048, 00:22:54.927 "data_size": 63488 00:22:54.927 }, 00:22:54.927 { 00:22:54.927 "name": "pt2", 00:22:54.927 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:54.927 "is_configured": true, 00:22:54.927 "data_offset": 2048, 00:22:54.927 "data_size": 63488 00:22:54.927 }, 00:22:54.927 { 00:22:54.927 "name": "pt3", 00:22:54.927 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:54.927 "is_configured": true, 00:22:54.927 "data_offset": 2048, 00:22:54.927 "data_size": 63488 00:22:54.927 }, 00:22:54.927 { 00:22:54.927 "name": "pt4", 00:22:54.927 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:54.927 "is_configured": true, 00:22:54.927 "data_offset": 2048, 00:22:54.927 "data_size": 63488 00:22:54.927 } 00:22:54.927 ] 00:22:54.927 }' 00:22:54.927 22:51:39 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:54.927 22:51:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:55.867 22:51:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:22:55.867 22:51:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:55.867 22:51:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:55.867 22:51:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:55.867 22:51:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:55.867 22:51:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:55.867 22:51:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:55.867 22:51:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:55.867 [2024-07-15 22:51:40.699103] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:55.867 22:51:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:55.867 "name": "raid_bdev1", 00:22:55.867 "aliases": [ 00:22:55.867 "0816399a-baed-4658-91a2-3c0fc970c6a6" 00:22:55.867 ], 00:22:55.867 "product_name": "Raid Volume", 00:22:55.867 "block_size": 512, 00:22:55.867 "num_blocks": 63488, 00:22:55.867 "uuid": "0816399a-baed-4658-91a2-3c0fc970c6a6", 00:22:55.867 "assigned_rate_limits": { 00:22:55.867 "rw_ios_per_sec": 0, 00:22:55.867 "rw_mbytes_per_sec": 0, 00:22:55.867 "r_mbytes_per_sec": 0, 00:22:55.867 "w_mbytes_per_sec": 0 00:22:55.867 }, 00:22:55.867 "claimed": false, 00:22:55.867 "zoned": false, 00:22:55.867 "supported_io_types": { 00:22:55.867 "read": true, 00:22:55.867 "write": true, 00:22:55.867 
"unmap": false, 00:22:55.867 "flush": false, 00:22:55.867 "reset": true, 00:22:55.867 "nvme_admin": false, 00:22:55.867 "nvme_io": false, 00:22:55.867 "nvme_io_md": false, 00:22:55.867 "write_zeroes": true, 00:22:55.867 "zcopy": false, 00:22:55.867 "get_zone_info": false, 00:22:55.867 "zone_management": false, 00:22:55.867 "zone_append": false, 00:22:55.867 "compare": false, 00:22:55.867 "compare_and_write": false, 00:22:55.867 "abort": false, 00:22:55.867 "seek_hole": false, 00:22:55.867 "seek_data": false, 00:22:55.867 "copy": false, 00:22:55.867 "nvme_iov_md": false 00:22:55.867 }, 00:22:55.867 "memory_domains": [ 00:22:55.867 { 00:22:55.867 "dma_device_id": "system", 00:22:55.867 "dma_device_type": 1 00:22:55.867 }, 00:22:55.867 { 00:22:55.867 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:55.867 "dma_device_type": 2 00:22:55.867 }, 00:22:55.867 { 00:22:55.867 "dma_device_id": "system", 00:22:55.867 "dma_device_type": 1 00:22:55.867 }, 00:22:55.867 { 00:22:55.867 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:55.867 "dma_device_type": 2 00:22:55.867 }, 00:22:55.867 { 00:22:55.867 "dma_device_id": "system", 00:22:55.867 "dma_device_type": 1 00:22:55.867 }, 00:22:55.867 { 00:22:55.867 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:55.867 "dma_device_type": 2 00:22:55.867 }, 00:22:55.867 { 00:22:55.867 "dma_device_id": "system", 00:22:55.868 "dma_device_type": 1 00:22:55.868 }, 00:22:55.868 { 00:22:55.868 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:55.868 "dma_device_type": 2 00:22:55.868 } 00:22:55.868 ], 00:22:55.868 "driver_specific": { 00:22:55.868 "raid": { 00:22:55.868 "uuid": "0816399a-baed-4658-91a2-3c0fc970c6a6", 00:22:55.868 "strip_size_kb": 0, 00:22:55.868 "state": "online", 00:22:55.868 "raid_level": "raid1", 00:22:55.868 "superblock": true, 00:22:55.868 "num_base_bdevs": 4, 00:22:55.868 "num_base_bdevs_discovered": 4, 00:22:55.868 "num_base_bdevs_operational": 4, 00:22:55.868 "base_bdevs_list": [ 00:22:55.868 { 00:22:55.868 "name": "pt1", 
00:22:55.868 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:55.868 "is_configured": true, 00:22:55.868 "data_offset": 2048, 00:22:55.868 "data_size": 63488 00:22:55.868 }, 00:22:55.868 { 00:22:55.868 "name": "pt2", 00:22:55.868 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:55.868 "is_configured": true, 00:22:55.868 "data_offset": 2048, 00:22:55.868 "data_size": 63488 00:22:55.868 }, 00:22:55.868 { 00:22:55.868 "name": "pt3", 00:22:55.868 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:55.868 "is_configured": true, 00:22:55.868 "data_offset": 2048, 00:22:55.868 "data_size": 63488 00:22:55.868 }, 00:22:55.868 { 00:22:55.868 "name": "pt4", 00:22:55.868 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:55.868 "is_configured": true, 00:22:55.868 "data_offset": 2048, 00:22:55.868 "data_size": 63488 00:22:55.868 } 00:22:55.868 ] 00:22:55.868 } 00:22:55.868 } 00:22:55.868 }' 00:22:55.868 22:51:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:55.868 22:51:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:55.868 pt2 00:22:55.868 pt3 00:22:55.868 pt4' 00:22:55.868 22:51:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:56.128 22:51:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:56.128 22:51:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:56.128 22:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:56.128 "name": "pt1", 00:22:56.128 "aliases": [ 00:22:56.128 "00000000-0000-0000-0000-000000000001" 00:22:56.128 ], 00:22:56.128 "product_name": "passthru", 00:22:56.128 "block_size": 512, 00:22:56.128 "num_blocks": 65536, 00:22:56.128 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:22:56.128 "assigned_rate_limits": { 00:22:56.128 "rw_ios_per_sec": 0, 00:22:56.128 "rw_mbytes_per_sec": 0, 00:22:56.128 "r_mbytes_per_sec": 0, 00:22:56.128 "w_mbytes_per_sec": 0 00:22:56.128 }, 00:22:56.128 "claimed": true, 00:22:56.128 "claim_type": "exclusive_write", 00:22:56.128 "zoned": false, 00:22:56.128 "supported_io_types": { 00:22:56.128 "read": true, 00:22:56.128 "write": true, 00:22:56.128 "unmap": true, 00:22:56.128 "flush": true, 00:22:56.128 "reset": true, 00:22:56.128 "nvme_admin": false, 00:22:56.128 "nvme_io": false, 00:22:56.128 "nvme_io_md": false, 00:22:56.128 "write_zeroes": true, 00:22:56.128 "zcopy": true, 00:22:56.128 "get_zone_info": false, 00:22:56.128 "zone_management": false, 00:22:56.128 "zone_append": false, 00:22:56.128 "compare": false, 00:22:56.128 "compare_and_write": false, 00:22:56.128 "abort": true, 00:22:56.128 "seek_hole": false, 00:22:56.128 "seek_data": false, 00:22:56.128 "copy": true, 00:22:56.128 "nvme_iov_md": false 00:22:56.128 }, 00:22:56.128 "memory_domains": [ 00:22:56.128 { 00:22:56.128 "dma_device_id": "system", 00:22:56.128 "dma_device_type": 1 00:22:56.128 }, 00:22:56.128 { 00:22:56.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:56.128 "dma_device_type": 2 00:22:56.128 } 00:22:56.128 ], 00:22:56.128 "driver_specific": { 00:22:56.128 "passthru": { 00:22:56.128 "name": "pt1", 00:22:56.128 "base_bdev_name": "malloc1" 00:22:56.128 } 00:22:56.128 } 00:22:56.128 }' 00:22:56.128 22:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:56.387 22:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:56.387 22:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:56.387 22:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:56.387 22:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:56.387 22:51:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:56.387 22:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:56.387 22:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:56.387 22:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:56.387 22:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:56.646 22:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:56.646 22:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:56.646 22:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:56.646 22:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:56.646 22:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:57.215 22:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:57.215 "name": "pt2", 00:22:57.215 "aliases": [ 00:22:57.215 "00000000-0000-0000-0000-000000000002" 00:22:57.215 ], 00:22:57.215 "product_name": "passthru", 00:22:57.215 "block_size": 512, 00:22:57.215 "num_blocks": 65536, 00:22:57.215 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:57.215 "assigned_rate_limits": { 00:22:57.215 "rw_ios_per_sec": 0, 00:22:57.215 "rw_mbytes_per_sec": 0, 00:22:57.215 "r_mbytes_per_sec": 0, 00:22:57.215 "w_mbytes_per_sec": 0 00:22:57.215 }, 00:22:57.215 "claimed": true, 00:22:57.215 "claim_type": "exclusive_write", 00:22:57.215 "zoned": false, 00:22:57.215 "supported_io_types": { 00:22:57.215 "read": true, 00:22:57.215 "write": true, 00:22:57.215 "unmap": true, 00:22:57.215 "flush": true, 00:22:57.215 "reset": true, 00:22:57.215 "nvme_admin": false, 00:22:57.215 
"nvme_io": false, 00:22:57.215 "nvme_io_md": false, 00:22:57.215 "write_zeroes": true, 00:22:57.215 "zcopy": true, 00:22:57.215 "get_zone_info": false, 00:22:57.215 "zone_management": false, 00:22:57.215 "zone_append": false, 00:22:57.215 "compare": false, 00:22:57.215 "compare_and_write": false, 00:22:57.215 "abort": true, 00:22:57.215 "seek_hole": false, 00:22:57.215 "seek_data": false, 00:22:57.215 "copy": true, 00:22:57.215 "nvme_iov_md": false 00:22:57.215 }, 00:22:57.215 "memory_domains": [ 00:22:57.215 { 00:22:57.215 "dma_device_id": "system", 00:22:57.215 "dma_device_type": 1 00:22:57.215 }, 00:22:57.215 { 00:22:57.215 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:57.215 "dma_device_type": 2 00:22:57.215 } 00:22:57.215 ], 00:22:57.215 "driver_specific": { 00:22:57.215 "passthru": { 00:22:57.215 "name": "pt2", 00:22:57.215 "base_bdev_name": "malloc2" 00:22:57.215 } 00:22:57.215 } 00:22:57.215 }' 00:22:57.215 22:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:57.215 22:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:57.215 22:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:57.215 22:51:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:57.215 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:57.215 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:57.215 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:57.474 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:57.474 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:57.474 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:57.474 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:22:57.474 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:57.474 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:57.474 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:22:57.474 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:57.734 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:57.734 "name": "pt3", 00:22:57.734 "aliases": [ 00:22:57.734 "00000000-0000-0000-0000-000000000003" 00:22:57.734 ], 00:22:57.734 "product_name": "passthru", 00:22:57.734 "block_size": 512, 00:22:57.734 "num_blocks": 65536, 00:22:57.734 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:57.734 "assigned_rate_limits": { 00:22:57.734 "rw_ios_per_sec": 0, 00:22:57.734 "rw_mbytes_per_sec": 0, 00:22:57.734 "r_mbytes_per_sec": 0, 00:22:57.734 "w_mbytes_per_sec": 0 00:22:57.734 }, 00:22:57.734 "claimed": true, 00:22:57.734 "claim_type": "exclusive_write", 00:22:57.734 "zoned": false, 00:22:57.734 "supported_io_types": { 00:22:57.734 "read": true, 00:22:57.734 "write": true, 00:22:57.734 "unmap": true, 00:22:57.734 "flush": true, 00:22:57.734 "reset": true, 00:22:57.734 "nvme_admin": false, 00:22:57.734 "nvme_io": false, 00:22:57.734 "nvme_io_md": false, 00:22:57.734 "write_zeroes": true, 00:22:57.734 "zcopy": true, 00:22:57.734 "get_zone_info": false, 00:22:57.734 "zone_management": false, 00:22:57.734 "zone_append": false, 00:22:57.734 "compare": false, 00:22:57.734 "compare_and_write": false, 00:22:57.734 "abort": true, 00:22:57.734 "seek_hole": false, 00:22:57.734 "seek_data": false, 00:22:57.734 "copy": true, 00:22:57.734 "nvme_iov_md": false 00:22:57.734 }, 00:22:57.734 "memory_domains": [ 00:22:57.734 { 00:22:57.734 "dma_device_id": "system", 00:22:57.734 
"dma_device_type": 1 00:22:57.734 }, 00:22:57.734 { 00:22:57.734 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:57.734 "dma_device_type": 2 00:22:57.734 } 00:22:57.734 ], 00:22:57.734 "driver_specific": { 00:22:57.734 "passthru": { 00:22:57.734 "name": "pt3", 00:22:57.734 "base_bdev_name": "malloc3" 00:22:57.734 } 00:22:57.734 } 00:22:57.734 }' 00:22:57.734 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:57.993 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:57.993 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:57.993 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:57.993 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:57.993 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:57.993 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:57.993 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:57.993 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:57.993 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:58.252 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:58.252 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:58.252 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:58.252 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:22:58.252 22:51:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:58.513 22:51:43 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:58.513 "name": "pt4", 00:22:58.513 "aliases": [ 00:22:58.513 "00000000-0000-0000-0000-000000000004" 00:22:58.513 ], 00:22:58.513 "product_name": "passthru", 00:22:58.513 "block_size": 512, 00:22:58.513 "num_blocks": 65536, 00:22:58.513 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:58.513 "assigned_rate_limits": { 00:22:58.513 "rw_ios_per_sec": 0, 00:22:58.513 "rw_mbytes_per_sec": 0, 00:22:58.513 "r_mbytes_per_sec": 0, 00:22:58.513 "w_mbytes_per_sec": 0 00:22:58.513 }, 00:22:58.513 "claimed": true, 00:22:58.513 "claim_type": "exclusive_write", 00:22:58.513 "zoned": false, 00:22:58.513 "supported_io_types": { 00:22:58.513 "read": true, 00:22:58.513 "write": true, 00:22:58.513 "unmap": true, 00:22:58.513 "flush": true, 00:22:58.513 "reset": true, 00:22:58.513 "nvme_admin": false, 00:22:58.513 "nvme_io": false, 00:22:58.513 "nvme_io_md": false, 00:22:58.513 "write_zeroes": true, 00:22:58.513 "zcopy": true, 00:22:58.513 "get_zone_info": false, 00:22:58.513 "zone_management": false, 00:22:58.513 "zone_append": false, 00:22:58.513 "compare": false, 00:22:58.513 "compare_and_write": false, 00:22:58.513 "abort": true, 00:22:58.513 "seek_hole": false, 00:22:58.513 "seek_data": false, 00:22:58.513 "copy": true, 00:22:58.513 "nvme_iov_md": false 00:22:58.513 }, 00:22:58.513 "memory_domains": [ 00:22:58.513 { 00:22:58.513 "dma_device_id": "system", 00:22:58.513 "dma_device_type": 1 00:22:58.513 }, 00:22:58.513 { 00:22:58.513 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:58.513 "dma_device_type": 2 00:22:58.513 } 00:22:58.513 ], 00:22:58.513 "driver_specific": { 00:22:58.513 "passthru": { 00:22:58.513 "name": "pt4", 00:22:58.513 "base_bdev_name": "malloc4" 00:22:58.513 } 00:22:58.513 } 00:22:58.513 }' 00:22:58.513 22:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:58.513 22:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:58.513 22:51:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:58.513 22:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:58.513 22:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:58.513 22:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:58.513 22:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:58.774 22:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:58.774 22:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:58.774 22:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:58.774 22:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:58.774 22:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:58.774 22:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:58.774 22:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:22:59.032 [2024-07-15 22:51:43.783277] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:59.032 22:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=0816399a-baed-4658-91a2-3c0fc970c6a6 00:22:59.032 22:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 0816399a-baed-4658-91a2-3c0fc970c6a6 ']' 00:22:59.032 22:51:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:59.290 [2024-07-15 22:51:44.023615] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:59.290 
[2024-07-15 22:51:44.023640] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:59.290 [2024-07-15 22:51:44.023694] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:59.290 [2024-07-15 22:51:44.023785] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:59.290 [2024-07-15 22:51:44.023798] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12b2530 name raid_bdev1, state offline 00:22:59.290 22:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.290 22:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:22:59.548 22:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:22:59.548 22:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:22:59.548 22:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:59.548 22:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:59.806 22:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:59.806 22:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:00.065 22:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:00.065 22:51:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:23:00.324 22:51:45 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:00.324 22:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:23:00.582 22:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:23:00.582 22:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:23:00.582 22:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:23:00.583 22:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:23:00.583 22:51:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:23:00.583 22:51:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:23:00.583 22:51:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:00.841 22:51:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:00.841 22:51:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:00.841 22:51:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:00.841 22:51:45 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:00.841 22:51:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:00.841 22:51:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:00.841 22:51:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:00.841 22:51:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:23:00.841 [2024-07-15 22:51:45.720035] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:23:00.841 [2024-07-15 22:51:45.721595] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:23:00.841 [2024-07-15 22:51:45.721644] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:23:00.841 [2024-07-15 22:51:45.721682] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:23:00.841 [2024-07-15 22:51:45.721732] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:23:00.841 [2024-07-15 22:51:45.721772] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:23:00.841 [2024-07-15 22:51:45.721803] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:23:00.841 [2024-07-15 22:51:45.721826] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:23:00.841 [2024-07-15 
22:51:45.721845] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:00.841 [2024-07-15 22:51:45.721856] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x145dff0 name raid_bdev1, state configuring 00:23:00.841 request: 00:23:00.841 { 00:23:00.841 "name": "raid_bdev1", 00:23:00.841 "raid_level": "raid1", 00:23:00.841 "base_bdevs": [ 00:23:00.841 "malloc1", 00:23:00.841 "malloc2", 00:23:00.841 "malloc3", 00:23:00.841 "malloc4" 00:23:00.841 ], 00:23:00.841 "superblock": false, 00:23:00.841 "method": "bdev_raid_create", 00:23:00.841 "req_id": 1 00:23:00.841 } 00:23:00.841 Got JSON-RPC error response 00:23:00.841 response: 00:23:00.841 { 00:23:00.841 "code": -17, 00:23:00.841 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:23:00.841 } 00:23:00.841 22:51:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:23:00.841 22:51:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:00.842 22:51:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:00.842 22:51:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:00.842 22:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:00.842 22:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:23:01.100 22:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:23:01.100 22:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:23:01.100 22:51:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:01.359 [2024-07-15 22:51:46.205416] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:01.359 [2024-07-15 22:51:46.205467] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:01.359 [2024-07-15 22:51:46.205488] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12ba7a0 00:23:01.359 [2024-07-15 22:51:46.205501] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:01.359 [2024-07-15 22:51:46.207483] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:01.359 [2024-07-15 22:51:46.207514] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:01.359 [2024-07-15 22:51:46.207586] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:23:01.359 [2024-07-15 22:51:46.207614] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:01.359 pt1 00:23:01.359 22:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:23:01.359 22:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:01.359 22:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:01.359 22:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:01.359 22:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:01.359 22:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:01.359 22:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:01.359 22:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:01.359 22:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:01.359 22:51:46 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:01.359 22:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:01.359 22:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:01.618 22:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:01.619 "name": "raid_bdev1", 00:23:01.619 "uuid": "0816399a-baed-4658-91a2-3c0fc970c6a6", 00:23:01.619 "strip_size_kb": 0, 00:23:01.619 "state": "configuring", 00:23:01.619 "raid_level": "raid1", 00:23:01.619 "superblock": true, 00:23:01.619 "num_base_bdevs": 4, 00:23:01.619 "num_base_bdevs_discovered": 1, 00:23:01.619 "num_base_bdevs_operational": 4, 00:23:01.619 "base_bdevs_list": [ 00:23:01.619 { 00:23:01.619 "name": "pt1", 00:23:01.619 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:01.619 "is_configured": true, 00:23:01.619 "data_offset": 2048, 00:23:01.619 "data_size": 63488 00:23:01.619 }, 00:23:01.619 { 00:23:01.619 "name": null, 00:23:01.619 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:01.619 "is_configured": false, 00:23:01.619 "data_offset": 2048, 00:23:01.619 "data_size": 63488 00:23:01.619 }, 00:23:01.619 { 00:23:01.619 "name": null, 00:23:01.619 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:01.619 "is_configured": false, 00:23:01.619 "data_offset": 2048, 00:23:01.619 "data_size": 63488 00:23:01.619 }, 00:23:01.619 { 00:23:01.619 "name": null, 00:23:01.619 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:01.619 "is_configured": false, 00:23:01.619 "data_offset": 2048, 00:23:01.619 "data_size": 63488 00:23:01.619 } 00:23:01.619 ] 00:23:01.619 }' 00:23:01.619 22:51:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:01.619 22:51:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:23:02.187 22:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:23:02.187 22:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:02.446 [2024-07-15 22:51:47.292303] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:02.446 [2024-07-15 22:51:47.292356] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:02.446 [2024-07-15 22:51:47.292377] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1453940 00:23:02.446 [2024-07-15 22:51:47.292390] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:02.446 [2024-07-15 22:51:47.292765] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:02.446 [2024-07-15 22:51:47.292783] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:02.446 [2024-07-15 22:51:47.292852] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:02.446 [2024-07-15 22:51:47.292871] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:02.446 pt2 00:23:02.446 22:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:02.706 [2024-07-15 22:51:47.532967] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:23:02.706 22:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:23:02.706 22:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:02.706 22:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:23:02.706 22:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:02.706 22:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:02.706 22:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:02.706 22:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:02.706 22:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:02.706 22:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:02.706 22:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:02.706 22:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:02.706 22:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:02.965 22:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:02.965 "name": "raid_bdev1", 00:23:02.965 "uuid": "0816399a-baed-4658-91a2-3c0fc970c6a6", 00:23:02.965 "strip_size_kb": 0, 00:23:02.965 "state": "configuring", 00:23:02.965 "raid_level": "raid1", 00:23:02.965 "superblock": true, 00:23:02.965 "num_base_bdevs": 4, 00:23:02.965 "num_base_bdevs_discovered": 1, 00:23:02.965 "num_base_bdevs_operational": 4, 00:23:02.965 "base_bdevs_list": [ 00:23:02.965 { 00:23:02.965 "name": "pt1", 00:23:02.965 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:02.965 "is_configured": true, 00:23:02.965 "data_offset": 2048, 00:23:02.965 "data_size": 63488 00:23:02.965 }, 00:23:02.965 { 00:23:02.965 "name": null, 00:23:02.965 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:02.965 "is_configured": false, 00:23:02.965 "data_offset": 2048, 00:23:02.965 
"data_size": 63488 00:23:02.965 }, 00:23:02.965 { 00:23:02.965 "name": null, 00:23:02.965 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:02.965 "is_configured": false, 00:23:02.965 "data_offset": 2048, 00:23:02.965 "data_size": 63488 00:23:02.965 }, 00:23:02.965 { 00:23:02.965 "name": null, 00:23:02.965 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:02.965 "is_configured": false, 00:23:02.965 "data_offset": 2048, 00:23:02.965 "data_size": 63488 00:23:02.965 } 00:23:02.965 ] 00:23:02.965 }' 00:23:02.965 22:51:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:02.965 22:51:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:03.903 22:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:23:03.903 22:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:03.903 22:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:03.903 [2024-07-15 22:51:48.675993] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:03.903 [2024-07-15 22:51:48.676046] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:03.903 [2024-07-15 22:51:48.676065] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12b1060 00:23:03.903 [2024-07-15 22:51:48.676078] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:03.903 [2024-07-15 22:51:48.676454] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:03.903 [2024-07-15 22:51:48.676472] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:03.903 [2024-07-15 22:51:48.676541] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev 
pt2 00:23:03.903 [2024-07-15 22:51:48.676561] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:03.903 pt2 00:23:03.903 22:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:23:03.903 22:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:03.903 22:51:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:23:04.473 [2024-07-15 22:51:49.193363] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:23:04.473 [2024-07-15 22:51:49.193408] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:04.473 [2024-07-15 22:51:49.193429] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12b38d0 00:23:04.473 [2024-07-15 22:51:49.193442] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:04.473 [2024-07-15 22:51:49.193796] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:04.473 [2024-07-15 22:51:49.193814] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:23:04.473 [2024-07-15 22:51:49.193876] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:23:04.473 [2024-07-15 22:51:49.193894] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:23:04.473 pt3 00:23:04.473 22:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:23:04.473 22:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:04.473 22:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 
00000000-0000-0000-0000-000000000004 00:23:05.061 [2024-07-15 22:51:49.710735] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:23:05.061 [2024-07-15 22:51:49.710777] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:05.061 [2024-07-15 22:51:49.710795] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12b4b80 00:23:05.061 [2024-07-15 22:51:49.710807] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:05.061 [2024-07-15 22:51:49.711169] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:05.061 [2024-07-15 22:51:49.711188] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:23:05.061 [2024-07-15 22:51:49.711251] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:23:05.061 [2024-07-15 22:51:49.711270] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:23:05.061 [2024-07-15 22:51:49.711402] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12b1780 00:23:05.061 [2024-07-15 22:51:49.711413] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:05.061 [2024-07-15 22:51:49.711592] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12b6fa0 00:23:05.061 [2024-07-15 22:51:49.711740] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12b1780 00:23:05.061 [2024-07-15 22:51:49.711751] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12b1780 00:23:05.061 [2024-07-15 22:51:49.711855] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:05.061 pt4 00:23:05.061 22:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:23:05.061 22:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:05.061 22:51:49 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:05.061 22:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:05.061 22:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:05.061 22:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:05.061 22:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:05.061 22:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:05.061 22:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:05.061 22:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:05.061 22:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:05.061 22:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:05.061 22:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:05.061 22:51:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:05.319 22:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:05.319 "name": "raid_bdev1", 00:23:05.319 "uuid": "0816399a-baed-4658-91a2-3c0fc970c6a6", 00:23:05.319 "strip_size_kb": 0, 00:23:05.319 "state": "online", 00:23:05.319 "raid_level": "raid1", 00:23:05.319 "superblock": true, 00:23:05.319 "num_base_bdevs": 4, 00:23:05.319 "num_base_bdevs_discovered": 4, 00:23:05.319 "num_base_bdevs_operational": 4, 00:23:05.319 "base_bdevs_list": [ 00:23:05.319 { 00:23:05.319 "name": "pt1", 00:23:05.319 "uuid": "00000000-0000-0000-0000-000000000001", 
00:23:05.319 "is_configured": true, 00:23:05.319 "data_offset": 2048, 00:23:05.319 "data_size": 63488 00:23:05.319 }, 00:23:05.319 { 00:23:05.319 "name": "pt2", 00:23:05.319 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:05.319 "is_configured": true, 00:23:05.319 "data_offset": 2048, 00:23:05.319 "data_size": 63488 00:23:05.319 }, 00:23:05.319 { 00:23:05.319 "name": "pt3", 00:23:05.319 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:05.319 "is_configured": true, 00:23:05.319 "data_offset": 2048, 00:23:05.319 "data_size": 63488 00:23:05.319 }, 00:23:05.319 { 00:23:05.319 "name": "pt4", 00:23:05.319 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:05.319 "is_configured": true, 00:23:05.319 "data_offset": 2048, 00:23:05.319 "data_size": 63488 00:23:05.319 } 00:23:05.319 ] 00:23:05.319 }' 00:23:05.319 22:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:05.319 22:51:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:05.882 22:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:23:05.882 22:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:23:05.882 22:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:05.882 22:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:05.882 22:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:05.882 22:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:23:05.882 22:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:05.882 22:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:06.140 [2024-07-15 22:51:50.825990] 
bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:06.140 22:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:06.140 "name": "raid_bdev1", 00:23:06.140 "aliases": [ 00:23:06.140 "0816399a-baed-4658-91a2-3c0fc970c6a6" 00:23:06.140 ], 00:23:06.140 "product_name": "Raid Volume", 00:23:06.140 "block_size": 512, 00:23:06.140 "num_blocks": 63488, 00:23:06.140 "uuid": "0816399a-baed-4658-91a2-3c0fc970c6a6", 00:23:06.140 "assigned_rate_limits": { 00:23:06.140 "rw_ios_per_sec": 0, 00:23:06.140 "rw_mbytes_per_sec": 0, 00:23:06.140 "r_mbytes_per_sec": 0, 00:23:06.140 "w_mbytes_per_sec": 0 00:23:06.140 }, 00:23:06.140 "claimed": false, 00:23:06.140 "zoned": false, 00:23:06.140 "supported_io_types": { 00:23:06.140 "read": true, 00:23:06.140 "write": true, 00:23:06.140 "unmap": false, 00:23:06.140 "flush": false, 00:23:06.140 "reset": true, 00:23:06.140 "nvme_admin": false, 00:23:06.140 "nvme_io": false, 00:23:06.140 "nvme_io_md": false, 00:23:06.140 "write_zeroes": true, 00:23:06.140 "zcopy": false, 00:23:06.140 "get_zone_info": false, 00:23:06.140 "zone_management": false, 00:23:06.140 "zone_append": false, 00:23:06.140 "compare": false, 00:23:06.140 "compare_and_write": false, 00:23:06.140 "abort": false, 00:23:06.140 "seek_hole": false, 00:23:06.140 "seek_data": false, 00:23:06.140 "copy": false, 00:23:06.140 "nvme_iov_md": false 00:23:06.140 }, 00:23:06.140 "memory_domains": [ 00:23:06.140 { 00:23:06.140 "dma_device_id": "system", 00:23:06.140 "dma_device_type": 1 00:23:06.140 }, 00:23:06.140 { 00:23:06.140 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:06.140 "dma_device_type": 2 00:23:06.140 }, 00:23:06.140 { 00:23:06.140 "dma_device_id": "system", 00:23:06.140 "dma_device_type": 1 00:23:06.140 }, 00:23:06.140 { 00:23:06.140 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:06.140 "dma_device_type": 2 00:23:06.140 }, 00:23:06.140 { 00:23:06.140 "dma_device_id": "system", 00:23:06.141 
"dma_device_type": 1 00:23:06.141 }, 00:23:06.141 { 00:23:06.141 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:06.141 "dma_device_type": 2 00:23:06.141 }, 00:23:06.141 { 00:23:06.141 "dma_device_id": "system", 00:23:06.141 "dma_device_type": 1 00:23:06.141 }, 00:23:06.141 { 00:23:06.141 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:06.141 "dma_device_type": 2 00:23:06.141 } 00:23:06.141 ], 00:23:06.141 "driver_specific": { 00:23:06.141 "raid": { 00:23:06.141 "uuid": "0816399a-baed-4658-91a2-3c0fc970c6a6", 00:23:06.141 "strip_size_kb": 0, 00:23:06.141 "state": "online", 00:23:06.141 "raid_level": "raid1", 00:23:06.141 "superblock": true, 00:23:06.141 "num_base_bdevs": 4, 00:23:06.141 "num_base_bdevs_discovered": 4, 00:23:06.141 "num_base_bdevs_operational": 4, 00:23:06.141 "base_bdevs_list": [ 00:23:06.141 { 00:23:06.141 "name": "pt1", 00:23:06.141 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:06.141 "is_configured": true, 00:23:06.141 "data_offset": 2048, 00:23:06.141 "data_size": 63488 00:23:06.141 }, 00:23:06.141 { 00:23:06.141 "name": "pt2", 00:23:06.141 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:06.141 "is_configured": true, 00:23:06.141 "data_offset": 2048, 00:23:06.141 "data_size": 63488 00:23:06.141 }, 00:23:06.141 { 00:23:06.141 "name": "pt3", 00:23:06.141 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:06.141 "is_configured": true, 00:23:06.141 "data_offset": 2048, 00:23:06.141 "data_size": 63488 00:23:06.141 }, 00:23:06.141 { 00:23:06.141 "name": "pt4", 00:23:06.141 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:06.141 "is_configured": true, 00:23:06.141 "data_offset": 2048, 00:23:06.141 "data_size": 63488 00:23:06.141 } 00:23:06.141 ] 00:23:06.141 } 00:23:06.141 } 00:23:06.141 }' 00:23:06.141 22:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:06.141 22:51:50 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:23:06.141 pt2 00:23:06.141 pt3 00:23:06.141 pt4' 00:23:06.141 22:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:06.141 22:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:06.141 22:51:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:06.399 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:06.399 "name": "pt1", 00:23:06.399 "aliases": [ 00:23:06.399 "00000000-0000-0000-0000-000000000001" 00:23:06.399 ], 00:23:06.399 "product_name": "passthru", 00:23:06.399 "block_size": 512, 00:23:06.399 "num_blocks": 65536, 00:23:06.399 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:06.399 "assigned_rate_limits": { 00:23:06.399 "rw_ios_per_sec": 0, 00:23:06.399 "rw_mbytes_per_sec": 0, 00:23:06.399 "r_mbytes_per_sec": 0, 00:23:06.399 "w_mbytes_per_sec": 0 00:23:06.399 }, 00:23:06.399 "claimed": true, 00:23:06.399 "claim_type": "exclusive_write", 00:23:06.399 "zoned": false, 00:23:06.399 "supported_io_types": { 00:23:06.399 "read": true, 00:23:06.399 "write": true, 00:23:06.399 "unmap": true, 00:23:06.399 "flush": true, 00:23:06.399 "reset": true, 00:23:06.399 "nvme_admin": false, 00:23:06.399 "nvme_io": false, 00:23:06.399 "nvme_io_md": false, 00:23:06.399 "write_zeroes": true, 00:23:06.399 "zcopy": true, 00:23:06.399 "get_zone_info": false, 00:23:06.399 "zone_management": false, 00:23:06.399 "zone_append": false, 00:23:06.399 "compare": false, 00:23:06.399 "compare_and_write": false, 00:23:06.399 "abort": true, 00:23:06.399 "seek_hole": false, 00:23:06.399 "seek_data": false, 00:23:06.399 "copy": true, 00:23:06.399 "nvme_iov_md": false 00:23:06.399 }, 00:23:06.399 "memory_domains": [ 00:23:06.399 { 00:23:06.399 "dma_device_id": "system", 00:23:06.399 
"dma_device_type": 1 00:23:06.399 }, 00:23:06.399 { 00:23:06.399 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:06.399 "dma_device_type": 2 00:23:06.399 } 00:23:06.399 ], 00:23:06.399 "driver_specific": { 00:23:06.399 "passthru": { 00:23:06.399 "name": "pt1", 00:23:06.399 "base_bdev_name": "malloc1" 00:23:06.399 } 00:23:06.399 } 00:23:06.399 }' 00:23:06.399 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:06.399 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:06.399 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:06.399 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:06.399 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:06.657 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:06.657 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:06.657 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:06.657 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:06.657 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:06.657 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:06.657 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:06.657 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:06.657 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:06.657 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:06.915 22:51:51 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:06.915 "name": "pt2", 00:23:06.915 "aliases": [ 00:23:06.915 "00000000-0000-0000-0000-000000000002" 00:23:06.915 ], 00:23:06.915 "product_name": "passthru", 00:23:06.915 "block_size": 512, 00:23:06.915 "num_blocks": 65536, 00:23:06.915 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:06.915 "assigned_rate_limits": { 00:23:06.915 "rw_ios_per_sec": 0, 00:23:06.915 "rw_mbytes_per_sec": 0, 00:23:06.915 "r_mbytes_per_sec": 0, 00:23:06.915 "w_mbytes_per_sec": 0 00:23:06.915 }, 00:23:06.915 "claimed": true, 00:23:06.915 "claim_type": "exclusive_write", 00:23:06.915 "zoned": false, 00:23:06.915 "supported_io_types": { 00:23:06.915 "read": true, 00:23:06.915 "write": true, 00:23:06.915 "unmap": true, 00:23:06.915 "flush": true, 00:23:06.915 "reset": true, 00:23:06.915 "nvme_admin": false, 00:23:06.915 "nvme_io": false, 00:23:06.915 "nvme_io_md": false, 00:23:06.915 "write_zeroes": true, 00:23:06.915 "zcopy": true, 00:23:06.915 "get_zone_info": false, 00:23:06.915 "zone_management": false, 00:23:06.915 "zone_append": false, 00:23:06.915 "compare": false, 00:23:06.915 "compare_and_write": false, 00:23:06.915 "abort": true, 00:23:06.915 "seek_hole": false, 00:23:06.915 "seek_data": false, 00:23:06.915 "copy": true, 00:23:06.915 "nvme_iov_md": false 00:23:06.915 }, 00:23:06.915 "memory_domains": [ 00:23:06.915 { 00:23:06.915 "dma_device_id": "system", 00:23:06.915 "dma_device_type": 1 00:23:06.915 }, 00:23:06.915 { 00:23:06.915 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:06.915 "dma_device_type": 2 00:23:06.915 } 00:23:06.915 ], 00:23:06.915 "driver_specific": { 00:23:06.915 "passthru": { 00:23:06.915 "name": "pt2", 00:23:06.915 "base_bdev_name": "malloc2" 00:23:06.915 } 00:23:06.915 } 00:23:06.915 }' 00:23:06.916 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:06.916 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:06.916 22:51:51 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:06.916 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:06.916 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:07.173 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:07.173 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:07.173 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:07.173 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:07.173 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:07.173 22:51:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:07.173 22:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:07.173 22:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:07.173 22:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:23:07.173 22:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:07.738 22:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:07.738 "name": "pt3", 00:23:07.738 "aliases": [ 00:23:07.738 "00000000-0000-0000-0000-000000000003" 00:23:07.738 ], 00:23:07.738 "product_name": "passthru", 00:23:07.738 "block_size": 512, 00:23:07.738 "num_blocks": 65536, 00:23:07.738 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:07.738 "assigned_rate_limits": { 00:23:07.738 "rw_ios_per_sec": 0, 00:23:07.738 "rw_mbytes_per_sec": 0, 00:23:07.738 "r_mbytes_per_sec": 0, 00:23:07.738 "w_mbytes_per_sec": 0 00:23:07.738 }, 00:23:07.738 "claimed": true, 00:23:07.738 
"claim_type": "exclusive_write", 00:23:07.738 "zoned": false, 00:23:07.738 "supported_io_types": { 00:23:07.738 "read": true, 00:23:07.738 "write": true, 00:23:07.738 "unmap": true, 00:23:07.738 "flush": true, 00:23:07.738 "reset": true, 00:23:07.738 "nvme_admin": false, 00:23:07.738 "nvme_io": false, 00:23:07.738 "nvme_io_md": false, 00:23:07.738 "write_zeroes": true, 00:23:07.738 "zcopy": true, 00:23:07.738 "get_zone_info": false, 00:23:07.738 "zone_management": false, 00:23:07.738 "zone_append": false, 00:23:07.738 "compare": false, 00:23:07.738 "compare_and_write": false, 00:23:07.738 "abort": true, 00:23:07.738 "seek_hole": false, 00:23:07.738 "seek_data": false, 00:23:07.738 "copy": true, 00:23:07.738 "nvme_iov_md": false 00:23:07.738 }, 00:23:07.738 "memory_domains": [ 00:23:07.738 { 00:23:07.738 "dma_device_id": "system", 00:23:07.738 "dma_device_type": 1 00:23:07.738 }, 00:23:07.738 { 00:23:07.738 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:07.738 "dma_device_type": 2 00:23:07.738 } 00:23:07.738 ], 00:23:07.738 "driver_specific": { 00:23:07.738 "passthru": { 00:23:07.738 "name": "pt3", 00:23:07.738 "base_bdev_name": "malloc3" 00:23:07.738 } 00:23:07.738 } 00:23:07.738 }' 00:23:07.738 22:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:07.738 22:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:07.995 22:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:07.995 22:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:07.996 22:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:07.996 22:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:07.996 22:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:07.996 22:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:23:07.996 22:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:07.996 22:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:07.996 22:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:08.254 22:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:08.254 22:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:08.254 22:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:23:08.254 22:51:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:08.513 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:08.513 "name": "pt4", 00:23:08.513 "aliases": [ 00:23:08.513 "00000000-0000-0000-0000-000000000004" 00:23:08.513 ], 00:23:08.513 "product_name": "passthru", 00:23:08.513 "block_size": 512, 00:23:08.513 "num_blocks": 65536, 00:23:08.513 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:08.513 "assigned_rate_limits": { 00:23:08.513 "rw_ios_per_sec": 0, 00:23:08.513 "rw_mbytes_per_sec": 0, 00:23:08.513 "r_mbytes_per_sec": 0, 00:23:08.513 "w_mbytes_per_sec": 0 00:23:08.513 }, 00:23:08.513 "claimed": true, 00:23:08.513 "claim_type": "exclusive_write", 00:23:08.513 "zoned": false, 00:23:08.513 "supported_io_types": { 00:23:08.513 "read": true, 00:23:08.513 "write": true, 00:23:08.513 "unmap": true, 00:23:08.513 "flush": true, 00:23:08.513 "reset": true, 00:23:08.513 "nvme_admin": false, 00:23:08.513 "nvme_io": false, 00:23:08.513 "nvme_io_md": false, 00:23:08.513 "write_zeroes": true, 00:23:08.513 "zcopy": true, 00:23:08.513 "get_zone_info": false, 00:23:08.513 "zone_management": false, 00:23:08.513 "zone_append": false, 00:23:08.513 "compare": false, 00:23:08.513 
"compare_and_write": false, 00:23:08.513 "abort": true, 00:23:08.513 "seek_hole": false, 00:23:08.513 "seek_data": false, 00:23:08.513 "copy": true, 00:23:08.513 "nvme_iov_md": false 00:23:08.513 }, 00:23:08.513 "memory_domains": [ 00:23:08.513 { 00:23:08.513 "dma_device_id": "system", 00:23:08.513 "dma_device_type": 1 00:23:08.513 }, 00:23:08.513 { 00:23:08.513 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:08.513 "dma_device_type": 2 00:23:08.513 } 00:23:08.513 ], 00:23:08.513 "driver_specific": { 00:23:08.513 "passthru": { 00:23:08.513 "name": "pt4", 00:23:08.513 "base_bdev_name": "malloc4" 00:23:08.513 } 00:23:08.513 } 00:23:08.513 }' 00:23:08.513 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:08.513 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:08.513 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:08.513 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:08.513 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:08.513 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:08.513 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:08.513 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:08.513 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:08.771 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:08.771 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:08.771 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:08.771 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:08.771 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:23:09.029 [2024-07-15 22:51:53.725669] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:09.029 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 0816399a-baed-4658-91a2-3c0fc970c6a6 '!=' 0816399a-baed-4658-91a2-3c0fc970c6a6 ']' 00:23:09.029 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:23:09.029 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:09.029 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:09.029 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:09.379 [2024-07-15 22:51:53.970033] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:23:09.379 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:09.379 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:09.380 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:09.380 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:09.380 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:09.380 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:09.380 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:09.380 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:09.380 22:51:53 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:09.380 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:09.380 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:09.380 22:51:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:09.639 22:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:09.639 "name": "raid_bdev1", 00:23:09.639 "uuid": "0816399a-baed-4658-91a2-3c0fc970c6a6", 00:23:09.639 "strip_size_kb": 0, 00:23:09.639 "state": "online", 00:23:09.639 "raid_level": "raid1", 00:23:09.639 "superblock": true, 00:23:09.639 "num_base_bdevs": 4, 00:23:09.639 "num_base_bdevs_discovered": 3, 00:23:09.639 "num_base_bdevs_operational": 3, 00:23:09.639 "base_bdevs_list": [ 00:23:09.639 { 00:23:09.639 "name": null, 00:23:09.639 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:09.639 "is_configured": false, 00:23:09.639 "data_offset": 2048, 00:23:09.639 "data_size": 63488 00:23:09.639 }, 00:23:09.639 { 00:23:09.639 "name": "pt2", 00:23:09.639 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:09.639 "is_configured": true, 00:23:09.639 "data_offset": 2048, 00:23:09.639 "data_size": 63488 00:23:09.639 }, 00:23:09.639 { 00:23:09.639 "name": "pt3", 00:23:09.639 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:09.639 "is_configured": true, 00:23:09.639 "data_offset": 2048, 00:23:09.639 "data_size": 63488 00:23:09.639 }, 00:23:09.639 { 00:23:09.639 "name": "pt4", 00:23:09.639 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:09.639 "is_configured": true, 00:23:09.639 "data_offset": 2048, 00:23:09.639 "data_size": 63488 00:23:09.639 } 00:23:09.639 ] 00:23:09.639 }' 00:23:09.639 22:51:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:09.639 
22:51:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:10.207 22:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:10.466 [2024-07-15 22:51:55.325578] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:10.466 [2024-07-15 22:51:55.325612] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:10.466 [2024-07-15 22:51:55.325674] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:10.466 [2024-07-15 22:51:55.325744] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:10.466 [2024-07-15 22:51:55.325757] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12b1780 name raid_bdev1, state offline 00:23:10.466 22:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:10.466 22:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:23:10.725 22:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:23:10.725 22:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:23:10.725 22:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:23:10.725 22:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:10.725 22:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:10.985 22:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:23:10.985 22:51:55 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:10.985 22:51:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:23:11.244 22:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:23:11.244 22:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:11.244 22:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:23:11.502 22:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:23:11.502 22:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:23:11.502 22:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:23:11.503 22:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:23:11.503 22:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:11.761 [2024-07-15 22:51:56.524696] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:11.761 [2024-07-15 22:51:56.524747] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:11.761 [2024-07-15 22:51:56.524767] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1454700 00:23:11.761 [2024-07-15 22:51:56.524780] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:11.761 [2024-07-15 22:51:56.526634] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:11.761 [2024-07-15 22:51:56.526666] vbdev_passthru.c: 709:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: pt2 00:23:11.761 [2024-07-15 22:51:56.526739] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:11.761 [2024-07-15 22:51:56.526767] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:11.761 pt2 00:23:11.761 22:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:23:11.761 22:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:11.761 22:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:11.761 22:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:11.761 22:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:11.761 22:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:11.761 22:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:11.761 22:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:11.761 22:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:11.761 22:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:11.761 22:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:11.761 22:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:12.020 22:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:12.020 "name": "raid_bdev1", 00:23:12.020 "uuid": "0816399a-baed-4658-91a2-3c0fc970c6a6", 00:23:12.020 "strip_size_kb": 0, 00:23:12.020 "state": "configuring", 
00:23:12.020 "raid_level": "raid1", 00:23:12.020 "superblock": true, 00:23:12.020 "num_base_bdevs": 4, 00:23:12.020 "num_base_bdevs_discovered": 1, 00:23:12.020 "num_base_bdevs_operational": 3, 00:23:12.020 "base_bdevs_list": [ 00:23:12.020 { 00:23:12.020 "name": null, 00:23:12.020 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:12.020 "is_configured": false, 00:23:12.020 "data_offset": 2048, 00:23:12.020 "data_size": 63488 00:23:12.020 }, 00:23:12.020 { 00:23:12.020 "name": "pt2", 00:23:12.020 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:12.020 "is_configured": true, 00:23:12.021 "data_offset": 2048, 00:23:12.021 "data_size": 63488 00:23:12.021 }, 00:23:12.021 { 00:23:12.021 "name": null, 00:23:12.021 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:12.021 "is_configured": false, 00:23:12.021 "data_offset": 2048, 00:23:12.021 "data_size": 63488 00:23:12.021 }, 00:23:12.021 { 00:23:12.021 "name": null, 00:23:12.021 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:12.021 "is_configured": false, 00:23:12.021 "data_offset": 2048, 00:23:12.021 "data_size": 63488 00:23:12.021 } 00:23:12.021 ] 00:23:12.021 }' 00:23:12.021 22:51:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:12.021 22:51:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:12.589 22:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:23:12.589 22:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:23:12.590 22:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:23:13.159 [2024-07-15 22:51:57.908388] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:23:13.159 [2024-07-15 22:51:57.908442] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:13.159 [2024-07-15 22:51:57.908465] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12baa10 00:23:13.159 [2024-07-15 22:51:57.908478] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:13.159 [2024-07-15 22:51:57.908873] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:13.159 [2024-07-15 22:51:57.908891] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:23:13.159 [2024-07-15 22:51:57.908971] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:23:13.159 [2024-07-15 22:51:57.908991] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:23:13.159 pt3 00:23:13.159 22:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:23:13.159 22:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:13.159 22:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:13.159 22:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:13.159 22:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:13.159 22:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:13.159 22:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:13.159 22:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:13.159 22:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:13.159 22:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:13.159 22:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.159 22:51:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:13.727 22:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:13.727 "name": "raid_bdev1", 00:23:13.727 "uuid": "0816399a-baed-4658-91a2-3c0fc970c6a6", 00:23:13.727 "strip_size_kb": 0, 00:23:13.727 "state": "configuring", 00:23:13.727 "raid_level": "raid1", 00:23:13.727 "superblock": true, 00:23:13.727 "num_base_bdevs": 4, 00:23:13.727 "num_base_bdevs_discovered": 2, 00:23:13.727 "num_base_bdevs_operational": 3, 00:23:13.727 "base_bdevs_list": [ 00:23:13.727 { 00:23:13.727 "name": null, 00:23:13.727 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:13.727 "is_configured": false, 00:23:13.727 "data_offset": 2048, 00:23:13.727 "data_size": 63488 00:23:13.727 }, 00:23:13.727 { 00:23:13.727 "name": "pt2", 00:23:13.727 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:13.727 "is_configured": true, 00:23:13.727 "data_offset": 2048, 00:23:13.727 "data_size": 63488 00:23:13.727 }, 00:23:13.727 { 00:23:13.727 "name": "pt3", 00:23:13.727 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:13.727 "is_configured": true, 00:23:13.727 "data_offset": 2048, 00:23:13.727 "data_size": 63488 00:23:13.727 }, 00:23:13.727 { 00:23:13.727 "name": null, 00:23:13.727 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:13.727 "is_configured": false, 00:23:13.727 "data_offset": 2048, 00:23:13.727 "data_size": 63488 00:23:13.727 } 00:23:13.727 ] 00:23:13.727 }' 00:23:13.727 22:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:13.727 22:51:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:14.294 22:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:23:14.294 22:51:59 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:23:14.294 22:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:23:14.294 22:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:23:14.553 [2024-07-15 22:51:59.280040] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:23:14.553 [2024-07-15 22:51:59.280093] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:14.553 [2024-07-15 22:51:59.280112] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x145d520 00:23:14.553 [2024-07-15 22:51:59.280125] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:14.553 [2024-07-15 22:51:59.280503] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:14.553 [2024-07-15 22:51:59.280521] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:23:14.553 [2024-07-15 22:51:59.280590] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:23:14.553 [2024-07-15 22:51:59.280609] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:23:14.553 [2024-07-15 22:51:59.280725] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12b1ea0 00:23:14.553 [2024-07-15 22:51:59.280736] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:14.553 [2024-07-15 22:51:59.280911] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12b6600 00:23:14.553 [2024-07-15 22:51:59.281062] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12b1ea0 00:23:14.553 [2024-07-15 22:51:59.281073] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 
0x12b1ea0 00:23:14.553 [2024-07-15 22:51:59.281178] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:14.553 pt4 00:23:14.553 22:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:14.553 22:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:14.553 22:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:14.553 22:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:14.553 22:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:14.553 22:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:14.553 22:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:14.553 22:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:14.553 22:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:14.553 22:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:14.553 22:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.553 22:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:15.121 22:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:15.121 "name": "raid_bdev1", 00:23:15.121 "uuid": "0816399a-baed-4658-91a2-3c0fc970c6a6", 00:23:15.122 "strip_size_kb": 0, 00:23:15.122 "state": "online", 00:23:15.122 "raid_level": "raid1", 00:23:15.122 "superblock": true, 00:23:15.122 "num_base_bdevs": 4, 00:23:15.122 "num_base_bdevs_discovered": 3, 00:23:15.122 
"num_base_bdevs_operational": 3, 00:23:15.122 "base_bdevs_list": [ 00:23:15.122 { 00:23:15.122 "name": null, 00:23:15.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:15.122 "is_configured": false, 00:23:15.122 "data_offset": 2048, 00:23:15.122 "data_size": 63488 00:23:15.122 }, 00:23:15.122 { 00:23:15.122 "name": "pt2", 00:23:15.122 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:15.122 "is_configured": true, 00:23:15.122 "data_offset": 2048, 00:23:15.122 "data_size": 63488 00:23:15.122 }, 00:23:15.122 { 00:23:15.122 "name": "pt3", 00:23:15.122 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:15.122 "is_configured": true, 00:23:15.122 "data_offset": 2048, 00:23:15.122 "data_size": 63488 00:23:15.122 }, 00:23:15.122 { 00:23:15.122 "name": "pt4", 00:23:15.122 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:15.122 "is_configured": true, 00:23:15.122 "data_offset": 2048, 00:23:15.122 "data_size": 63488 00:23:15.122 } 00:23:15.122 ] 00:23:15.122 }' 00:23:15.122 22:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:15.122 22:51:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:15.689 22:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:15.948 [2024-07-15 22:52:00.667735] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:15.948 [2024-07-15 22:52:00.667765] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:15.948 [2024-07-15 22:52:00.667824] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:15.948 [2024-07-15 22:52:00.667895] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:15.948 [2024-07-15 22:52:00.667907] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: 
raid_bdev_cleanup, 0x12b1ea0 name raid_bdev1, state offline 00:23:15.948 22:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:15.948 22:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:23:16.206 22:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:23:16.206 22:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:23:16.206 22:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:23:16.206 22:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:23:16.206 22:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:23:16.465 22:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:16.724 [2024-07-15 22:52:01.417676] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:16.724 [2024-07-15 22:52:01.417725] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:16.724 [2024-07-15 22:52:01.417743] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x145d520 00:23:16.724 [2024-07-15 22:52:01.417756] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:16.724 [2024-07-15 22:52:01.419600] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:16.724 [2024-07-15 22:52:01.419629] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:16.724 [2024-07-15 22:52:01.419695] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt1 00:23:16.724 [2024-07-15 22:52:01.419723] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:16.724 [2024-07-15 22:52:01.419830] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:23:16.724 [2024-07-15 22:52:01.419844] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:16.724 [2024-07-15 22:52:01.419858] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12b1060 name raid_bdev1, state configuring 00:23:16.724 [2024-07-15 22:52:01.419884] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:16.724 [2024-07-15 22:52:01.419975] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:23:16.724 pt1 00:23:16.724 22:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 00:23:16.724 22:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:23:16.724 22:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:16.724 22:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:16.724 22:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:16.724 22:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:16.724 22:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:16.724 22:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:16.724 22:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:16.724 22:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:16.724 22:52:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:16.724 22:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:16.724 22:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:16.983 22:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:16.983 "name": "raid_bdev1", 00:23:16.983 "uuid": "0816399a-baed-4658-91a2-3c0fc970c6a6", 00:23:16.983 "strip_size_kb": 0, 00:23:16.983 "state": "configuring", 00:23:16.983 "raid_level": "raid1", 00:23:16.983 "superblock": true, 00:23:16.983 "num_base_bdevs": 4, 00:23:16.983 "num_base_bdevs_discovered": 2, 00:23:16.983 "num_base_bdevs_operational": 3, 00:23:16.983 "base_bdevs_list": [ 00:23:16.983 { 00:23:16.983 "name": null, 00:23:16.983 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:16.983 "is_configured": false, 00:23:16.983 "data_offset": 2048, 00:23:16.983 "data_size": 63488 00:23:16.983 }, 00:23:16.983 { 00:23:16.983 "name": "pt2", 00:23:16.983 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:16.983 "is_configured": true, 00:23:16.983 "data_offset": 2048, 00:23:16.983 "data_size": 63488 00:23:16.983 }, 00:23:16.983 { 00:23:16.983 "name": "pt3", 00:23:16.983 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:16.983 "is_configured": true, 00:23:16.983 "data_offset": 2048, 00:23:16.983 "data_size": 63488 00:23:16.983 }, 00:23:16.983 { 00:23:16.983 "name": null, 00:23:16.983 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:16.983 "is_configured": false, 00:23:16.983 "data_offset": 2048, 00:23:16.983 "data_size": 63488 00:23:16.983 } 00:23:16.983 ] 00:23:16.983 }' 00:23:16.983 22:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:16.983 22:52:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:23:17.550 22:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:23:17.550 22:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:23:17.807 22:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:23:17.807 22:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:23:18.066 [2024-07-15 22:52:02.721159] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:23:18.066 [2024-07-15 22:52:02.721216] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:18.066 [2024-07-15 22:52:02.721236] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12b1310 00:23:18.066 [2024-07-15 22:52:02.721249] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:18.066 [2024-07-15 22:52:02.721627] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:18.066 [2024-07-15 22:52:02.721645] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:23:18.066 [2024-07-15 22:52:02.721712] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:23:18.066 [2024-07-15 22:52:02.721732] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:23:18.066 [2024-07-15 22:52:02.721850] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12b4b40 00:23:18.066 [2024-07-15 22:52:02.721861] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:18.066 [2024-07-15 22:52:02.722048] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x1454990 00:23:18.066 [2024-07-15 22:52:02.722189] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12b4b40 00:23:18.066 [2024-07-15 22:52:02.722205] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12b4b40 00:23:18.066 [2024-07-15 22:52:02.722312] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:18.066 pt4 00:23:18.066 22:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:18.066 22:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:18.066 22:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:18.066 22:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:18.066 22:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:18.066 22:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:18.066 22:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:18.066 22:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:18.066 22:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:18.066 22:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:18.066 22:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:18.066 22:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:18.324 22:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:18.324 "name": "raid_bdev1", 
00:23:18.324 "uuid": "0816399a-baed-4658-91a2-3c0fc970c6a6", 00:23:18.324 "strip_size_kb": 0, 00:23:18.324 "state": "online", 00:23:18.324 "raid_level": "raid1", 00:23:18.325 "superblock": true, 00:23:18.325 "num_base_bdevs": 4, 00:23:18.325 "num_base_bdevs_discovered": 3, 00:23:18.325 "num_base_bdevs_operational": 3, 00:23:18.325 "base_bdevs_list": [ 00:23:18.325 { 00:23:18.325 "name": null, 00:23:18.325 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:18.325 "is_configured": false, 00:23:18.325 "data_offset": 2048, 00:23:18.325 "data_size": 63488 00:23:18.325 }, 00:23:18.325 { 00:23:18.325 "name": "pt2", 00:23:18.325 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:18.325 "is_configured": true, 00:23:18.325 "data_offset": 2048, 00:23:18.325 "data_size": 63488 00:23:18.325 }, 00:23:18.325 { 00:23:18.325 "name": "pt3", 00:23:18.325 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:18.325 "is_configured": true, 00:23:18.325 "data_offset": 2048, 00:23:18.325 "data_size": 63488 00:23:18.325 }, 00:23:18.325 { 00:23:18.325 "name": "pt4", 00:23:18.325 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:18.325 "is_configured": true, 00:23:18.325 "data_offset": 2048, 00:23:18.325 "data_size": 63488 00:23:18.325 } 00:23:18.325 ] 00:23:18.325 }' 00:23:18.325 22:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:18.325 22:52:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:18.892 22:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:23:18.892 22:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:23:19.151 22:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:23:19.151 22:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:19.151 22:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:23:19.151 [2024-07-15 22:52:04.032936] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:19.151 22:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 0816399a-baed-4658-91a2-3c0fc970c6a6 '!=' 0816399a-baed-4658-91a2-3c0fc970c6a6 ']' 00:23:19.151 22:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2802023 00:23:19.151 22:52:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2802023 ']' 00:23:19.151 22:52:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2802023 00:23:19.151 22:52:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:23:19.151 22:52:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:19.151 22:52:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2802023 00:23:19.411 22:52:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:19.411 22:52:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:19.411 22:52:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2802023' 00:23:19.411 killing process with pid 2802023 00:23:19.411 22:52:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2802023 00:23:19.411 [2024-07-15 22:52:04.097995] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:19.411 [2024-07-15 22:52:04.098053] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:19.411 [2024-07-15 22:52:04.098129] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: 
raid bdev base bdevs is 0, going to free all in destruct 00:23:19.411 [2024-07-15 22:52:04.098141] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12b4b40 name raid_bdev1, state offline 00:23:19.411 22:52:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2802023 00:23:19.411 [2024-07-15 22:52:04.185040] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:19.980 22:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:23:19.980 00:23:19.980 real 0m28.516s 00:23:19.980 user 0m52.136s 00:23:19.980 sys 0m4.968s 00:23:19.980 22:52:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:19.980 22:52:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:19.980 ************************************ 00:23:19.980 END TEST raid_superblock_test 00:23:19.980 ************************************ 00:23:19.980 22:52:04 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:19.980 22:52:04 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:23:19.980 22:52:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:23:19.980 22:52:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:19.980 22:52:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:19.980 ************************************ 00:23:19.980 START TEST raid_read_error_test 00:23:19.980 ************************************ 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 read 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:23:19.980 
22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:23:19.980 22:52:04 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.RAk4r4BZZM 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2806164 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2806164 /var/tmp/spdk-raid.sock 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2806164 ']' 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:19.980 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:19.980 22:52:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:19.980 [2024-07-15 22:52:04.751770] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:23:19.980 [2024-07-15 22:52:04.751841] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2806164 ] 00:23:19.980 [2024-07-15 22:52:04.881781] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:20.238 [2024-07-15 22:52:04.988464] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:20.238 [2024-07-15 22:52:05.048311] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:20.238 [2024-07-15 22:52:05.048347] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:20.806 22:52:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:20.806 22:52:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:23:20.806 22:52:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:20.806 22:52:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:21.064 BaseBdev1_malloc 00:23:21.064 22:52:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:23:21.322 true 00:23:21.322 22:52:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:23:21.581 [2024-07-15 22:52:06.422268] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:23:21.581 [2024-07-15 22:52:06.422312] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:21.581 [2024-07-15 22:52:06.422332] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19530d0 00:23:21.581 [2024-07-15 22:52:06.422350] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:21.581 [2024-07-15 22:52:06.424124] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:21.582 [2024-07-15 22:52:06.424155] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:21.582 BaseBdev1 00:23:21.582 22:52:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:21.582 22:52:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:21.840 BaseBdev2_malloc 00:23:21.840 22:52:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:23:22.106 true 00:23:22.106 22:52:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:23:22.371 [2024-07-15 22:52:07.196918] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:23:22.371 [2024-07-15 22:52:07.196968] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:22.371 [2024-07-15 22:52:07.196989] vbdev_passthru.c: 680:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0x1957910 00:23:22.371 [2024-07-15 22:52:07.197001] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:22.371 [2024-07-15 22:52:07.198484] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:22.372 [2024-07-15 22:52:07.198513] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:22.372 BaseBdev2 00:23:22.372 22:52:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:22.372 22:52:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:22.629 BaseBdev3_malloc 00:23:22.629 22:52:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:23:22.887 true 00:23:22.887 22:52:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:23:23.144 [2024-07-15 22:52:07.967619] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:23:23.145 [2024-07-15 22:52:07.967664] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:23.145 [2024-07-15 22:52:07.967683] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1959bd0 00:23:23.145 [2024-07-15 22:52:07.967696] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:23.145 [2024-07-15 22:52:07.969096] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:23.145 [2024-07-15 22:52:07.969125] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:23.145 
BaseBdev3 00:23:23.145 22:52:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:23.145 22:52:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:23.402 BaseBdev4_malloc 00:23:23.403 22:52:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:23:23.672 true 00:23:23.672 22:52:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:23:23.949 [2024-07-15 22:52:08.710125] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:23:23.949 [2024-07-15 22:52:08.710170] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:23.949 [2024-07-15 22:52:08.710197] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x195aaa0 00:23:23.949 [2024-07-15 22:52:08.710211] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:23.949 [2024-07-15 22:52:08.711737] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:23.949 [2024-07-15 22:52:08.711767] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:23.949 BaseBdev4 00:23:23.950 22:52:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:23:24.208 [2024-07-15 22:52:08.970852] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:24.208 [2024-07-15 
22:52:08.972165] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:24.208 [2024-07-15 22:52:08.972235] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:24.208 [2024-07-15 22:52:08.972296] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:24.208 [2024-07-15 22:52:08.972533] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1954c20 00:23:24.208 [2024-07-15 22:52:08.972544] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:24.208 [2024-07-15 22:52:08.972746] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17a9260 00:23:24.208 [2024-07-15 22:52:08.972909] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1954c20 00:23:24.208 [2024-07-15 22:52:08.972919] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1954c20 00:23:24.208 [2024-07-15 22:52:08.973036] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:24.208 22:52:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:24.208 22:52:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:24.208 22:52:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:24.208 22:52:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:24.208 22:52:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:24.208 22:52:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:24.208 22:52:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:24.208 22:52:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:23:24.208 22:52:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:24.208 22:52:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:24.208 22:52:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:24.208 22:52:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:24.467 22:52:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:24.467 "name": "raid_bdev1", 00:23:24.467 "uuid": "0e259370-a6ac-4543-8bb9-4e4d607df812", 00:23:24.467 "strip_size_kb": 0, 00:23:24.467 "state": "online", 00:23:24.467 "raid_level": "raid1", 00:23:24.467 "superblock": true, 00:23:24.467 "num_base_bdevs": 4, 00:23:24.467 "num_base_bdevs_discovered": 4, 00:23:24.467 "num_base_bdevs_operational": 4, 00:23:24.467 "base_bdevs_list": [ 00:23:24.467 { 00:23:24.467 "name": "BaseBdev1", 00:23:24.467 "uuid": "0cadc670-55a9-506c-9a46-82c559c26900", 00:23:24.467 "is_configured": true, 00:23:24.467 "data_offset": 2048, 00:23:24.467 "data_size": 63488 00:23:24.467 }, 00:23:24.467 { 00:23:24.467 "name": "BaseBdev2", 00:23:24.467 "uuid": "47682d18-7e16-58c4-82ef-454f737ae6a2", 00:23:24.467 "is_configured": true, 00:23:24.467 "data_offset": 2048, 00:23:24.467 "data_size": 63488 00:23:24.467 }, 00:23:24.467 { 00:23:24.467 "name": "BaseBdev3", 00:23:24.467 "uuid": "a76da420-6d05-5a36-841e-5fb089524b99", 00:23:24.467 "is_configured": true, 00:23:24.467 "data_offset": 2048, 00:23:24.467 "data_size": 63488 00:23:24.467 }, 00:23:24.467 { 00:23:24.467 "name": "BaseBdev4", 00:23:24.467 "uuid": "84bdeec8-72ac-57e1-a970-c07a2e286307", 00:23:24.467 "is_configured": true, 00:23:24.467 "data_offset": 2048, 00:23:24.467 "data_size": 63488 00:23:24.467 } 00:23:24.467 ] 00:23:24.467 }' 00:23:24.467 22:52:09 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:24.467 22:52:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:25.034 22:52:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:23:25.034 22:52:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:25.292 [2024-07-15 22:52:09.997841] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17a8c60 00:23:26.224 22:52:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:23:26.483 22:52:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:23:26.483 22:52:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:23:26.483 22:52:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:23:26.483 22:52:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:23:26.483 22:52:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:26.483 22:52:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:26.483 22:52:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:26.483 22:52:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:26.483 22:52:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:26.483 22:52:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:26.483 22:52:11 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:26.483 22:52:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:26.483 22:52:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:26.483 22:52:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:26.483 22:52:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:26.483 22:52:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:26.741 22:52:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:26.741 "name": "raid_bdev1", 00:23:26.741 "uuid": "0e259370-a6ac-4543-8bb9-4e4d607df812", 00:23:26.741 "strip_size_kb": 0, 00:23:26.741 "state": "online", 00:23:26.741 "raid_level": "raid1", 00:23:26.741 "superblock": true, 00:23:26.741 "num_base_bdevs": 4, 00:23:26.741 "num_base_bdevs_discovered": 4, 00:23:26.741 "num_base_bdevs_operational": 4, 00:23:26.741 "base_bdevs_list": [ 00:23:26.741 { 00:23:26.741 "name": "BaseBdev1", 00:23:26.741 "uuid": "0cadc670-55a9-506c-9a46-82c559c26900", 00:23:26.741 "is_configured": true, 00:23:26.741 "data_offset": 2048, 00:23:26.741 "data_size": 63488 00:23:26.741 }, 00:23:26.741 { 00:23:26.741 "name": "BaseBdev2", 00:23:26.741 "uuid": "47682d18-7e16-58c4-82ef-454f737ae6a2", 00:23:26.741 "is_configured": true, 00:23:26.741 "data_offset": 2048, 00:23:26.741 "data_size": 63488 00:23:26.741 }, 00:23:26.741 { 00:23:26.741 "name": "BaseBdev3", 00:23:26.741 "uuid": "a76da420-6d05-5a36-841e-5fb089524b99", 00:23:26.741 "is_configured": true, 00:23:26.741 "data_offset": 2048, 00:23:26.741 "data_size": 63488 00:23:26.741 }, 00:23:26.741 { 00:23:26.741 "name": "BaseBdev4", 00:23:26.741 "uuid": "84bdeec8-72ac-57e1-a970-c07a2e286307", 00:23:26.741 "is_configured": 
true, 00:23:26.741 "data_offset": 2048, 00:23:26.741 "data_size": 63488 00:23:26.741 } 00:23:26.741 ] 00:23:26.741 }' 00:23:26.741 22:52:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:26.741 22:52:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:27.307 22:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:27.564 [2024-07-15 22:52:12.238986] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:27.564 [2024-07-15 22:52:12.239033] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:27.564 [2024-07-15 22:52:12.242319] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:27.564 [2024-07-15 22:52:12.242357] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:27.564 [2024-07-15 22:52:12.242478] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:27.564 [2024-07-15 22:52:12.242490] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1954c20 name raid_bdev1, state offline 00:23:27.564 0 00:23:27.564 22:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2806164 00:23:27.564 22:52:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2806164 ']' 00:23:27.564 22:52:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2806164 00:23:27.564 22:52:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:23:27.564 22:52:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:27.564 22:52:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2806164 00:23:27.564 22:52:12 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:27.564 22:52:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:27.564 22:52:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2806164' 00:23:27.564 killing process with pid 2806164 00:23:27.564 22:52:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2806164 00:23:27.564 [2024-07-15 22:52:12.323298] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:27.564 22:52:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2806164 00:23:27.564 [2024-07-15 22:52:12.355026] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:27.823 22:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.RAk4r4BZZM 00:23:27.823 22:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:23:27.823 22:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:23:27.823 22:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:23:27.823 22:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:23:27.823 22:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:27.823 22:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:27.823 22:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:23:27.823 00:23:27.823 real 0m7.929s 00:23:27.823 user 0m12.751s 00:23:27.823 sys 0m1.390s 00:23:27.823 22:52:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:27.823 22:52:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:27.823 ************************************ 00:23:27.823 END TEST 
raid_read_error_test 00:23:27.823 ************************************ 00:23:27.823 22:52:12 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:27.823 22:52:12 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:23:27.823 22:52:12 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:23:27.823 22:52:12 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:27.823 22:52:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:27.823 ************************************ 00:23:27.823 START TEST raid_write_error_test 00:23:27.823 ************************************ 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 write 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.6biBPSroDI 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2807224 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2807224 /var/tmp/spdk-raid.sock 00:23:27.823 
22:52:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2807224 ']' 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:27.823 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:27.823 22:52:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:28.081 [2024-07-15 22:52:12.762382] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:23:28.081 [2024-07-15 22:52:12.762453] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2807224 ] 00:23:28.081 [2024-07-15 22:52:12.882739] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:28.339 [2024-07-15 22:52:12.995416] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:28.339 [2024-07-15 22:52:13.060408] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:28.339 [2024-07-15 22:52:13.060439] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:28.904 22:52:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:28.904 22:52:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:23:28.904 22:52:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:28.904 22:52:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:29.162 BaseBdev1_malloc 00:23:29.162 22:52:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:23:29.162 true 00:23:29.421 22:52:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:23:29.421 [2024-07-15 22:52:14.313948] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:23:29.421 [2024-07-15 22:52:14.313991] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:23:29.421 [2024-07-15 22:52:14.314010] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19320d0 00:23:29.421 [2024-07-15 22:52:14.314023] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:29.421 [2024-07-15 22:52:14.315907] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:29.421 [2024-07-15 22:52:14.315947] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:29.421 BaseBdev1 00:23:29.680 22:52:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:29.680 22:52:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:29.680 BaseBdev2_malloc 00:23:29.938 22:52:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:23:29.938 true 00:23:29.938 22:52:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:23:30.197 [2024-07-15 22:52:15.029627] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:23:30.197 [2024-07-15 22:52:15.029672] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:30.197 [2024-07-15 22:52:15.029693] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1936910 00:23:30.197 [2024-07-15 22:52:15.029705] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:30.197 [2024-07-15 22:52:15.031284] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:30.197 [2024-07-15 22:52:15.031314] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:30.197 BaseBdev2 00:23:30.197 22:52:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:30.197 22:52:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:30.456 BaseBdev3_malloc 00:23:30.456 22:52:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:23:30.713 true 00:23:30.713 22:52:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:23:30.971 [2024-07-15 22:52:15.756633] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:23:30.971 [2024-07-15 22:52:15.756677] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:30.971 [2024-07-15 22:52:15.756698] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1938bd0 00:23:30.971 [2024-07-15 22:52:15.756716] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:30.971 [2024-07-15 22:52:15.758250] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:30.971 [2024-07-15 22:52:15.758280] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:30.971 BaseBdev3 00:23:30.971 22:52:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:30.971 22:52:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:31.230 BaseBdev4_malloc 00:23:31.230 22:52:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:23:31.489 true 00:23:31.489 22:52:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:23:31.748 [2024-07-15 22:52:16.491132] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:23:31.748 [2024-07-15 22:52:16.491176] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:31.748 [2024-07-15 22:52:16.491197] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1939aa0 00:23:31.748 [2024-07-15 22:52:16.491209] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:31.748 [2024-07-15 22:52:16.492807] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:31.748 [2024-07-15 22:52:16.492837] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:31.748 BaseBdev4 00:23:31.748 22:52:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:23:32.007 [2024-07-15 22:52:16.727788] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:32.007 [2024-07-15 22:52:16.729199] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:32.007 [2024-07-15 22:52:16.729268] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:32.007 [2024-07-15 22:52:16.729329] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:32.007 [2024-07-15 22:52:16.729572] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1933c20 00:23:32.007 [2024-07-15 22:52:16.729583] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:32.007 [2024-07-15 22:52:16.729786] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1788260 00:23:32.007 [2024-07-15 22:52:16.729953] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1933c20 00:23:32.007 [2024-07-15 22:52:16.729964] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1933c20 00:23:32.007 [2024-07-15 22:52:16.730075] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:32.007 22:52:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:32.007 22:52:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:32.007 22:52:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:32.007 22:52:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:32.007 22:52:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:32.007 22:52:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:32.007 22:52:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:32.007 22:52:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:32.007 22:52:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:32.007 22:52:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:32.007 22:52:16 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.007 22:52:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:32.266 22:52:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:32.266 "name": "raid_bdev1", 00:23:32.266 "uuid": "b9df8007-5d72-4f49-babe-96b1ddf3d759", 00:23:32.266 "strip_size_kb": 0, 00:23:32.266 "state": "online", 00:23:32.266 "raid_level": "raid1", 00:23:32.266 "superblock": true, 00:23:32.266 "num_base_bdevs": 4, 00:23:32.266 "num_base_bdevs_discovered": 4, 00:23:32.266 "num_base_bdevs_operational": 4, 00:23:32.266 "base_bdevs_list": [ 00:23:32.266 { 00:23:32.266 "name": "BaseBdev1", 00:23:32.266 "uuid": "f1515704-d91e-54b3-9b83-ecf09eaaaa8f", 00:23:32.266 "is_configured": true, 00:23:32.266 "data_offset": 2048, 00:23:32.266 "data_size": 63488 00:23:32.266 }, 00:23:32.266 { 00:23:32.266 "name": "BaseBdev2", 00:23:32.266 "uuid": "e70f305a-7700-5eeb-8b87-56ab4dcc2e08", 00:23:32.266 "is_configured": true, 00:23:32.266 "data_offset": 2048, 00:23:32.266 "data_size": 63488 00:23:32.266 }, 00:23:32.266 { 00:23:32.266 "name": "BaseBdev3", 00:23:32.266 "uuid": "2c891460-29cd-56c5-82ef-2c6c44841da5", 00:23:32.266 "is_configured": true, 00:23:32.266 "data_offset": 2048, 00:23:32.266 "data_size": 63488 00:23:32.266 }, 00:23:32.266 { 00:23:32.266 "name": "BaseBdev4", 00:23:32.266 "uuid": "5fcf546a-7165-521d-87a2-d547c7b0b7a7", 00:23:32.266 "is_configured": true, 00:23:32.266 "data_offset": 2048, 00:23:32.266 "data_size": 63488 00:23:32.266 } 00:23:32.266 ] 00:23:32.266 }' 00:23:32.266 22:52:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:32.266 22:52:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:32.835 22:52:17 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:23:32.835 22:52:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:32.835 [2024-07-15 22:52:17.710686] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1787c60 00:23:33.771 22:52:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:23:34.031 [2024-07-15 22:52:18.835436] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:23:34.031 [2024-07-15 22:52:18.835498] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:34.031 [2024-07-15 22:52:18.835717] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1787c60 00:23:34.031 22:52:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:23:34.031 22:52:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:23:34.031 22:52:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:23:34.031 22:52:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:23:34.031 22:52:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:34.031 22:52:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:34.031 22:52:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:34.031 22:52:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:34.031 22:52:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:23:34.031 22:52:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:34.031 22:52:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:34.031 22:52:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:34.031 22:52:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:34.031 22:52:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:34.031 22:52:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:34.031 22:52:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:34.291 22:52:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:34.291 "name": "raid_bdev1", 00:23:34.291 "uuid": "b9df8007-5d72-4f49-babe-96b1ddf3d759", 00:23:34.291 "strip_size_kb": 0, 00:23:34.291 "state": "online", 00:23:34.291 "raid_level": "raid1", 00:23:34.291 "superblock": true, 00:23:34.291 "num_base_bdevs": 4, 00:23:34.291 "num_base_bdevs_discovered": 3, 00:23:34.291 "num_base_bdevs_operational": 3, 00:23:34.291 "base_bdevs_list": [ 00:23:34.291 { 00:23:34.291 "name": null, 00:23:34.291 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:34.291 "is_configured": false, 00:23:34.291 "data_offset": 2048, 00:23:34.291 "data_size": 63488 00:23:34.291 }, 00:23:34.291 { 00:23:34.291 "name": "BaseBdev2", 00:23:34.291 "uuid": "e70f305a-7700-5eeb-8b87-56ab4dcc2e08", 00:23:34.291 "is_configured": true, 00:23:34.291 "data_offset": 2048, 00:23:34.291 "data_size": 63488 00:23:34.291 }, 00:23:34.291 { 00:23:34.291 "name": "BaseBdev3", 00:23:34.291 "uuid": "2c891460-29cd-56c5-82ef-2c6c44841da5", 00:23:34.291 "is_configured": true, 00:23:34.291 "data_offset": 2048, 
00:23:34.291 "data_size": 63488 00:23:34.291 }, 00:23:34.291 { 00:23:34.291 "name": "BaseBdev4", 00:23:34.291 "uuid": "5fcf546a-7165-521d-87a2-d547c7b0b7a7", 00:23:34.291 "is_configured": true, 00:23:34.291 "data_offset": 2048, 00:23:34.291 "data_size": 63488 00:23:34.291 } 00:23:34.291 ] 00:23:34.291 }' 00:23:34.291 22:52:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:34.291 22:52:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:34.859 22:52:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:35.118 [2024-07-15 22:52:19.947349] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:35.118 [2024-07-15 22:52:19.947382] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:35.118 [2024-07-15 22:52:19.950531] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:35.118 [2024-07-15 22:52:19.950567] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:35.118 [2024-07-15 22:52:19.950664] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:35.118 [2024-07-15 22:52:19.950676] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1933c20 name raid_bdev1, state offline 00:23:35.118 0 00:23:35.118 22:52:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2807224 00:23:35.118 22:52:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2807224 ']' 00:23:35.118 22:52:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2807224 00:23:35.118 22:52:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:23:35.118 22:52:19 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:35.118 22:52:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2807224 00:23:35.118 22:52:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:35.118 22:52:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:35.118 22:52:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2807224' 00:23:35.118 killing process with pid 2807224 00:23:35.118 22:52:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2807224 00:23:35.118 [2024-07-15 22:52:20.015878] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:35.118 22:52:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2807224 00:23:35.378 [2024-07-15 22:52:20.046768] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:35.378 22:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.6biBPSroDI 00:23:35.378 22:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:23:35.378 22:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:23:35.378 22:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:23:35.378 22:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:23:35.378 22:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:35.378 22:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:35.378 22:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:23:35.378 00:23:35.378 real 0m7.590s 00:23:35.378 user 0m12.166s 00:23:35.378 sys 0m1.321s 00:23:35.378 22:52:20 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:23:35.378 22:52:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:35.378 ************************************ 00:23:35.378 END TEST raid_write_error_test 00:23:35.378 ************************************ 00:23:35.638 22:52:20 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:35.638 22:52:20 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:23:35.638 22:52:20 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:23:35.638 22:52:20 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:23:35.638 22:52:20 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:35.638 22:52:20 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:35.638 22:52:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:35.638 ************************************ 00:23:35.638 START TEST raid_rebuild_test 00:23:35.638 ************************************ 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false false true 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 
00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=2808371 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 2808371 /var/tmp/spdk-raid.sock 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:35.638 22:52:20 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 2808371 ']' 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:35.638 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:35.638 22:52:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:35.638 [2024-07-15 22:52:20.432567] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:23:35.638 [2024-07-15 22:52:20.432636] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2808371 ] 00:23:35.638 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:35.638 Zero copy mechanism will not be used. 
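The bdevperf invocation traced above passes `-o 3M -q 2`, and the harness immediately warns that an I/O size of 3145728 bytes exceeds the 65536-byte zero-copy threshold, so zero copy is disabled for the run. A minimal shell sketch of that size check follows; the threshold value is taken from the log line itself, not from SPDK source, and the variable names are illustrative only:

```shell
#!/usr/bin/env bash
# -o 3M expands to 3 * 1024 * 1024 = 3145728 bytes per I/O.
io_size=$((3 * 1024 * 1024))
# Threshold reported by bdevperf in the notice above (assumed from the log).
zcopy_threshold=65536

echo "io_size=${io_size}"
if (( io_size > zcopy_threshold )); then
    # Matches the logged behavior: "Zero copy mechanism will not be used."
    echo "zero copy disabled"
fi
```

Running this prints `io_size=3145728` followed by `zero copy disabled`, mirroring the notice emitted before the rebuild test starts.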
00:23:35.897 [2024-07-15 22:52:20.560395] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:35.897 [2024-07-15 22:52:20.664624] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:35.897 [2024-07-15 22:52:20.728789] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:35.897 [2024-07-15 22:52:20.728830] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:36.465 22:52:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:36.465 22:52:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:23:36.465 22:52:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:36.465 22:52:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:36.724 BaseBdev1_malloc 00:23:36.724 22:52:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:36.983 [2024-07-15 22:52:21.782231] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:36.983 [2024-07-15 22:52:21.782282] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:36.983 [2024-07-15 22:52:21.782305] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2027d40 00:23:36.983 [2024-07-15 22:52:21.782318] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:36.983 [2024-07-15 22:52:21.783973] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:36.983 [2024-07-15 22:52:21.784005] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:36.983 BaseBdev1 00:23:36.983 22:52:21 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:36.983 22:52:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:37.242 BaseBdev2_malloc 00:23:37.242 22:52:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:37.542 [2024-07-15 22:52:22.272328] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:37.542 [2024-07-15 22:52:22.272375] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:37.542 [2024-07-15 22:52:22.272396] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2028860 00:23:37.542 [2024-07-15 22:52:22.272409] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:37.542 [2024-07-15 22:52:22.273804] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:37.542 [2024-07-15 22:52:22.273833] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:37.542 BaseBdev2 00:23:37.542 22:52:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:37.810 spare_malloc 00:23:37.810 22:52:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:38.068 spare_delay 00:23:38.068 22:52:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b spare_delay -p spare 00:23:38.327 [2024-07-15 22:52:23.022948] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:38.327 [2024-07-15 22:52:23.022994] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:38.327 [2024-07-15 22:52:23.023014] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21d6ec0 00:23:38.327 [2024-07-15 22:52:23.023027] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:38.327 [2024-07-15 22:52:23.024473] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:38.327 [2024-07-15 22:52:23.024502] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:38.327 spare 00:23:38.327 22:52:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:38.586 [2024-07-15 22:52:23.271609] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:38.586 [2024-07-15 22:52:23.272799] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:38.586 [2024-07-15 22:52:23.272873] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21d8070 00:23:38.586 [2024-07-15 22:52:23.272884] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:23:38.586 [2024-07-15 22:52:23.273084] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21d1490 00:23:38.586 [2024-07-15 22:52:23.273218] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21d8070 00:23:38.586 [2024-07-15 22:52:23.273228] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21d8070 00:23:38.586 [2024-07-15 22:52:23.273334] bdev_raid.c: 331:raid_bdev_destroy_cb: 
*DEBUG*: raid_bdev_destroy_cb 00:23:38.586 22:52:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:38.586 22:52:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:38.586 22:52:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:38.586 22:52:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:38.586 22:52:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:38.586 22:52:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:38.586 22:52:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:38.586 22:52:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:38.586 22:52:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:38.586 22:52:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:38.586 22:52:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:38.586 22:52:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:38.843 22:52:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:38.843 "name": "raid_bdev1", 00:23:38.843 "uuid": "01191b7d-a08e-4d48-a194-2b1e8928c99e", 00:23:38.843 "strip_size_kb": 0, 00:23:38.843 "state": "online", 00:23:38.843 "raid_level": "raid1", 00:23:38.844 "superblock": false, 00:23:38.844 "num_base_bdevs": 2, 00:23:38.844 "num_base_bdevs_discovered": 2, 00:23:38.844 "num_base_bdevs_operational": 2, 00:23:38.844 "base_bdevs_list": [ 00:23:38.844 { 00:23:38.844 "name": "BaseBdev1", 00:23:38.844 "uuid": 
"6b24e4f3-b710-51fd-8f47-8c151a4f9532", 00:23:38.844 "is_configured": true, 00:23:38.844 "data_offset": 0, 00:23:38.844 "data_size": 65536 00:23:38.844 }, 00:23:38.844 { 00:23:38.844 "name": "BaseBdev2", 00:23:38.844 "uuid": "8a415ae5-53ed-5fe3-a691-48b8d021a9af", 00:23:38.844 "is_configured": true, 00:23:38.844 "data_offset": 0, 00:23:38.844 "data_size": 65536 00:23:38.844 } 00:23:38.844 ] 00:23:38.844 }' 00:23:38.844 22:52:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:38.844 22:52:23 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:39.410 22:52:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:39.410 22:52:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:39.669 [2024-07-15 22:52:24.386884] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:39.669 22:52:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:23:39.669 22:52:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.669 22:52:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:39.928 22:52:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:23:39.928 22:52:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:23:39.928 22:52:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:23:39.928 22:52:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:23:39.928 22:52:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:23:39.928 22:52:24 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:39.928 22:52:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:23:39.928 22:52:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:39.928 22:52:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:39.928 22:52:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:39.928 22:52:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:23:39.928 22:52:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:39.928 22:52:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:39.928 22:52:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:23:40.187 [2024-07-15 22:52:24.900044] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21d1490 00:23:40.187 /dev/nbd0 00:23:40.187 22:52:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:40.187 22:52:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:40.187 22:52:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:40.187 22:52:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:23:40.187 22:52:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:40.187 22:52:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:40.187 22:52:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:40.187 22:52:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:23:40.187 22:52:24 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:40.187 22:52:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:40.187 22:52:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:40.187 1+0 records in 00:23:40.187 1+0 records out 00:23:40.187 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000281353 s, 14.6 MB/s 00:23:40.187 22:52:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:40.187 22:52:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:23:40.187 22:52:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:40.187 22:52:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:40.187 22:52:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:23:40.187 22:52:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:40.187 22:52:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:40.187 22:52:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:23:40.187 22:52:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:23:40.187 22:52:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:23:46.749 65536+0 records in 00:23:46.749 65536+0 records out 00:23:46.749 33554432 bytes (34 MB, 32 MiB) copied, 5.85305 s, 5.7 MB/s 00:23:46.749 22:52:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:46.749 22:52:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:23:46.749 22:52:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:46.749 22:52:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:46.749 22:52:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:23:46.749 22:52:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:46.749 22:52:30 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:46.750 [2024-07-15 22:52:31.092624] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:46.750 22:52:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:46.750 22:52:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:46.750 22:52:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:46.750 22:52:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:46.750 22:52:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:46.750 22:52:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:46.750 22:52:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:46.750 22:52:31 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:46.750 22:52:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:46.750 [2024-07-15 22:52:31.333312] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:46.750 22:52:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:46.750 22:52:31 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:46.750 22:52:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:46.750 22:52:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:46.750 22:52:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:46.750 22:52:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:46.750 22:52:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:46.750 22:52:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:46.750 22:52:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:46.750 22:52:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:46.750 22:52:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:46.750 22:52:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:46.750 22:52:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:46.750 "name": "raid_bdev1", 00:23:46.750 "uuid": "01191b7d-a08e-4d48-a194-2b1e8928c99e", 00:23:46.750 "strip_size_kb": 0, 00:23:46.750 "state": "online", 00:23:46.750 "raid_level": "raid1", 00:23:46.750 "superblock": false, 00:23:46.750 "num_base_bdevs": 2, 00:23:46.750 "num_base_bdevs_discovered": 1, 00:23:46.750 "num_base_bdevs_operational": 1, 00:23:46.750 "base_bdevs_list": [ 00:23:46.750 { 00:23:46.750 "name": null, 00:23:46.750 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:46.750 "is_configured": false, 00:23:46.750 "data_offset": 0, 00:23:46.750 "data_size": 65536 00:23:46.750 }, 00:23:46.750 { 00:23:46.750 "name": "BaseBdev2", 
00:23:46.750 "uuid": "8a415ae5-53ed-5fe3-a691-48b8d021a9af", 00:23:46.750 "is_configured": true, 00:23:46.750 "data_offset": 0, 00:23:46.750 "data_size": 65536 00:23:46.750 } 00:23:46.750 ] 00:23:46.750 }' 00:23:46.750 22:52:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:46.750 22:52:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:47.317 22:52:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:47.575 [2024-07-15 22:52:32.335979] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:47.575 [2024-07-15 22:52:32.340937] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21d8880 00:23:47.575 [2024-07-15 22:52:32.343157] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:47.575 22:52:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:48.509 22:52:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:48.509 22:52:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:48.509 22:52:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:48.509 22:52:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:48.509 22:52:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:48.509 22:52:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:48.509 22:52:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:48.766 22:52:33 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:48.766 "name": "raid_bdev1", 00:23:48.766 "uuid": "01191b7d-a08e-4d48-a194-2b1e8928c99e", 00:23:48.766 "strip_size_kb": 0, 00:23:48.766 "state": "online", 00:23:48.766 "raid_level": "raid1", 00:23:48.766 "superblock": false, 00:23:48.766 "num_base_bdevs": 2, 00:23:48.766 "num_base_bdevs_discovered": 2, 00:23:48.766 "num_base_bdevs_operational": 2, 00:23:48.766 "process": { 00:23:48.766 "type": "rebuild", 00:23:48.766 "target": "spare", 00:23:48.766 "progress": { 00:23:48.766 "blocks": 24576, 00:23:48.766 "percent": 37 00:23:48.766 } 00:23:48.766 }, 00:23:48.766 "base_bdevs_list": [ 00:23:48.766 { 00:23:48.766 "name": "spare", 00:23:48.766 "uuid": "040983f1-93e9-5d39-a0d5-b4a2b36acac4", 00:23:48.766 "is_configured": true, 00:23:48.766 "data_offset": 0, 00:23:48.766 "data_size": 65536 00:23:48.766 }, 00:23:48.766 { 00:23:48.766 "name": "BaseBdev2", 00:23:48.766 "uuid": "8a415ae5-53ed-5fe3-a691-48b8d021a9af", 00:23:48.766 "is_configured": true, 00:23:48.766 "data_offset": 0, 00:23:48.766 "data_size": 65536 00:23:48.766 } 00:23:48.766 ] 00:23:48.766 }' 00:23:48.766 22:52:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:48.766 22:52:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:48.766 22:52:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:49.024 22:52:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:49.024 22:52:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:49.024 [2024-07-15 22:52:33.929405] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:49.282 [2024-07-15 22:52:33.955686] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev 
raid_bdev1: No such device 00:23:49.282 [2024-07-15 22:52:33.955731] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:49.282 [2024-07-15 22:52:33.955746] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:49.282 [2024-07-15 22:52:33.955755] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:49.282 22:52:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:49.282 22:52:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:49.282 22:52:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:49.282 22:52:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:49.282 22:52:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:49.282 22:52:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:49.282 22:52:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:49.282 22:52:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:49.282 22:52:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:49.282 22:52:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:49.282 22:52:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:49.282 22:52:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:49.540 22:52:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:49.540 "name": "raid_bdev1", 00:23:49.540 "uuid": "01191b7d-a08e-4d48-a194-2b1e8928c99e", 00:23:49.540 
"strip_size_kb": 0, 00:23:49.540 "state": "online", 00:23:49.540 "raid_level": "raid1", 00:23:49.540 "superblock": false, 00:23:49.540 "num_base_bdevs": 2, 00:23:49.540 "num_base_bdevs_discovered": 1, 00:23:49.540 "num_base_bdevs_operational": 1, 00:23:49.540 "base_bdevs_list": [ 00:23:49.540 { 00:23:49.540 "name": null, 00:23:49.540 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:49.540 "is_configured": false, 00:23:49.540 "data_offset": 0, 00:23:49.540 "data_size": 65536 00:23:49.540 }, 00:23:49.540 { 00:23:49.540 "name": "BaseBdev2", 00:23:49.540 "uuid": "8a415ae5-53ed-5fe3-a691-48b8d021a9af", 00:23:49.540 "is_configured": true, 00:23:49.540 "data_offset": 0, 00:23:49.540 "data_size": 65536 00:23:49.540 } 00:23:49.540 ] 00:23:49.540 }' 00:23:49.540 22:52:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:49.540 22:52:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:50.105 22:52:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:50.105 22:52:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:50.105 22:52:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:50.105 22:52:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:50.105 22:52:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:50.105 22:52:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:50.105 22:52:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:50.364 22:52:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:50.364 "name": "raid_bdev1", 00:23:50.364 "uuid": "01191b7d-a08e-4d48-a194-2b1e8928c99e", 
00:23:50.364 "strip_size_kb": 0, 00:23:50.364 "state": "online", 00:23:50.364 "raid_level": "raid1", 00:23:50.364 "superblock": false, 00:23:50.364 "num_base_bdevs": 2, 00:23:50.364 "num_base_bdevs_discovered": 1, 00:23:50.364 "num_base_bdevs_operational": 1, 00:23:50.364 "base_bdevs_list": [ 00:23:50.364 { 00:23:50.364 "name": null, 00:23:50.364 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:50.364 "is_configured": false, 00:23:50.364 "data_offset": 0, 00:23:50.364 "data_size": 65536 00:23:50.364 }, 00:23:50.364 { 00:23:50.364 "name": "BaseBdev2", 00:23:50.364 "uuid": "8a415ae5-53ed-5fe3-a691-48b8d021a9af", 00:23:50.364 "is_configured": true, 00:23:50.364 "data_offset": 0, 00:23:50.364 "data_size": 65536 00:23:50.364 } 00:23:50.364 ] 00:23:50.364 }' 00:23:50.364 22:52:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:50.364 22:52:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:50.364 22:52:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:50.364 22:52:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:50.364 22:52:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:50.621 [2024-07-15 22:52:35.395989] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:50.621 [2024-07-15 22:52:35.400917] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21d1490 00:23:50.621 [2024-07-15 22:52:35.402383] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:50.621 22:52:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:51.556 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 
rebuild spare 00:23:51.556 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:51.556 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:51.556 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:51.556 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:51.556 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:51.556 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:51.816 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:51.816 "name": "raid_bdev1", 00:23:51.816 "uuid": "01191b7d-a08e-4d48-a194-2b1e8928c99e", 00:23:51.816 "strip_size_kb": 0, 00:23:51.816 "state": "online", 00:23:51.816 "raid_level": "raid1", 00:23:51.816 "superblock": false, 00:23:51.816 "num_base_bdevs": 2, 00:23:51.816 "num_base_bdevs_discovered": 2, 00:23:51.816 "num_base_bdevs_operational": 2, 00:23:51.816 "process": { 00:23:51.816 "type": "rebuild", 00:23:51.816 "target": "spare", 00:23:51.816 "progress": { 00:23:51.816 "blocks": 24576, 00:23:51.816 "percent": 37 00:23:51.816 } 00:23:51.816 }, 00:23:51.816 "base_bdevs_list": [ 00:23:51.816 { 00:23:51.816 "name": "spare", 00:23:51.816 "uuid": "040983f1-93e9-5d39-a0d5-b4a2b36acac4", 00:23:51.816 "is_configured": true, 00:23:51.816 "data_offset": 0, 00:23:51.816 "data_size": 65536 00:23:51.816 }, 00:23:51.816 { 00:23:51.816 "name": "BaseBdev2", 00:23:51.816 "uuid": "8a415ae5-53ed-5fe3-a691-48b8d021a9af", 00:23:51.816 "is_configured": true, 00:23:51.816 "data_offset": 0, 00:23:51.816 "data_size": 65536 00:23:51.816 } 00:23:51.816 ] 00:23:51.816 }' 00:23:51.816 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # 
jq -r '.process.type // "none"' 00:23:51.816 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:51.816 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:52.074 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:52.074 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:23:52.074 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:23:52.074 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:52.074 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:23:52.074 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=804 00:23:52.074 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:52.074 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:52.074 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:52.074 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:52.074 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:52.074 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:52.074 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:52.074 22:52:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:52.332 22:52:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:52.332 "name": "raid_bdev1", 00:23:52.332 
"uuid": "01191b7d-a08e-4d48-a194-2b1e8928c99e", 00:23:52.332 "strip_size_kb": 0, 00:23:52.332 "state": "online", 00:23:52.332 "raid_level": "raid1", 00:23:52.332 "superblock": false, 00:23:52.332 "num_base_bdevs": 2, 00:23:52.332 "num_base_bdevs_discovered": 2, 00:23:52.332 "num_base_bdevs_operational": 2, 00:23:52.332 "process": { 00:23:52.332 "type": "rebuild", 00:23:52.332 "target": "spare", 00:23:52.332 "progress": { 00:23:52.332 "blocks": 30720, 00:23:52.332 "percent": 46 00:23:52.332 } 00:23:52.332 }, 00:23:52.332 "base_bdevs_list": [ 00:23:52.332 { 00:23:52.332 "name": "spare", 00:23:52.332 "uuid": "040983f1-93e9-5d39-a0d5-b4a2b36acac4", 00:23:52.332 "is_configured": true, 00:23:52.332 "data_offset": 0, 00:23:52.332 "data_size": 65536 00:23:52.332 }, 00:23:52.332 { 00:23:52.332 "name": "BaseBdev2", 00:23:52.332 "uuid": "8a415ae5-53ed-5fe3-a691-48b8d021a9af", 00:23:52.332 "is_configured": true, 00:23:52.332 "data_offset": 0, 00:23:52.332 "data_size": 65536 00:23:52.332 } 00:23:52.332 ] 00:23:52.332 }' 00:23:52.332 22:52:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:52.332 22:52:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:52.332 22:52:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:52.332 22:52:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:52.332 22:52:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:53.267 22:52:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:53.267 22:52:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:53.267 22:52:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:53.267 22:52:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:23:53.267 22:52:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:53.267 22:52:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:53.267 22:52:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:53.267 22:52:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:53.525 22:52:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:53.525 "name": "raid_bdev1", 00:23:53.525 "uuid": "01191b7d-a08e-4d48-a194-2b1e8928c99e", 00:23:53.525 "strip_size_kb": 0, 00:23:53.525 "state": "online", 00:23:53.525 "raid_level": "raid1", 00:23:53.525 "superblock": false, 00:23:53.525 "num_base_bdevs": 2, 00:23:53.525 "num_base_bdevs_discovered": 2, 00:23:53.525 "num_base_bdevs_operational": 2, 00:23:53.525 "process": { 00:23:53.525 "type": "rebuild", 00:23:53.525 "target": "spare", 00:23:53.525 "progress": { 00:23:53.525 "blocks": 59392, 00:23:53.525 "percent": 90 00:23:53.525 } 00:23:53.525 }, 00:23:53.525 "base_bdevs_list": [ 00:23:53.525 { 00:23:53.525 "name": "spare", 00:23:53.525 "uuid": "040983f1-93e9-5d39-a0d5-b4a2b36acac4", 00:23:53.525 "is_configured": true, 00:23:53.525 "data_offset": 0, 00:23:53.525 "data_size": 65536 00:23:53.525 }, 00:23:53.525 { 00:23:53.525 "name": "BaseBdev2", 00:23:53.525 "uuid": "8a415ae5-53ed-5fe3-a691-48b8d021a9af", 00:23:53.525 "is_configured": true, 00:23:53.525 "data_offset": 0, 00:23:53.525 "data_size": 65536 00:23:53.525 } 00:23:53.525 ] 00:23:53.525 }' 00:23:53.525 22:52:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:53.525 22:52:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:53.525 22:52:38 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:53.783 22:52:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:53.783 22:52:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:53.783 [2024-07-15 22:52:38.626970] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:53.783 [2024-07-15 22:52:38.627034] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:53.783 [2024-07-15 22:52:38.627069] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:54.720 22:52:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:54.720 22:52:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:54.720 22:52:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:54.720 22:52:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:54.720 22:52:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:54.720 22:52:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:54.720 22:52:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:54.720 22:52:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:54.979 22:52:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:54.979 "name": "raid_bdev1", 00:23:54.979 "uuid": "01191b7d-a08e-4d48-a194-2b1e8928c99e", 00:23:54.979 "strip_size_kb": 0, 00:23:54.979 "state": "online", 00:23:54.979 "raid_level": "raid1", 00:23:54.979 "superblock": false, 00:23:54.979 "num_base_bdevs": 2, 00:23:54.979 
"num_base_bdevs_discovered": 2, 00:23:54.979 "num_base_bdevs_operational": 2, 00:23:54.979 "base_bdevs_list": [ 00:23:54.979 { 00:23:54.979 "name": "spare", 00:23:54.979 "uuid": "040983f1-93e9-5d39-a0d5-b4a2b36acac4", 00:23:54.979 "is_configured": true, 00:23:54.979 "data_offset": 0, 00:23:54.979 "data_size": 65536 00:23:54.979 }, 00:23:54.979 { 00:23:54.979 "name": "BaseBdev2", 00:23:54.979 "uuid": "8a415ae5-53ed-5fe3-a691-48b8d021a9af", 00:23:54.979 "is_configured": true, 00:23:54.979 "data_offset": 0, 00:23:54.979 "data_size": 65536 00:23:54.979 } 00:23:54.979 ] 00:23:54.979 }' 00:23:54.979 22:52:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:54.979 22:52:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:54.979 22:52:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:54.979 22:52:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:54.979 22:52:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:23:54.979 22:52:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:54.979 22:52:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:54.979 22:52:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:54.979 22:52:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:54.979 22:52:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:54.979 22:52:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:54.979 22:52:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:55.238 22:52:40 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:55.238 "name": "raid_bdev1", 00:23:55.238 "uuid": "01191b7d-a08e-4d48-a194-2b1e8928c99e", 00:23:55.238 "strip_size_kb": 0, 00:23:55.238 "state": "online", 00:23:55.238 "raid_level": "raid1", 00:23:55.238 "superblock": false, 00:23:55.238 "num_base_bdevs": 2, 00:23:55.238 "num_base_bdevs_discovered": 2, 00:23:55.238 "num_base_bdevs_operational": 2, 00:23:55.238 "base_bdevs_list": [ 00:23:55.238 { 00:23:55.238 "name": "spare", 00:23:55.238 "uuid": "040983f1-93e9-5d39-a0d5-b4a2b36acac4", 00:23:55.238 "is_configured": true, 00:23:55.238 "data_offset": 0, 00:23:55.238 "data_size": 65536 00:23:55.238 }, 00:23:55.238 { 00:23:55.238 "name": "BaseBdev2", 00:23:55.238 "uuid": "8a415ae5-53ed-5fe3-a691-48b8d021a9af", 00:23:55.238 "is_configured": true, 00:23:55.238 "data_offset": 0, 00:23:55.238 "data_size": 65536 00:23:55.238 } 00:23:55.238 ] 00:23:55.238 }' 00:23:55.238 22:52:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:55.238 22:52:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:55.238 22:52:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:55.496 22:52:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:55.496 22:52:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:55.496 22:52:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:55.496 22:52:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:55.496 22:52:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:55.496 22:52:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:55.496 22:52:40 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:55.496 22:52:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:55.496 22:52:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:55.496 22:52:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:55.496 22:52:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:55.496 22:52:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:55.496 22:52:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:55.755 22:52:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:55.755 "name": "raid_bdev1", 00:23:55.755 "uuid": "01191b7d-a08e-4d48-a194-2b1e8928c99e", 00:23:55.755 "strip_size_kb": 0, 00:23:55.755 "state": "online", 00:23:55.755 "raid_level": "raid1", 00:23:55.755 "superblock": false, 00:23:55.755 "num_base_bdevs": 2, 00:23:55.755 "num_base_bdevs_discovered": 2, 00:23:55.755 "num_base_bdevs_operational": 2, 00:23:55.755 "base_bdevs_list": [ 00:23:55.755 { 00:23:55.755 "name": "spare", 00:23:55.755 "uuid": "040983f1-93e9-5d39-a0d5-b4a2b36acac4", 00:23:55.755 "is_configured": true, 00:23:55.755 "data_offset": 0, 00:23:55.755 "data_size": 65536 00:23:55.755 }, 00:23:55.755 { 00:23:55.755 "name": "BaseBdev2", 00:23:55.755 "uuid": "8a415ae5-53ed-5fe3-a691-48b8d021a9af", 00:23:55.755 "is_configured": true, 00:23:55.755 "data_offset": 0, 00:23:55.755 "data_size": 65536 00:23:55.755 } 00:23:55.755 ] 00:23:55.755 }' 00:23:55.755 22:52:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:55.755 22:52:40 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:56.355 22:52:41 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:56.615 [2024-07-15 22:52:41.251084] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:56.615 [2024-07-15 22:52:41.251113] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:56.615 [2024-07-15 22:52:41.251177] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:56.615 [2024-07-15 22:52:41.251234] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:56.615 [2024-07-15 22:52:41.251246] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21d8070 name raid_bdev1, state offline 00:23:56.615 22:52:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:56.615 22:52:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:23:56.874 22:52:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:56.874 22:52:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:56.874 22:52:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:23:56.874 22:52:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:23:56.874 22:52:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:56.874 22:52:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:23:56.874 22:52:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:56.874 22:52:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:23:56.874 22:52:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list
00:23:56.874 22:52:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i
00:23:56.874 22:52:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:23:56.874 22:52:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:23:56.874 22:52:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0
00:23:56.874 /dev/nbd0
00:23:57.133 22:52:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:23:57.133 22:52:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:23:57.133 22:52:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:23:57.133 22:52:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i
00:23:57.133 22:52:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:23:57.133 22:52:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:23:57.133 22:52:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:23:57.133 22:52:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break
00:23:57.133 22:52:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:23:57.133 22:52:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:23:57.133 22:52:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:23:57.133 1+0 records in
00:23:57.133 1+0 records out
00:23:57.133 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000237306 s, 17.3 MB/s
00:23:57.133 22:52:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:23:57.133 22:52:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096
00:23:57.133 22:52:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:23:57.133 22:52:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:23:57.133 22:52:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0
00:23:57.133 22:52:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:23:57.133 22:52:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:23:57.133 22:52:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1
00:23:57.392 /dev/nbd1
00:23:57.392 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:23:57.392 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:23:57.392 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:23:57.392 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i
00:23:57.392 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:23:57.392 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:23:57.392 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:23:57.392 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break
00:23:57.392 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:23:57.392 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:23:57.392 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:23:57.392 1+0 records in
00:23:57.392 1+0 records out
00:23:57.392 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000332836 s, 12.3 MB/s
00:23:57.393 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:23:57.393 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096
00:23:57.393 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:23:57.393 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:23:57.393 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0
00:23:57.393 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:23:57.393 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:23:57.393 22:52:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1
00:23:57.393 22:52:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1'
00:23:57.393 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:23:57.393 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:23:57.393 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list
00:23:57.393 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i
00:23:57.393 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:23:57.393 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0
00:23:57.650 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:23:57.650 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:23:57.650 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:23:57.650 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:23:57.650 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:23:57.650 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:23:57.650 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break
00:23:57.650 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0
00:23:57.650 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:23:57.650 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1
00:23:57.908 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:23:57.908 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:23:57.908 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:23:57.908 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:23:57.908 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:23:57.908 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:23:57.908 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break
00:23:57.908 22:52:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0
00:23:57.908 22:52:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']'
00:23:57.908 22:52:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 2808371
00:23:57.908 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 2808371 ']'
00:23:57.908 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 2808371
00:23:57.908 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname
00:23:57.908 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:23:57.908 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2808371
00:23:57.908 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:23:57.908 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:23:57.908 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2808371'
00:23:57.909 killing process with pid 2808371
22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 2808371
00:23:57.909 Received shutdown signal, test time was about 60.000000 seconds
00:23:57.909
00:23:57.909 Latency(us)
00:23:57.909 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:23:57.909 ===================================================================================================================
00:23:57.909 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00
00:23:57.909 [2024-07-15 22:52:42.737776] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:23:57.909 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 2808371
00:23:57.909 [2024-07-15 22:52:42.766023] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:23:58.167 22:52:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0
00:23:58.167
00:23:58.167 real 0m22.629s
00:23:58.167 user 0m29.866s
00:23:58.167 sys 0m5.375s
00:23:58.167 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable
00:23:58.167 22:52:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x
00:23:58.167 ************************************
00:23:58.167 END TEST raid_rebuild_test
00:23:58.167 ************************************
00:23:58.167 22:52:43 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:23:58.167 22:52:43 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true
00:23:58.167 22:52:43 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']'
00:23:58.167 22:52:43 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:23:58.167 22:52:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:23:58.425 ************************************
00:23:58.425 START TEST raid_rebuild_test_sb
00:23:58.425 ************************************
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 ))
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs ))
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ ))
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs ))
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ ))
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs ))
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2')
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']'
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']'
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s'
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=2811444
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 2811444 /var/tmp/spdk-raid.sock
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2811444 ']'
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable
00:23:58.425 22:52:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x
00:23:58.425 [2024-07-15 22:52:43.154069] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
00:23:58.425 [2024-07-15 22:52:43.154142] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2811444 ]
00:23:58.425 I/O size of 3145728 is greater than zero copy threshold (65536).
00:23:58.425 Zero copy mechanism will not be used.
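The `waitforlisten 2811444 /var/tmp/spdk-raid.sock` step above (note `max_retries=100` in the trace) blocks until the freshly launched bdevperf app creates its RPC socket, so that later `rpc.py` calls do not race the startup. A minimal sketch of that polling pattern, under stated assumptions — the function name, retry interval, and the bare `-S` socket check are illustrative, not the exact `autotest_common.sh` implementation, which also verifies the pid and probes `rpc.py`:

```shell
# Illustrative sketch: block until a UNIX-domain RPC socket appears.
# Assumed helper name; the real waitforlisten does more (pid liveness,
# an actual rpc.py probe), but the retry loop has this shape.
wait_for_rpc_sock() {
    local sock_path=$1
    local max_retries=${2:-100}
    local i
    for (( i = 0; i < max_retries; i++ )); do
        # -S tests specifically for a socket file
        if [ -S "$sock_path" ]; then
            return 0
        fi
        sleep 0.1
    done
    echo "timed out waiting for $sock_path" >&2
    return 1
}
```

In this log's setting it would be invoked as `wait_for_rpc_sock /var/tmp/spdk-raid.sock` before the first `bdev_malloc_create` RPC.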
00:23:58.425 [2024-07-15 22:52:43.285853] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:23:58.684 [2024-07-15 22:52:43.394904] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:23:58.684 [2024-07-15 22:52:43.451478] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:23:58.684 [2024-07-15 22:52:43.451514] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:23:59.249 22:52:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:23:59.249 22:52:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0
00:23:59.249 22:52:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}"
00:23:59.249 22:52:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
00:23:59.507 BaseBdev1_malloc
00:23:59.507 22:52:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
00:23:59.507 [2024-07-15 22:52:44.375756] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc
00:23:59.507 [2024-07-15 22:52:44.375810] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:23:59.507 [2024-07-15 22:52:44.375833] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b13d40
00:23:59.507 [2024-07-15 22:52:44.375847] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:23:59.507 [2024-07-15 22:52:44.377587] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:23:59.507 [2024-07-15 22:52:44.377618] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1
00:23:59.507 BaseBdev1
00:23:59.507 22:52:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}"
00:23:59.507 22:52:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc
00:23:59.765 BaseBdev2_malloc
00:23:59.765 22:52:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2
00:24:00.023 [2024-07-15 22:52:44.890124] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc
00:24:00.023 [2024-07-15 22:52:44.890179] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:24:00.023 [2024-07-15 22:52:44.890204] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b14860
00:24:00.024 [2024-07-15 22:52:44.890216] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:24:00.024 [2024-07-15 22:52:44.891805] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:24:00.024 [2024-07-15 22:52:44.891836] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2
00:24:00.024 BaseBdev2
00:24:00.024 22:52:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc
00:24:00.280 spare_malloc
00:24:00.280 22:52:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
00:24:00.537 spare_delay
00:24:00.537 22:52:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare
00:24:00.795 [2024-07-15 22:52:45.613908] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay
00:24:00.795 [2024-07-15 22:52:45.613969] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:24:00.795 [2024-07-15 22:52:45.613990] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cc2ec0
00:24:00.795 [2024-07-15 22:52:45.614003] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:24:00.795 [2024-07-15 22:52:45.615606] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:24:00.795 [2024-07-15 22:52:45.615637] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare
00:24:00.795 spare
00:24:00.795 22:52:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1
00:24:01.053 [2024-07-15 22:52:45.854566] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:24:01.053 [2024-07-15 22:52:45.855876] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:24:01.053 [2024-07-15 22:52:45.856051] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cc4070
00:24:01.053 [2024-07-15 22:52:45.856065] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512
00:24:01.053 [2024-07-15 22:52:45.856268] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cbd490
00:24:01.053 [2024-07-15 22:52:45.856411] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cc4070
00:24:01.053 [2024-07-15 22:52:45.856421] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1cc4070
00:24:01.053 [2024-07-15 22:52:45.856520] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:24:01.053 22:52:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2
00:24:01.053 22:52:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:24:01.053 22:52:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:24:01.053 22:52:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:24:01.053 22:52:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:24:01.053 22:52:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:24:01.053 22:52:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:24:01.053 22:52:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:24:01.053 22:52:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:24:01.053 22:52:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:24:01.053 22:52:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:01.053 22:52:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:24:01.310 22:52:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:24:01.310 "name": "raid_bdev1",
00:24:01.310 "uuid": "1de36863-1302-45e6-85e3-2086342b90cc",
00:24:01.310 "strip_size_kb": 0,
00:24:01.310 "state": "online",
00:24:01.310 "raid_level": "raid1",
00:24:01.310 "superblock": true,
00:24:01.310 "num_base_bdevs": 2,
00:24:01.310 "num_base_bdevs_discovered": 2,
00:24:01.310 "num_base_bdevs_operational": 2,
00:24:01.310 "base_bdevs_list": [
00:24:01.310 {
00:24:01.310 "name": "BaseBdev1",
00:24:01.310 "uuid": "bbb17d2c-e065-5a7b-af4c-0e00bf0e0c94",
00:24:01.310 "is_configured": true,
00:24:01.310 "data_offset": 2048,
00:24:01.310 "data_size": 63488
00:24:01.310 },
00:24:01.310 {
00:24:01.310 "name": "BaseBdev2",
00:24:01.311 "uuid": "7793f817-f653-5510-905c-812e5d643ffd",
00:24:01.311 "is_configured": true,
00:24:01.311 "data_offset": 2048,
00:24:01.311 "data_size": 63488
00:24:01.311 }
00:24:01.311 ]
00:24:01.311 }'
00:24:01.311 22:52:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:24:01.311 22:52:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x
00:24:01.874 22:52:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:24:01.874 22:52:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks'
00:24:02.131 [2024-07-15 22:52:46.945683] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:24:02.131 22:52:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488
00:24:02.131 22:52:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:02.131 22:52:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset'
00:24:02.388 22:52:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048
00:24:02.388 22:52:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']'
00:24:02.388 22:52:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']'
00:24:02.388 22:52:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size
00:24:02.388 22:52:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0
00:24:02.388 22:52:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:24:02.388 22:52:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1')
00:24:02.388 22:52:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list
00:24:02.388 22:52:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0')
00:24:02.388 22:52:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list
00:24:02.388 22:52:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i
00:24:02.388 22:52:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:24:02.388 22:52:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:24:02.388 22:52:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0
00:24:02.645 [2024-07-15 22:52:47.442798] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cbd490
00:24:02.645 /dev/nbd0
00:24:02.645 22:52:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:24:02.645 22:52:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:24:02.645 22:52:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:24:02.645 22:52:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i
00:24:02.645 22:52:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:24:02.645 22:52:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:24:02.645 22:52:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:24:02.645 22:52:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break
00:24:02.645 22:52:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:24:02.645 22:52:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:24:02.645 22:52:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:24:02.645 1+0 records in
00:24:02.645 1+0 records out
00:24:02.645 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254338 s, 16.1 MB/s
00:24:02.645 22:52:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:24:02.645 22:52:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096
00:24:02.645 22:52:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:24:02.645 22:52:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:24:02.645 22:52:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0
00:24:02.645 22:52:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:24:02.645 22:52:47 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:24:02.645 22:52:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']'
00:24:02.645 22:52:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1
00:24:02.645 22:52:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct
00:24:09.198 63488+0 records in
00:24:09.198 63488+0 records out
00:24:09.198 32505856 bytes (33 MB, 31 MiB) copied, 6.26318 s, 5.2 MB/s
00:24:09.198 22:52:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0
00:24:09.198 22:52:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:24:09.198 22:52:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0')
00:24:09.198 22:52:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list
00:24:09.198 22:52:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i
00:24:09.198 22:52:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:24:09.198 22:52:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0
00:24:09.198 [2024-07-15 22:52:53.974530] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:24:09.198 22:52:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:24:09.198 22:52:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:24:09.198 22:52:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:24:09.198 22:52:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:24:09.198 22:52:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:24:09.198 22:52:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:24:09.198 22:52:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break
00:24:09.198 22:52:53 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0
00:24:09.198 22:52:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1
00:24:09.456 [2024-07-15 22:52:54.155061] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:24:09.456 22:52:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:24:09.456 22:52:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:24:09.456 22:52:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:24:09.456 22:52:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:24:09.456 22:52:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:24:09.456 22:52:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:24:09.456 22:52:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:24:09.456 22:52:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:24:09.456 22:52:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:24:09.456 22:52:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:24:09.456 22:52:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:09.456 22:52:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:24:09.715 22:52:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:24:09.715 "name": "raid_bdev1",
00:24:09.715 "uuid": "1de36863-1302-45e6-85e3-2086342b90cc",
00:24:09.715 "strip_size_kb": 0,
00:24:09.715 "state": "online",
00:24:09.715 "raid_level": "raid1",
00:24:09.715 "superblock": true,
00:24:09.715 "num_base_bdevs": 2,
00:24:09.715 "num_base_bdevs_discovered": 1,
00:24:09.715 "num_base_bdevs_operational": 1,
00:24:09.715 "base_bdevs_list": [
00:24:09.715 {
00:24:09.715 "name": null,
00:24:09.715 "uuid": "00000000-0000-0000-0000-000000000000",
00:24:09.715 "is_configured": false,
00:24:09.715 "data_offset": 2048,
00:24:09.715 "data_size": 63488
00:24:09.715 },
00:24:09.715 {
00:24:09.715 "name": "BaseBdev2",
00:24:09.715 "uuid": "7793f817-f653-5510-905c-812e5d643ffd",
00:24:09.715 "is_configured": true,
00:24:09.715 "data_offset": 2048,
00:24:09.715 "data_size": 63488
00:24:09.715 }
00:24:09.715 ]
00:24:09.715 }'
00:24:09.715 22:52:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:24:09.715 22:52:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x
00:24:10.281 22:52:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
00:24:10.281 [2024-07-15 22:52:55.109589] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:24:10.281 [2024-07-15 22:52:55.114614] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cc3ce0
00:24:10.281 [2024-07-15 22:52:55.116835] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:24:10.281 22:52:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1
00:24:11.659 22:52:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:24:11.659 22:52:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:24:11.659 22:52:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:24:11.659 22:52:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare
00:24:11.659 22:52:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
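The `verify_raid_bdev_state` and `verify_raid_bdev_process` checks running through this log all reduce to the same move: pull one bdev's record out of `bdev_raid_get_bdevs all` with `jq` and compare individual fields against the expected values. A hedged sketch of that jq pattern — the `raid_bdev_field` helper name is an illustrative assumption, while the JSON shape matches the `raid_bdev_info` blocks captured above:

```shell
# Extract one top-level field for a named raid bdev from the JSON array
# that `rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all` emits.
# Reads the JSON on stdin so it can be piped straight from rpc.py.
raid_bdev_field() {
    local bdev_name=$1 field=$2
    jq -r --arg n "$bdev_name" --arg f "$field" \
        '.[] | select(.name == $n) | .[$f]'
}
```

Piping the degraded-array JSON above through `raid_bdev_field raid_bdev1 state` would print `online`, and `raid_bdev_field raid_bdev1 num_base_bdevs_discovered` would print `1`, which is exactly what the helper compares against its expected-state arguments.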
00:24:11.659 22:52:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.659 22:52:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:11.659 22:52:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:11.659 "name": "raid_bdev1", 00:24:11.659 "uuid": "1de36863-1302-45e6-85e3-2086342b90cc", 00:24:11.659 "strip_size_kb": 0, 00:24:11.659 "state": "online", 00:24:11.659 "raid_level": "raid1", 00:24:11.659 "superblock": true, 00:24:11.659 "num_base_bdevs": 2, 00:24:11.659 "num_base_bdevs_discovered": 2, 00:24:11.659 "num_base_bdevs_operational": 2, 00:24:11.659 "process": { 00:24:11.659 "type": "rebuild", 00:24:11.659 "target": "spare", 00:24:11.659 "progress": { 00:24:11.659 "blocks": 22528, 00:24:11.659 "percent": 35 00:24:11.659 } 00:24:11.659 }, 00:24:11.659 "base_bdevs_list": [ 00:24:11.659 { 00:24:11.659 "name": "spare", 00:24:11.659 "uuid": "218e4c59-179f-5f6b-951e-50365e59cabc", 00:24:11.659 "is_configured": true, 00:24:11.659 "data_offset": 2048, 00:24:11.659 "data_size": 63488 00:24:11.659 }, 00:24:11.659 { 00:24:11.659 "name": "BaseBdev2", 00:24:11.659 "uuid": "7793f817-f653-5510-905c-812e5d643ffd", 00:24:11.659 "is_configured": true, 00:24:11.659 "data_offset": 2048, 00:24:11.659 "data_size": 63488 00:24:11.659 } 00:24:11.659 ] 00:24:11.659 }' 00:24:11.659 22:52:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:11.660 22:52:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:11.660 22:52:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:11.660 22:52:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:11.660 22:52:56 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:11.919 [2024-07-15 22:52:56.670619] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:11.919 [2024-07-15 22:52:56.729558] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:11.919 [2024-07-15 22:52:56.729604] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:11.919 [2024-07-15 22:52:56.729620] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:11.919 [2024-07-15 22:52:56.729628] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:11.919 22:52:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:11.919 22:52:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:11.919 22:52:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:11.919 22:52:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:11.919 22:52:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:11.919 22:52:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:11.919 22:52:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:11.919 22:52:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:11.919 22:52:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:11.919 22:52:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:11.919 22:52:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.919 22:52:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:12.177 22:52:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:12.177 "name": "raid_bdev1", 00:24:12.177 "uuid": "1de36863-1302-45e6-85e3-2086342b90cc", 00:24:12.177 "strip_size_kb": 0, 00:24:12.177 "state": "online", 00:24:12.177 "raid_level": "raid1", 00:24:12.177 "superblock": true, 00:24:12.177 "num_base_bdevs": 2, 00:24:12.177 "num_base_bdevs_discovered": 1, 00:24:12.177 "num_base_bdevs_operational": 1, 00:24:12.177 "base_bdevs_list": [ 00:24:12.177 { 00:24:12.177 "name": null, 00:24:12.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:12.177 "is_configured": false, 00:24:12.177 "data_offset": 2048, 00:24:12.177 "data_size": 63488 00:24:12.177 }, 00:24:12.177 { 00:24:12.177 "name": "BaseBdev2", 00:24:12.177 "uuid": "7793f817-f653-5510-905c-812e5d643ffd", 00:24:12.177 "is_configured": true, 00:24:12.177 "data_offset": 2048, 00:24:12.177 "data_size": 63488 00:24:12.177 } 00:24:12.177 ] 00:24:12.177 }' 00:24:12.177 22:52:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:12.177 22:52:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:13.113 22:52:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:13.113 22:52:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:13.113 22:52:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:13.113 22:52:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:13.113 22:52:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:13.113 22:52:57 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:13.113 22:52:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:13.373 22:52:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:13.373 "name": "raid_bdev1", 00:24:13.373 "uuid": "1de36863-1302-45e6-85e3-2086342b90cc", 00:24:13.373 "strip_size_kb": 0, 00:24:13.373 "state": "online", 00:24:13.373 "raid_level": "raid1", 00:24:13.373 "superblock": true, 00:24:13.373 "num_base_bdevs": 2, 00:24:13.373 "num_base_bdevs_discovered": 1, 00:24:13.373 "num_base_bdevs_operational": 1, 00:24:13.373 "base_bdevs_list": [ 00:24:13.373 { 00:24:13.373 "name": null, 00:24:13.373 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:13.373 "is_configured": false, 00:24:13.373 "data_offset": 2048, 00:24:13.373 "data_size": 63488 00:24:13.373 }, 00:24:13.373 { 00:24:13.373 "name": "BaseBdev2", 00:24:13.373 "uuid": "7793f817-f653-5510-905c-812e5d643ffd", 00:24:13.373 "is_configured": true, 00:24:13.373 "data_offset": 2048, 00:24:13.373 "data_size": 63488 00:24:13.373 } 00:24:13.373 ] 00:24:13.373 }' 00:24:13.373 22:52:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:13.373 22:52:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:13.373 22:52:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:13.373 22:52:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:13.373 22:52:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:13.632 [2024-07-15 22:52:58.450668] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:13.632 [2024-07-15 22:52:58.456009] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cc3ce0 00:24:13.632 [2024-07-15 22:52:58.457496] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:13.632 22:52:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:15.068 22:52:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:15.068 22:52:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:15.068 22:52:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:15.068 22:52:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:15.068 22:52:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:15.068 22:52:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:15.068 22:52:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:15.326 22:52:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:15.326 "name": "raid_bdev1", 00:24:15.326 "uuid": "1de36863-1302-45e6-85e3-2086342b90cc", 00:24:15.326 "strip_size_kb": 0, 00:24:15.326 "state": "online", 00:24:15.326 "raid_level": "raid1", 00:24:15.326 "superblock": true, 00:24:15.326 "num_base_bdevs": 2, 00:24:15.326 "num_base_bdevs_discovered": 2, 00:24:15.326 "num_base_bdevs_operational": 2, 00:24:15.326 "process": { 00:24:15.326 "type": "rebuild", 00:24:15.326 "target": "spare", 00:24:15.326 "progress": { 00:24:15.326 "blocks": 28672, 00:24:15.326 "percent": 45 00:24:15.326 } 00:24:15.326 }, 00:24:15.326 
"base_bdevs_list": [ 00:24:15.326 { 00:24:15.326 "name": "spare", 00:24:15.326 "uuid": "218e4c59-179f-5f6b-951e-50365e59cabc", 00:24:15.326 "is_configured": true, 00:24:15.326 "data_offset": 2048, 00:24:15.326 "data_size": 63488 00:24:15.326 }, 00:24:15.326 { 00:24:15.326 "name": "BaseBdev2", 00:24:15.326 "uuid": "7793f817-f653-5510-905c-812e5d643ffd", 00:24:15.326 "is_configured": true, 00:24:15.326 "data_offset": 2048, 00:24:15.326 "data_size": 63488 00:24:15.326 } 00:24:15.326 ] 00:24:15.326 }' 00:24:15.326 22:53:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:15.326 22:53:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:15.326 22:53:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:15.326 22:53:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:15.326 22:53:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:24:15.326 22:53:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:24:15.326 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:24:15.327 22:53:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:24:15.327 22:53:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:15.327 22:53:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:24:15.327 22:53:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=828 00:24:15.327 22:53:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:15.327 22:53:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:15.327 22:53:00 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:15.327 22:53:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:15.327 22:53:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:15.327 22:53:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:15.327 22:53:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:15.327 22:53:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:15.585 22:53:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:15.585 "name": "raid_bdev1", 00:24:15.585 "uuid": "1de36863-1302-45e6-85e3-2086342b90cc", 00:24:15.585 "strip_size_kb": 0, 00:24:15.585 "state": "online", 00:24:15.585 "raid_level": "raid1", 00:24:15.585 "superblock": true, 00:24:15.585 "num_base_bdevs": 2, 00:24:15.585 "num_base_bdevs_discovered": 2, 00:24:15.585 "num_base_bdevs_operational": 2, 00:24:15.585 "process": { 00:24:15.585 "type": "rebuild", 00:24:15.585 "target": "spare", 00:24:15.585 "progress": { 00:24:15.585 "blocks": 36864, 00:24:15.585 "percent": 58 00:24:15.585 } 00:24:15.585 }, 00:24:15.585 "base_bdevs_list": [ 00:24:15.585 { 00:24:15.585 "name": "spare", 00:24:15.585 "uuid": "218e4c59-179f-5f6b-951e-50365e59cabc", 00:24:15.585 "is_configured": true, 00:24:15.585 "data_offset": 2048, 00:24:15.585 "data_size": 63488 00:24:15.585 }, 00:24:15.585 { 00:24:15.585 "name": "BaseBdev2", 00:24:15.585 "uuid": "7793f817-f653-5510-905c-812e5d643ffd", 00:24:15.585 "is_configured": true, 00:24:15.585 "data_offset": 2048, 00:24:15.585 "data_size": 63488 00:24:15.585 } 00:24:15.585 ] 00:24:15.585 }' 00:24:15.585 22:53:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:24:15.586 22:53:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:15.586 22:53:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:15.586 22:53:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:15.586 22:53:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:16.964 22:53:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:16.964 22:53:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:16.964 22:53:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:16.964 22:53:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:16.964 22:53:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:16.964 22:53:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:16.964 22:53:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:16.964 22:53:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:16.964 [2024-07-15 22:53:01.581398] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:16.964 [2024-07-15 22:53:01.581463] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:16.964 [2024-07-15 22:53:01.581557] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:16.964 22:53:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:16.964 "name": "raid_bdev1", 00:24:16.964 "uuid": 
"1de36863-1302-45e6-85e3-2086342b90cc", 00:24:16.964 "strip_size_kb": 0, 00:24:16.964 "state": "online", 00:24:16.964 "raid_level": "raid1", 00:24:16.964 "superblock": true, 00:24:16.964 "num_base_bdevs": 2, 00:24:16.964 "num_base_bdevs_discovered": 2, 00:24:16.964 "num_base_bdevs_operational": 2, 00:24:16.964 "base_bdevs_list": [ 00:24:16.964 { 00:24:16.964 "name": "spare", 00:24:16.964 "uuid": "218e4c59-179f-5f6b-951e-50365e59cabc", 00:24:16.964 "is_configured": true, 00:24:16.964 "data_offset": 2048, 00:24:16.964 "data_size": 63488 00:24:16.964 }, 00:24:16.964 { 00:24:16.964 "name": "BaseBdev2", 00:24:16.964 "uuid": "7793f817-f653-5510-905c-812e5d643ffd", 00:24:16.964 "is_configured": true, 00:24:16.964 "data_offset": 2048, 00:24:16.964 "data_size": 63488 00:24:16.964 } 00:24:16.964 ] 00:24:16.964 }' 00:24:16.964 22:53:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:16.964 22:53:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:16.964 22:53:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:17.222 22:53:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:17.222 22:53:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:24:17.222 22:53:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:17.222 22:53:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:17.222 22:53:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:17.222 22:53:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:17.222 22:53:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:17.222 22:53:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:17.222 22:53:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:17.481 22:53:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:17.481 "name": "raid_bdev1", 00:24:17.481 "uuid": "1de36863-1302-45e6-85e3-2086342b90cc", 00:24:17.481 "strip_size_kb": 0, 00:24:17.481 "state": "online", 00:24:17.481 "raid_level": "raid1", 00:24:17.481 "superblock": true, 00:24:17.481 "num_base_bdevs": 2, 00:24:17.481 "num_base_bdevs_discovered": 2, 00:24:17.481 "num_base_bdevs_operational": 2, 00:24:17.481 "base_bdevs_list": [ 00:24:17.481 { 00:24:17.481 "name": "spare", 00:24:17.481 "uuid": "218e4c59-179f-5f6b-951e-50365e59cabc", 00:24:17.481 "is_configured": true, 00:24:17.481 "data_offset": 2048, 00:24:17.481 "data_size": 63488 00:24:17.481 }, 00:24:17.481 { 00:24:17.481 "name": "BaseBdev2", 00:24:17.481 "uuid": "7793f817-f653-5510-905c-812e5d643ffd", 00:24:17.481 "is_configured": true, 00:24:17.481 "data_offset": 2048, 00:24:17.481 "data_size": 63488 00:24:17.481 } 00:24:17.481 ] 00:24:17.481 }' 00:24:17.481 22:53:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:17.481 22:53:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:17.481 22:53:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:17.481 22:53:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:17.481 22:53:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:17.481 22:53:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:17.481 22:53:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:24:17.481 22:53:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:17.481 22:53:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:17.481 22:53:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:17.481 22:53:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:17.481 22:53:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:17.481 22:53:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:17.481 22:53:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:17.481 22:53:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:17.481 22:53:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:17.740 22:53:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:17.740 "name": "raid_bdev1", 00:24:17.740 "uuid": "1de36863-1302-45e6-85e3-2086342b90cc", 00:24:17.740 "strip_size_kb": 0, 00:24:17.740 "state": "online", 00:24:17.740 "raid_level": "raid1", 00:24:17.740 "superblock": true, 00:24:17.740 "num_base_bdevs": 2, 00:24:17.740 "num_base_bdevs_discovered": 2, 00:24:17.740 "num_base_bdevs_operational": 2, 00:24:17.740 "base_bdevs_list": [ 00:24:17.740 { 00:24:17.740 "name": "spare", 00:24:17.740 "uuid": "218e4c59-179f-5f6b-951e-50365e59cabc", 00:24:17.740 "is_configured": true, 00:24:17.740 "data_offset": 2048, 00:24:17.740 "data_size": 63488 00:24:17.740 }, 00:24:17.740 { 00:24:17.740 "name": "BaseBdev2", 00:24:17.740 "uuid": "7793f817-f653-5510-905c-812e5d643ffd", 00:24:17.740 "is_configured": true, 00:24:17.740 "data_offset": 2048, 00:24:17.740 
"data_size": 63488 00:24:17.740 } 00:24:17.740 ] 00:24:17.740 }' 00:24:17.740 22:53:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:17.740 22:53:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:18.306 22:53:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:18.870 [2024-07-15 22:53:03.660189] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:18.870 [2024-07-15 22:53:03.660219] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:18.870 [2024-07-15 22:53:03.660291] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:18.870 [2024-07-15 22:53:03.660349] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:18.870 [2024-07-15 22:53:03.660360] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cc4070 name raid_bdev1, state offline 00:24:18.870 22:53:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:18.870 22:53:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:24:19.436 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:19.436 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:19.436 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:24:19.436 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:24:19.436 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:24:19.436 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:19.436 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:19.436 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:19.436 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:19.436 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:24:19.436 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:19.436 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:19.436 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:24:19.695 /dev/nbd0 00:24:19.695 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:19.695 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:19.695 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:19.695 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:24:19.695 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:19.695 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:19.695 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:19.695 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:24:19.695 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:19.695 22:53:04 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:19.695 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:19.695 1+0 records in 00:24:19.695 1+0 records out 00:24:19.695 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000229857 s, 17.8 MB/s 00:24:19.695 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:19.695 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:24:19.695 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:19.695 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:19.695 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:24:19.695 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:19.695 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:19.695 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:19.953 /dev/nbd1 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:19.953 22:53:04 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:19.953 1+0 records in 00:24:19.953 1+0 records out 00:24:19.953 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000337444 s, 12.1 MB/s 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:19.953 22:53:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:20.211 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:20.469 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:20.469 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:20.469 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:20.469 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:20.469 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:20.469 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:20.469 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:20.469 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:20.469 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:20.469 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:20.469 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 
00:24:20.469 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:20.469 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:20.469 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:20.469 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:20.469 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:20.469 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:20.470 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:24:20.470 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:20.727 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:20.985 [2024-07-15 22:53:05.749044] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:20.985 [2024-07-15 22:53:05.749093] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:20.985 [2024-07-15 22:53:05.749114] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cc3500 00:24:20.985 [2024-07-15 22:53:05.749127] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:20.985 [2024-07-15 22:53:05.750760] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:20.985 [2024-07-15 22:53:05.750791] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:20.985 [2024-07-15 22:53:05.750874] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:20.985 [2024-07-15 
22:53:05.750902] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:20.985 [2024-07-15 22:53:05.751014] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:20.985 spare 00:24:20.985 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:20.985 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:20.986 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:20.986 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:20.986 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:20.986 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:20.986 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:20.986 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:20.986 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:20.986 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:20.986 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:20.986 22:53:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:20.986 [2024-07-15 22:53:05.851328] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cc42f0 00:24:20.986 [2024-07-15 22:53:05.851345] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:20.986 [2024-07-15 22:53:05.851544] bdev_raid.c: 251:raid_bdev_create_cb: 
*DEBUG*: raid_bdev_create_cb, 0x1cbcf50 00:24:20.986 [2024-07-15 22:53:05.851690] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cc42f0 00:24:20.986 [2024-07-15 22:53:05.851707] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1cc42f0 00:24:20.986 [2024-07-15 22:53:05.851811] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:21.243 22:53:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:21.243 "name": "raid_bdev1", 00:24:21.243 "uuid": "1de36863-1302-45e6-85e3-2086342b90cc", 00:24:21.243 "strip_size_kb": 0, 00:24:21.243 "state": "online", 00:24:21.243 "raid_level": "raid1", 00:24:21.243 "superblock": true, 00:24:21.243 "num_base_bdevs": 2, 00:24:21.243 "num_base_bdevs_discovered": 2, 00:24:21.243 "num_base_bdevs_operational": 2, 00:24:21.243 "base_bdevs_list": [ 00:24:21.243 { 00:24:21.243 "name": "spare", 00:24:21.243 "uuid": "218e4c59-179f-5f6b-951e-50365e59cabc", 00:24:21.243 "is_configured": true, 00:24:21.243 "data_offset": 2048, 00:24:21.243 "data_size": 63488 00:24:21.243 }, 00:24:21.243 { 00:24:21.243 "name": "BaseBdev2", 00:24:21.243 "uuid": "7793f817-f653-5510-905c-812e5d643ffd", 00:24:21.243 "is_configured": true, 00:24:21.243 "data_offset": 2048, 00:24:21.243 "data_size": 63488 00:24:21.243 } 00:24:21.243 ] 00:24:21.243 }' 00:24:21.243 22:53:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:21.243 22:53:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:21.809 22:53:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:21.809 22:53:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:21.809 22:53:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:21.809 22:53:06 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:21.809 22:53:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:21.809 22:53:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:21.809 22:53:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.067 22:53:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:22.067 "name": "raid_bdev1", 00:24:22.067 "uuid": "1de36863-1302-45e6-85e3-2086342b90cc", 00:24:22.067 "strip_size_kb": 0, 00:24:22.067 "state": "online", 00:24:22.068 "raid_level": "raid1", 00:24:22.068 "superblock": true, 00:24:22.068 "num_base_bdevs": 2, 00:24:22.068 "num_base_bdevs_discovered": 2, 00:24:22.068 "num_base_bdevs_operational": 2, 00:24:22.068 "base_bdevs_list": [ 00:24:22.068 { 00:24:22.068 "name": "spare", 00:24:22.068 "uuid": "218e4c59-179f-5f6b-951e-50365e59cabc", 00:24:22.068 "is_configured": true, 00:24:22.068 "data_offset": 2048, 00:24:22.068 "data_size": 63488 00:24:22.068 }, 00:24:22.068 { 00:24:22.068 "name": "BaseBdev2", 00:24:22.068 "uuid": "7793f817-f653-5510-905c-812e5d643ffd", 00:24:22.068 "is_configured": true, 00:24:22.068 "data_offset": 2048, 00:24:22.068 "data_size": 63488 00:24:22.068 } 00:24:22.068 ] 00:24:22.068 }' 00:24:22.068 22:53:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:22.068 22:53:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:22.068 22:53:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:22.068 22:53:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:22.326 22:53:06 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.326 22:53:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:22.326 22:53:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:24:22.326 22:53:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:22.585 [2024-07-15 22:53:07.445657] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:22.585 22:53:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:22.585 22:53:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:22.585 22:53:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:22.585 22:53:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:22.585 22:53:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:22.585 22:53:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:22.585 22:53:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:22.585 22:53:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:22.585 22:53:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:22.585 22:53:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:22.585 22:53:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.585 22:53:07 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:22.844 22:53:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:22.844 "name": "raid_bdev1", 00:24:22.844 "uuid": "1de36863-1302-45e6-85e3-2086342b90cc", 00:24:22.844 "strip_size_kb": 0, 00:24:22.844 "state": "online", 00:24:22.844 "raid_level": "raid1", 00:24:22.844 "superblock": true, 00:24:22.844 "num_base_bdevs": 2, 00:24:22.844 "num_base_bdevs_discovered": 1, 00:24:22.844 "num_base_bdevs_operational": 1, 00:24:22.844 "base_bdevs_list": [ 00:24:22.844 { 00:24:22.844 "name": null, 00:24:22.844 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:22.844 "is_configured": false, 00:24:22.844 "data_offset": 2048, 00:24:22.844 "data_size": 63488 00:24:22.844 }, 00:24:22.844 { 00:24:22.844 "name": "BaseBdev2", 00:24:22.844 "uuid": "7793f817-f653-5510-905c-812e5d643ffd", 00:24:22.844 "is_configured": true, 00:24:22.844 "data_offset": 2048, 00:24:22.844 "data_size": 63488 00:24:22.844 } 00:24:22.844 ] 00:24:22.844 }' 00:24:22.844 22:53:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:22.844 22:53:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:23.780 22:53:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:23.780 [2024-07-15 22:53:08.572677] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:23.780 [2024-07-15 22:53:08.572829] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:23.780 [2024-07-15 22:53:08.572845] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:24:23.780 [2024-07-15 22:53:08.572874] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:23.780 [2024-07-15 22:53:08.577674] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cc63d0 00:24:23.780 [2024-07-15 22:53:08.580100] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:23.780 22:53:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:24:24.715 22:53:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:24.715 22:53:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:24.715 22:53:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:24.715 22:53:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:24.715 22:53:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:24.715 22:53:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:24.715 22:53:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:24.972 22:53:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:24.972 "name": "raid_bdev1", 00:24:24.972 "uuid": "1de36863-1302-45e6-85e3-2086342b90cc", 00:24:24.972 "strip_size_kb": 0, 00:24:24.972 "state": "online", 00:24:24.972 "raid_level": "raid1", 00:24:24.972 "superblock": true, 00:24:24.972 "num_base_bdevs": 2, 00:24:24.972 "num_base_bdevs_discovered": 2, 00:24:24.972 "num_base_bdevs_operational": 2, 00:24:24.972 "process": { 00:24:24.972 "type": "rebuild", 00:24:24.972 "target": "spare", 00:24:24.972 "progress": { 00:24:24.972 "blocks": 24576, 00:24:24.972 "percent": 38 
00:24:24.972 } 00:24:24.972 }, 00:24:24.972 "base_bdevs_list": [ 00:24:24.972 { 00:24:24.972 "name": "spare", 00:24:24.972 "uuid": "218e4c59-179f-5f6b-951e-50365e59cabc", 00:24:24.972 "is_configured": true, 00:24:24.972 "data_offset": 2048, 00:24:24.972 "data_size": 63488 00:24:24.972 }, 00:24:24.972 { 00:24:24.972 "name": "BaseBdev2", 00:24:24.972 "uuid": "7793f817-f653-5510-905c-812e5d643ffd", 00:24:24.972 "is_configured": true, 00:24:24.972 "data_offset": 2048, 00:24:24.972 "data_size": 63488 00:24:24.972 } 00:24:24.972 ] 00:24:24.972 }' 00:24:24.972 22:53:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:25.229 22:53:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:25.229 22:53:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:25.229 22:53:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:25.229 22:53:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:25.487 [2024-07-15 22:53:10.158353] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:25.487 [2024-07-15 22:53:10.192856] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:25.487 [2024-07-15 22:53:10.192901] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:25.487 [2024-07-15 22:53:10.192915] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:25.487 [2024-07-15 22:53:10.192924] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:25.487 22:53:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:25.487 22:53:10 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:25.487 22:53:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:25.487 22:53:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:25.487 22:53:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:25.487 22:53:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:25.487 22:53:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:25.487 22:53:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:25.487 22:53:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:25.487 22:53:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:25.488 22:53:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:25.488 22:53:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:25.746 22:53:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:25.746 "name": "raid_bdev1", 00:24:25.746 "uuid": "1de36863-1302-45e6-85e3-2086342b90cc", 00:24:25.746 "strip_size_kb": 0, 00:24:25.746 "state": "online", 00:24:25.746 "raid_level": "raid1", 00:24:25.746 "superblock": true, 00:24:25.746 "num_base_bdevs": 2, 00:24:25.746 "num_base_bdevs_discovered": 1, 00:24:25.746 "num_base_bdevs_operational": 1, 00:24:25.746 "base_bdevs_list": [ 00:24:25.746 { 00:24:25.746 "name": null, 00:24:25.746 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:25.746 "is_configured": false, 00:24:25.746 "data_offset": 2048, 00:24:25.746 "data_size": 63488 00:24:25.746 }, 00:24:25.746 { 
00:24:25.746 "name": "BaseBdev2", 00:24:25.746 "uuid": "7793f817-f653-5510-905c-812e5d643ffd", 00:24:25.746 "is_configured": true, 00:24:25.746 "data_offset": 2048, 00:24:25.746 "data_size": 63488 00:24:25.746 } 00:24:25.746 ] 00:24:25.746 }' 00:24:25.746 22:53:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:25.746 22:53:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:26.311 22:53:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:26.569 [2024-07-15 22:53:11.236027] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:26.569 [2024-07-15 22:53:11.236082] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:26.569 [2024-07-15 22:53:11.236110] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cbe650 00:24:26.569 [2024-07-15 22:53:11.236124] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:26.569 [2024-07-15 22:53:11.236510] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:26.569 [2024-07-15 22:53:11.236528] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:26.569 [2024-07-15 22:53:11.236614] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:26.569 [2024-07-15 22:53:11.236626] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:26.569 [2024-07-15 22:53:11.236637] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:24:26.569 [2024-07-15 22:53:11.236657] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:26.569 [2024-07-15 22:53:11.241552] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cc63d0 00:24:26.569 spare 00:24:26.569 [2024-07-15 22:53:11.243010] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:26.569 22:53:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:24:27.503 22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:27.503 22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:27.503 22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:27.503 22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:27.503 22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:27.503 22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:27.503 22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:27.761 22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:27.761 "name": "raid_bdev1", 00:24:27.761 "uuid": "1de36863-1302-45e6-85e3-2086342b90cc", 00:24:27.761 "strip_size_kb": 0, 00:24:27.761 "state": "online", 00:24:27.761 "raid_level": "raid1", 00:24:27.761 "superblock": true, 00:24:27.761 "num_base_bdevs": 2, 00:24:27.761 "num_base_bdevs_discovered": 2, 00:24:27.761 "num_base_bdevs_operational": 2, 00:24:27.761 "process": { 00:24:27.761 "type": "rebuild", 00:24:27.761 "target": "spare", 00:24:27.761 "progress": { 00:24:27.761 "blocks": 24576, 00:24:27.761 
"percent": 38 00:24:27.761 } 00:24:27.761 }, 00:24:27.761 "base_bdevs_list": [ 00:24:27.761 { 00:24:27.761 "name": "spare", 00:24:27.761 "uuid": "218e4c59-179f-5f6b-951e-50365e59cabc", 00:24:27.761 "is_configured": true, 00:24:27.761 "data_offset": 2048, 00:24:27.761 "data_size": 63488 00:24:27.761 }, 00:24:27.761 { 00:24:27.761 "name": "BaseBdev2", 00:24:27.761 "uuid": "7793f817-f653-5510-905c-812e5d643ffd", 00:24:27.761 "is_configured": true, 00:24:27.761 "data_offset": 2048, 00:24:27.761 "data_size": 63488 00:24:27.761 } 00:24:27.761 ] 00:24:27.761 }' 00:24:27.761 22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:27.761 22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:27.761 22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:27.761 22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:27.761 22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:28.019 [2024-07-15 22:53:12.826037] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:28.019 [2024-07-15 22:53:12.855422] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:28.019 [2024-07-15 22:53:12.855466] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:28.019 [2024-07-15 22:53:12.855481] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:28.019 [2024-07-15 22:53:12.855490] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:28.019 22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:28.019 
22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:28.019 22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:28.019 22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:28.019 22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:28.019 22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:28.019 22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:28.019 22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:28.019 22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:28.019 22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:28.019 22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:28.019 22:53:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:28.277 22:53:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:28.277 "name": "raid_bdev1", 00:24:28.277 "uuid": "1de36863-1302-45e6-85e3-2086342b90cc", 00:24:28.277 "strip_size_kb": 0, 00:24:28.277 "state": "online", 00:24:28.277 "raid_level": "raid1", 00:24:28.277 "superblock": true, 00:24:28.277 "num_base_bdevs": 2, 00:24:28.277 "num_base_bdevs_discovered": 1, 00:24:28.277 "num_base_bdevs_operational": 1, 00:24:28.277 "base_bdevs_list": [ 00:24:28.277 { 00:24:28.277 "name": null, 00:24:28.277 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:28.277 "is_configured": false, 00:24:28.277 "data_offset": 2048, 00:24:28.277 "data_size": 63488 00:24:28.277 }, 
00:24:28.277 { 00:24:28.277 "name": "BaseBdev2", 00:24:28.277 "uuid": "7793f817-f653-5510-905c-812e5d643ffd", 00:24:28.277 "is_configured": true, 00:24:28.277 "data_offset": 2048, 00:24:28.277 "data_size": 63488 00:24:28.277 } 00:24:28.277 ] 00:24:28.277 }' 00:24:28.277 22:53:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:28.277 22:53:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:28.844 22:53:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:28.844 22:53:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:28.844 22:53:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:28.844 22:53:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:28.844 22:53:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:28.844 22:53:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:28.844 22:53:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:29.102 22:53:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:29.102 "name": "raid_bdev1", 00:24:29.102 "uuid": "1de36863-1302-45e6-85e3-2086342b90cc", 00:24:29.102 "strip_size_kb": 0, 00:24:29.102 "state": "online", 00:24:29.102 "raid_level": "raid1", 00:24:29.102 "superblock": true, 00:24:29.102 "num_base_bdevs": 2, 00:24:29.102 "num_base_bdevs_discovered": 1, 00:24:29.102 "num_base_bdevs_operational": 1, 00:24:29.102 "base_bdevs_list": [ 00:24:29.102 { 00:24:29.102 "name": null, 00:24:29.102 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:29.102 "is_configured": false, 00:24:29.102 "data_offset": 2048, 
00:24:29.102 "data_size": 63488 00:24:29.102 }, 00:24:29.102 { 00:24:29.102 "name": "BaseBdev2", 00:24:29.102 "uuid": "7793f817-f653-5510-905c-812e5d643ffd", 00:24:29.102 "is_configured": true, 00:24:29.102 "data_offset": 2048, 00:24:29.102 "data_size": 63488 00:24:29.102 } 00:24:29.102 ] 00:24:29.102 }' 00:24:29.102 22:53:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:29.360 22:53:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:29.360 22:53:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:29.360 22:53:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:29.360 22:53:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:29.619 22:53:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:29.880 [2024-07-15 22:53:14.560424] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:29.880 [2024-07-15 22:53:14.560475] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:29.880 [2024-07-15 22:53:14.560496] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cbb600 00:24:29.880 [2024-07-15 22:53:14.560509] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:29.880 [2024-07-15 22:53:14.560862] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:29.880 [2024-07-15 22:53:14.560882] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:29.880 [2024-07-15 22:53:14.560959] bdev_raid.c:3752:raid_bdev_examine_cont: 
*DEBUG*: raid superblock found on bdev BaseBdev1 00:24:29.880 [2024-07-15 22:53:14.560972] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:29.880 [2024-07-15 22:53:14.560983] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:29.880 BaseBdev1 00:24:29.880 22:53:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:24:30.891 22:53:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:30.891 22:53:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:30.891 22:53:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:30.891 22:53:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:30.891 22:53:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:30.891 22:53:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:30.891 22:53:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:30.891 22:53:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:30.891 22:53:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:30.891 22:53:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:30.891 22:53:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:30.891 22:53:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:31.150 22:53:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:24:31.150 "name": "raid_bdev1", 00:24:31.150 "uuid": "1de36863-1302-45e6-85e3-2086342b90cc", 00:24:31.150 "strip_size_kb": 0, 00:24:31.150 "state": "online", 00:24:31.150 "raid_level": "raid1", 00:24:31.150 "superblock": true, 00:24:31.150 "num_base_bdevs": 2, 00:24:31.150 "num_base_bdevs_discovered": 1, 00:24:31.150 "num_base_bdevs_operational": 1, 00:24:31.150 "base_bdevs_list": [ 00:24:31.150 { 00:24:31.150 "name": null, 00:24:31.150 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:31.150 "is_configured": false, 00:24:31.150 "data_offset": 2048, 00:24:31.150 "data_size": 63488 00:24:31.150 }, 00:24:31.150 { 00:24:31.150 "name": "BaseBdev2", 00:24:31.150 "uuid": "7793f817-f653-5510-905c-812e5d643ffd", 00:24:31.150 "is_configured": true, 00:24:31.150 "data_offset": 2048, 00:24:31.150 "data_size": 63488 00:24:31.150 } 00:24:31.150 ] 00:24:31.150 }' 00:24:31.150 22:53:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:31.150 22:53:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:31.717 22:53:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:31.717 22:53:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:31.717 22:53:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:31.717 22:53:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:31.717 22:53:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:31.717 22:53:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:31.717 22:53:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:31.977 22:53:16 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:31.977 "name": "raid_bdev1", 00:24:31.977 "uuid": "1de36863-1302-45e6-85e3-2086342b90cc", 00:24:31.977 "strip_size_kb": 0, 00:24:31.977 "state": "online", 00:24:31.977 "raid_level": "raid1", 00:24:31.977 "superblock": true, 00:24:31.977 "num_base_bdevs": 2, 00:24:31.977 "num_base_bdevs_discovered": 1, 00:24:31.977 "num_base_bdevs_operational": 1, 00:24:31.977 "base_bdevs_list": [ 00:24:31.977 { 00:24:31.977 "name": null, 00:24:31.977 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:31.977 "is_configured": false, 00:24:31.977 "data_offset": 2048, 00:24:31.977 "data_size": 63488 00:24:31.977 }, 00:24:31.977 { 00:24:31.977 "name": "BaseBdev2", 00:24:31.977 "uuid": "7793f817-f653-5510-905c-812e5d643ffd", 00:24:31.977 "is_configured": true, 00:24:31.977 "data_offset": 2048, 00:24:31.977 "data_size": 63488 00:24:31.977 } 00:24:31.977 ] 00:24:31.977 }' 00:24:31.977 22:53:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:31.977 22:53:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:31.977 22:53:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:31.977 22:53:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:31.977 22:53:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:31.977 22:53:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:24:31.977 22:53:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:31.977 22:53:16 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:31.977 22:53:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:31.977 22:53:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:31.977 22:53:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:31.977 22:53:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:31.977 22:53:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:31.977 22:53:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:31.977 22:53:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:31.977 22:53:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:32.236 [2024-07-15 22:53:17.014955] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:32.236 [2024-07-15 22:53:17.015099] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:32.236 [2024-07-15 22:53:17.015114] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:32.236 request: 00:24:32.236 { 00:24:32.236 "base_bdev": "BaseBdev1", 00:24:32.236 "raid_bdev": "raid_bdev1", 00:24:32.236 "method": "bdev_raid_add_base_bdev", 00:24:32.236 
"req_id": 1 00:24:32.236 } 00:24:32.236 Got JSON-RPC error response 00:24:32.236 response: 00:24:32.236 { 00:24:32.236 "code": -22, 00:24:32.236 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:32.236 } 00:24:32.236 22:53:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:24:32.236 22:53:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:24:32.236 22:53:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:24:32.236 22:53:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:24:32.236 22:53:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:24:33.172 22:53:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:33.172 22:53:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:33.172 22:53:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:33.172 22:53:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:33.172 22:53:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:33.172 22:53:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:33.172 22:53:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:33.172 22:53:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:33.172 22:53:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:33.172 22:53:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:33.172 22:53:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.172 22:53:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:33.431 22:53:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:33.431 "name": "raid_bdev1", 00:24:33.431 "uuid": "1de36863-1302-45e6-85e3-2086342b90cc", 00:24:33.431 "strip_size_kb": 0, 00:24:33.431 "state": "online", 00:24:33.431 "raid_level": "raid1", 00:24:33.431 "superblock": true, 00:24:33.431 "num_base_bdevs": 2, 00:24:33.431 "num_base_bdevs_discovered": 1, 00:24:33.431 "num_base_bdevs_operational": 1, 00:24:33.431 "base_bdevs_list": [ 00:24:33.431 { 00:24:33.431 "name": null, 00:24:33.431 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:33.431 "is_configured": false, 00:24:33.431 "data_offset": 2048, 00:24:33.431 "data_size": 63488 00:24:33.431 }, 00:24:33.431 { 00:24:33.431 "name": "BaseBdev2", 00:24:33.431 "uuid": "7793f817-f653-5510-905c-812e5d643ffd", 00:24:33.431 "is_configured": true, 00:24:33.431 "data_offset": 2048, 00:24:33.431 "data_size": 63488 00:24:33.431 } 00:24:33.431 ] 00:24:33.431 }' 00:24:33.431 22:53:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:33.431 22:53:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:34.008 22:53:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:34.008 22:53:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:34.008 22:53:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:34.008 22:53:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:34.009 22:53:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:34.009 22:53:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:34.009 22:53:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:34.267 22:53:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:34.267 "name": "raid_bdev1", 00:24:34.267 "uuid": "1de36863-1302-45e6-85e3-2086342b90cc", 00:24:34.267 "strip_size_kb": 0, 00:24:34.267 "state": "online", 00:24:34.267 "raid_level": "raid1", 00:24:34.267 "superblock": true, 00:24:34.267 "num_base_bdevs": 2, 00:24:34.267 "num_base_bdevs_discovered": 1, 00:24:34.267 "num_base_bdevs_operational": 1, 00:24:34.267 "base_bdevs_list": [ 00:24:34.267 { 00:24:34.267 "name": null, 00:24:34.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:34.267 "is_configured": false, 00:24:34.267 "data_offset": 2048, 00:24:34.267 "data_size": 63488 00:24:34.267 }, 00:24:34.267 { 00:24:34.267 "name": "BaseBdev2", 00:24:34.267 "uuid": "7793f817-f653-5510-905c-812e5d643ffd", 00:24:34.267 "is_configured": true, 00:24:34.267 "data_offset": 2048, 00:24:34.267 "data_size": 63488 00:24:34.267 } 00:24:34.267 ] 00:24:34.267 }' 00:24:34.267 22:53:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:34.526 22:53:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:34.526 22:53:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:34.526 22:53:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:34.526 22:53:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 2811444 00:24:34.526 22:53:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2811444 ']' 00:24:34.526 22:53:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 2811444 00:24:34.526 22:53:19 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:24:34.526 22:53:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:34.526 22:53:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2811444 00:24:34.526 22:53:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:34.526 22:53:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:34.526 22:53:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2811444' 00:24:34.526 killing process with pid 2811444 00:24:34.526 22:53:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 2811444 00:24:34.526 Received shutdown signal, test time was about 60.000000 seconds 00:24:34.526 00:24:34.526 Latency(us) 00:24:34.526 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:34.526 =================================================================================================================== 00:24:34.526 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:34.526 [2024-07-15 22:53:19.291886] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:34.526 [2024-07-15 22:53:19.291995] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:34.526 [2024-07-15 22:53:19.292044] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:34.526 22:53:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 2811444 00:24:34.526 [2024-07-15 22:53:19.292056] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cc42f0 name raid_bdev1, state offline 00:24:34.526 [2024-07-15 22:53:19.323607] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:34.785 22:53:19 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:24:34.785 00:24:34.785 real 0m36.469s 00:24:34.785 user 0m52.676s 00:24:34.785 sys 0m7.306s 00:24:34.785 22:53:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:34.785 22:53:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:34.785 ************************************ 00:24:34.785 END TEST raid_rebuild_test_sb 00:24:34.785 ************************************ 00:24:34.785 22:53:19 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:34.785 22:53:19 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:24:34.785 22:53:19 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:34.785 22:53:19 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:34.786 22:53:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:34.786 ************************************ 00:24:34.786 START TEST raid_rebuild_test_io 00:24:34.786 ************************************ 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false true true 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:34.786 
22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2816589 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2816589 /var/tmp/spdk-raid.sock 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 2816589 ']' 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:34.786 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:34.786 22:53:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:35.045 [2024-07-15 22:53:19.717440] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:24:35.045 [2024-07-15 22:53:19.717511] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2816589 ] 00:24:35.045 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:35.045 Zero copy mechanism will not be used. 
00:24:35.045 [2024-07-15 22:53:19.847862] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:35.303 [2024-07-15 22:53:19.954727] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:35.303 [2024-07-15 22:53:20.027091] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:35.303 [2024-07-15 22:53:20.027127] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:35.870 22:53:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:35.870 22:53:20 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:24:35.870 22:53:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:35.870 22:53:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:36.129 BaseBdev1_malloc 00:24:36.129 22:53:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:36.388 [2024-07-15 22:53:21.124448] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:36.388 [2024-07-15 22:53:21.124495] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:36.388 [2024-07-15 22:53:21.124518] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8ccd40 00:24:36.388 [2024-07-15 22:53:21.124530] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:36.388 [2024-07-15 22:53:21.126249] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:36.388 [2024-07-15 22:53:21.126281] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:36.388 BaseBdev1 
00:24:36.388 22:53:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:36.388 22:53:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:36.647 BaseBdev2_malloc 00:24:36.647 22:53:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:36.906 [2024-07-15 22:53:21.611828] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:36.906 [2024-07-15 22:53:21.611877] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:36.906 [2024-07-15 22:53:21.611901] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8cd860 00:24:36.906 [2024-07-15 22:53:21.611914] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:36.906 [2024-07-15 22:53:21.613508] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:36.906 [2024-07-15 22:53:21.613538] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:36.906 BaseBdev2 00:24:36.906 22:53:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:37.165 spare_malloc 00:24:37.165 22:53:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:37.424 spare_delay 00:24:37.424 22:53:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:37.683 [2024-07-15 22:53:22.343608] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:37.683 [2024-07-15 22:53:22.343653] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:37.683 [2024-07-15 22:53:22.343673] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa7bec0 00:24:37.683 [2024-07-15 22:53:22.343685] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:37.683 [2024-07-15 22:53:22.345270] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:37.683 [2024-07-15 22:53:22.345301] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:37.683 spare 00:24:37.683 22:53:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:37.683 [2024-07-15 22:53:22.584255] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:37.683 [2024-07-15 22:53:22.585559] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:37.683 [2024-07-15 22:53:22.585636] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa7d070 00:24:37.683 [2024-07-15 22:53:22.585652] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:24:37.683 [2024-07-15 22:53:22.585861] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa76490 00:24:37.683 [2024-07-15 22:53:22.586012] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa7d070 00:24:37.683 [2024-07-15 22:53:22.586023] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 
0xa7d070 00:24:37.683 [2024-07-15 22:53:22.586136] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:37.942 22:53:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:37.942 22:53:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:37.942 22:53:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:37.942 22:53:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:37.942 22:53:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:37.942 22:53:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:37.942 22:53:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:37.942 22:53:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:37.942 22:53:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:37.942 22:53:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:37.942 22:53:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:37.942 22:53:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:37.942 22:53:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:37.942 "name": "raid_bdev1", 00:24:37.942 "uuid": "f0dd5274-e5fd-4ad1-b432-a9c7b4210044", 00:24:37.942 "strip_size_kb": 0, 00:24:37.942 "state": "online", 00:24:37.942 "raid_level": "raid1", 00:24:37.942 "superblock": false, 00:24:37.942 "num_base_bdevs": 2, 00:24:37.942 "num_base_bdevs_discovered": 2, 00:24:37.942 "num_base_bdevs_operational": 
2, 00:24:37.942 "base_bdevs_list": [ 00:24:37.942 { 00:24:37.942 "name": "BaseBdev1", 00:24:37.942 "uuid": "906ff77c-4ffa-5657-8414-6383033532fb", 00:24:37.942 "is_configured": true, 00:24:37.942 "data_offset": 0, 00:24:37.942 "data_size": 65536 00:24:37.942 }, 00:24:37.942 { 00:24:37.942 "name": "BaseBdev2", 00:24:37.942 "uuid": "62c89976-002a-59f8-957d-a5d3d373038f", 00:24:37.942 "is_configured": true, 00:24:37.942 "data_offset": 0, 00:24:37.942 "data_size": 65536 00:24:37.942 } 00:24:37.942 ] 00:24:37.942 }' 00:24:37.942 22:53:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:37.942 22:53:22 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:38.881 22:53:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:38.881 22:53:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:38.881 [2024-07-15 22:53:23.707446] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:38.881 22:53:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:24:38.881 22:53:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:38.881 22:53:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:39.140 22:53:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:24:39.140 22:53:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:24:39.140 22:53:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:39.140 22:53:23 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:24:39.399 [2024-07-15 22:53:24.094355] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa77bd0 00:24:39.399 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:39.399 Zero copy mechanism will not be used. 00:24:39.399 Running I/O for 60 seconds... 00:24:39.399 [2024-07-15 22:53:24.208588] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:39.399 [2024-07-15 22:53:24.224741] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xa77bd0 00:24:39.399 22:53:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:39.399 22:53:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:39.399 22:53:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:39.399 22:53:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:39.399 22:53:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:39.399 22:53:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:39.399 22:53:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:39.399 22:53:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:39.399 22:53:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:39.399 22:53:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:39.399 22:53:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:24:39.399 22:53:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:39.657 22:53:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:39.657 "name": "raid_bdev1", 00:24:39.657 "uuid": "f0dd5274-e5fd-4ad1-b432-a9c7b4210044", 00:24:39.657 "strip_size_kb": 0, 00:24:39.657 "state": "online", 00:24:39.657 "raid_level": "raid1", 00:24:39.657 "superblock": false, 00:24:39.657 "num_base_bdevs": 2, 00:24:39.657 "num_base_bdevs_discovered": 1, 00:24:39.657 "num_base_bdevs_operational": 1, 00:24:39.657 "base_bdevs_list": [ 00:24:39.657 { 00:24:39.657 "name": null, 00:24:39.657 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:39.657 "is_configured": false, 00:24:39.657 "data_offset": 0, 00:24:39.657 "data_size": 65536 00:24:39.657 }, 00:24:39.657 { 00:24:39.657 "name": "BaseBdev2", 00:24:39.657 "uuid": "62c89976-002a-59f8-957d-a5d3d373038f", 00:24:39.657 "is_configured": true, 00:24:39.657 "data_offset": 0, 00:24:39.657 "data_size": 65536 00:24:39.657 } 00:24:39.657 ] 00:24:39.657 }' 00:24:39.657 22:53:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:39.657 22:53:24 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:40.594 22:53:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:40.594 [2024-07-15 22:53:25.376851] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:40.594 [2024-07-15 22:53:25.403590] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9ff910 00:24:40.594 [2024-07-15 22:53:25.405966] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:40.594 22:53:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 
00:24:40.852 [2024-07-15 22:53:25.540778] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:40.852 [2024-07-15 22:53:25.541095] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:41.111 [2024-07-15 22:53:25.766909] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:41.370 [2024-07-15 22:53:26.131384] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:41.370 [2024-07-15 22:53:26.131660] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:41.628 [2024-07-15 22:53:26.341348] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:41.628 [2024-07-15 22:53:26.341552] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:41.628 22:53:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:41.628 22:53:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:41.628 22:53:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:41.628 22:53:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:41.628 22:53:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:41.628 22:53:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:41.628 22:53:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:24:41.887 22:53:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:41.887 "name": "raid_bdev1", 00:24:41.887 "uuid": "f0dd5274-e5fd-4ad1-b432-a9c7b4210044", 00:24:41.887 "strip_size_kb": 0, 00:24:41.887 "state": "online", 00:24:41.887 "raid_level": "raid1", 00:24:41.887 "superblock": false, 00:24:41.887 "num_base_bdevs": 2, 00:24:41.887 "num_base_bdevs_discovered": 2, 00:24:41.887 "num_base_bdevs_operational": 2, 00:24:41.887 "process": { 00:24:41.887 "type": "rebuild", 00:24:41.887 "target": "spare", 00:24:41.887 "progress": { 00:24:41.887 "blocks": 12288, 00:24:41.887 "percent": 18 00:24:41.887 } 00:24:41.887 }, 00:24:41.887 "base_bdevs_list": [ 00:24:41.887 { 00:24:41.887 "name": "spare", 00:24:41.887 "uuid": "608ae529-b818-5fcc-88c6-fdf32a325ce7", 00:24:41.887 "is_configured": true, 00:24:41.887 "data_offset": 0, 00:24:41.887 "data_size": 65536 00:24:41.887 }, 00:24:41.887 { 00:24:41.887 "name": "BaseBdev2", 00:24:41.887 "uuid": "62c89976-002a-59f8-957d-a5d3d373038f", 00:24:41.887 "is_configured": true, 00:24:41.887 "data_offset": 0, 00:24:41.887 "data_size": 65536 00:24:41.887 } 00:24:41.887 ] 00:24:41.887 }' 00:24:41.887 22:53:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:41.887 22:53:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:41.887 22:53:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:41.887 [2024-07-15 22:53:26.716723] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:41.887 22:53:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:41.887 22:53:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev spare 00:24:42.146 [2024-07-15 22:53:26.979445] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:42.405 [2024-07-15 22:53:27.194016] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:42.405 [2024-07-15 22:53:27.203719] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:42.405 [2024-07-15 22:53:27.203747] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:42.405 [2024-07-15 22:53:27.203758] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:42.405 [2024-07-15 22:53:27.233757] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xa77bd0 00:24:42.405 22:53:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:42.405 22:53:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:42.405 22:53:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:42.405 22:53:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:42.405 22:53:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:42.405 22:53:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:42.405 22:53:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:42.405 22:53:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:42.405 22:53:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:42.405 22:53:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:42.405 22:53:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:42.405 22:53:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:42.664 22:53:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:42.664 "name": "raid_bdev1", 00:24:42.664 "uuid": "f0dd5274-e5fd-4ad1-b432-a9c7b4210044", 00:24:42.664 "strip_size_kb": 0, 00:24:42.664 "state": "online", 00:24:42.664 "raid_level": "raid1", 00:24:42.664 "superblock": false, 00:24:42.664 "num_base_bdevs": 2, 00:24:42.664 "num_base_bdevs_discovered": 1, 00:24:42.664 "num_base_bdevs_operational": 1, 00:24:42.664 "base_bdevs_list": [ 00:24:42.664 { 00:24:42.664 "name": null, 00:24:42.664 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:42.664 "is_configured": false, 00:24:42.664 "data_offset": 0, 00:24:42.664 "data_size": 65536 00:24:42.664 }, 00:24:42.664 { 00:24:42.664 "name": "BaseBdev2", 00:24:42.664 "uuid": "62c89976-002a-59f8-957d-a5d3d373038f", 00:24:42.664 "is_configured": true, 00:24:42.664 "data_offset": 0, 00:24:42.664 "data_size": 65536 00:24:42.664 } 00:24:42.664 ] 00:24:42.664 }' 00:24:42.664 22:53:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:42.664 22:53:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:43.601 22:53:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:43.601 22:53:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:43.601 22:53:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:43.601 22:53:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:43.601 22:53:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:43.601 22:53:28 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:43.601 22:53:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:43.601 22:53:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:43.601 "name": "raid_bdev1", 00:24:43.601 "uuid": "f0dd5274-e5fd-4ad1-b432-a9c7b4210044", 00:24:43.601 "strip_size_kb": 0, 00:24:43.601 "state": "online", 00:24:43.601 "raid_level": "raid1", 00:24:43.601 "superblock": false, 00:24:43.601 "num_base_bdevs": 2, 00:24:43.601 "num_base_bdevs_discovered": 1, 00:24:43.601 "num_base_bdevs_operational": 1, 00:24:43.601 "base_bdevs_list": [ 00:24:43.601 { 00:24:43.601 "name": null, 00:24:43.601 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:43.601 "is_configured": false, 00:24:43.601 "data_offset": 0, 00:24:43.601 "data_size": 65536 00:24:43.601 }, 00:24:43.601 { 00:24:43.601 "name": "BaseBdev2", 00:24:43.601 "uuid": "62c89976-002a-59f8-957d-a5d3d373038f", 00:24:43.601 "is_configured": true, 00:24:43.601 "data_offset": 0, 00:24:43.601 "data_size": 65536 00:24:43.601 } 00:24:43.601 ] 00:24:43.601 }' 00:24:43.601 22:53:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:43.601 22:53:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:43.601 22:53:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:43.601 22:53:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:43.601 22:53:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:43.860 [2024-07-15 22:53:28.670223] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:43.860 [2024-07-15 22:53:28.720865] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8df230 00:24:43.860 22:53:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:43.860 [2024-07-15 22:53:28.722337] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:44.119 [2024-07-15 22:53:28.845992] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:44.377 [2024-07-15 22:53:29.089745] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:44.377 [2024-07-15 22:53:29.089975] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:45.000 22:53:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:45.000 22:53:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:45.001 22:53:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:45.001 22:53:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:45.001 22:53:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:45.001 22:53:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:45.001 22:53:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:45.001 [2024-07-15 22:53:29.862201] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:45.001 [2024-07-15 22:53:29.862728] 
bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:45.259 22:53:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:45.259 "name": "raid_bdev1", 00:24:45.259 "uuid": "f0dd5274-e5fd-4ad1-b432-a9c7b4210044", 00:24:45.259 "strip_size_kb": 0, 00:24:45.259 "state": "online", 00:24:45.259 "raid_level": "raid1", 00:24:45.259 "superblock": false, 00:24:45.259 "num_base_bdevs": 2, 00:24:45.259 "num_base_bdevs_discovered": 2, 00:24:45.259 "num_base_bdevs_operational": 2, 00:24:45.259 "process": { 00:24:45.259 "type": "rebuild", 00:24:45.259 "target": "spare", 00:24:45.259 "progress": { 00:24:45.259 "blocks": 14336, 00:24:45.259 "percent": 21 00:24:45.259 } 00:24:45.259 }, 00:24:45.259 "base_bdevs_list": [ 00:24:45.259 { 00:24:45.259 "name": "spare", 00:24:45.259 "uuid": "608ae529-b818-5fcc-88c6-fdf32a325ce7", 00:24:45.259 "is_configured": true, 00:24:45.259 "data_offset": 0, 00:24:45.259 "data_size": 65536 00:24:45.259 }, 00:24:45.259 { 00:24:45.259 "name": "BaseBdev2", 00:24:45.259 "uuid": "62c89976-002a-59f8-957d-a5d3d373038f", 00:24:45.259 "is_configured": true, 00:24:45.259 "data_offset": 0, 00:24:45.259 "data_size": 65536 00:24:45.259 } 00:24:45.259 ] 00:24:45.259 }' 00:24:45.259 22:53:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:45.259 22:53:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:45.259 22:53:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:45.259 22:53:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:45.259 22:53:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:24:45.259 22:53:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:24:45.259 22:53:30 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:45.259 22:53:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:24:45.259 22:53:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=858 00:24:45.259 22:53:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:45.259 22:53:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:45.259 22:53:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:45.259 22:53:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:45.259 22:53:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:45.259 22:53:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:45.259 22:53:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:45.259 22:53:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:45.259 [2024-07-15 22:53:30.082730] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:45.259 [2024-07-15 22:53:30.083017] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:45.518 22:53:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:45.518 "name": "raid_bdev1", 00:24:45.518 "uuid": "f0dd5274-e5fd-4ad1-b432-a9c7b4210044", 00:24:45.518 "strip_size_kb": 0, 00:24:45.518 "state": "online", 00:24:45.518 "raid_level": "raid1", 00:24:45.518 "superblock": false, 00:24:45.518 "num_base_bdevs": 2, 00:24:45.518 
"num_base_bdevs_discovered": 2, 00:24:45.518 "num_base_bdevs_operational": 2, 00:24:45.518 "process": { 00:24:45.518 "type": "rebuild", 00:24:45.518 "target": "spare", 00:24:45.518 "progress": { 00:24:45.518 "blocks": 18432, 00:24:45.518 "percent": 28 00:24:45.518 } 00:24:45.518 }, 00:24:45.518 "base_bdevs_list": [ 00:24:45.518 { 00:24:45.518 "name": "spare", 00:24:45.518 "uuid": "608ae529-b818-5fcc-88c6-fdf32a325ce7", 00:24:45.518 "is_configured": true, 00:24:45.518 "data_offset": 0, 00:24:45.518 "data_size": 65536 00:24:45.518 }, 00:24:45.518 { 00:24:45.518 "name": "BaseBdev2", 00:24:45.518 "uuid": "62c89976-002a-59f8-957d-a5d3d373038f", 00:24:45.518 "is_configured": true, 00:24:45.518 "data_offset": 0, 00:24:45.518 "data_size": 65536 00:24:45.518 } 00:24:45.518 ] 00:24:45.518 }' 00:24:45.518 22:53:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:45.518 22:53:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:45.518 22:53:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:45.518 [2024-07-15 22:53:30.417138] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:45.777 22:53:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:45.777 22:53:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:45.777 [2024-07-15 22:53:30.535129] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:24:46.345 [2024-07-15 22:53:31.054173] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:24:46.603 [2024-07-15 22:53:31.393177] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 
00:24:46.603 22:53:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:46.603 22:53:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:46.603 22:53:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:46.603 22:53:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:46.603 22:53:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:46.603 22:53:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:46.603 22:53:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:46.603 22:53:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:46.603 [2024-07-15 22:53:31.502415] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:24:46.862 22:53:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:46.862 "name": "raid_bdev1", 00:24:46.862 "uuid": "f0dd5274-e5fd-4ad1-b432-a9c7b4210044", 00:24:46.862 "strip_size_kb": 0, 00:24:46.862 "state": "online", 00:24:46.862 "raid_level": "raid1", 00:24:46.862 "superblock": false, 00:24:46.862 "num_base_bdevs": 2, 00:24:46.862 "num_base_bdevs_discovered": 2, 00:24:46.862 "num_base_bdevs_operational": 2, 00:24:46.862 "process": { 00:24:46.862 "type": "rebuild", 00:24:46.862 "target": "spare", 00:24:46.862 "progress": { 00:24:46.862 "blocks": 34816, 00:24:46.862 "percent": 53 00:24:46.862 } 00:24:46.862 }, 00:24:46.862 "base_bdevs_list": [ 00:24:46.862 { 00:24:46.862 "name": "spare", 00:24:46.862 "uuid": "608ae529-b818-5fcc-88c6-fdf32a325ce7", 00:24:46.862 "is_configured": true, 
00:24:46.862 "data_offset": 0, 00:24:46.862 "data_size": 65536 00:24:46.862 }, 00:24:46.862 { 00:24:46.862 "name": "BaseBdev2", 00:24:46.862 "uuid": "62c89976-002a-59f8-957d-a5d3d373038f", 00:24:46.862 "is_configured": true, 00:24:46.862 "data_offset": 0, 00:24:46.862 "data_size": 65536 00:24:46.862 } 00:24:46.862 ] 00:24:46.862 }' 00:24:46.862 22:53:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:47.120 22:53:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:47.120 22:53:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:47.120 [2024-07-15 22:53:31.806334] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:24:47.120 22:53:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:47.120 22:53:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:47.120 [2024-07-15 22:53:32.008678] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:24:47.120 [2024-07-15 22:53:32.008814] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:24:47.687 [2024-07-15 22:53:32.357255] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:24:47.945 22:53:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:47.945 22:53:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:47.945 22:53:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:47.945 22:53:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:24:47.945 22:53:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:47.945 22:53:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:47.945 22:53:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:47.945 22:53:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:48.203 22:53:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:48.203 "name": "raid_bdev1", 00:24:48.203 "uuid": "f0dd5274-e5fd-4ad1-b432-a9c7b4210044", 00:24:48.203 "strip_size_kb": 0, 00:24:48.203 "state": "online", 00:24:48.203 "raid_level": "raid1", 00:24:48.203 "superblock": false, 00:24:48.203 "num_base_bdevs": 2, 00:24:48.203 "num_base_bdevs_discovered": 2, 00:24:48.203 "num_base_bdevs_operational": 2, 00:24:48.203 "process": { 00:24:48.203 "type": "rebuild", 00:24:48.203 "target": "spare", 00:24:48.203 "progress": { 00:24:48.203 "blocks": 53248, 00:24:48.203 "percent": 81 00:24:48.203 } 00:24:48.203 }, 00:24:48.203 "base_bdevs_list": [ 00:24:48.203 { 00:24:48.203 "name": "spare", 00:24:48.203 "uuid": "608ae529-b818-5fcc-88c6-fdf32a325ce7", 00:24:48.203 "is_configured": true, 00:24:48.203 "data_offset": 0, 00:24:48.203 "data_size": 65536 00:24:48.203 }, 00:24:48.203 { 00:24:48.203 "name": "BaseBdev2", 00:24:48.203 "uuid": "62c89976-002a-59f8-957d-a5d3d373038f", 00:24:48.203 "is_configured": true, 00:24:48.203 "data_offset": 0, 00:24:48.203 "data_size": 65536 00:24:48.203 } 00:24:48.203 ] 00:24:48.203 }' 00:24:48.203 22:53:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:48.203 22:53:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:48.203 22:53:33 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:48.463 22:53:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:48.463 22:53:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:48.463 [2024-07-15 22:53:33.262783] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:24:48.721 [2024-07-15 22:53:33.600478] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:48.979 [2024-07-15 22:53:33.708714] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:48.979 [2024-07-15 22:53:33.711033] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:49.237 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:49.494 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:49.494 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:49.494 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:49.494 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:49.494 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:49.494 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:49.494 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:49.752 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:49.752 "name": "raid_bdev1", 00:24:49.752 "uuid": 
"f0dd5274-e5fd-4ad1-b432-a9c7b4210044", 00:24:49.752 "strip_size_kb": 0, 00:24:49.752 "state": "online", 00:24:49.752 "raid_level": "raid1", 00:24:49.752 "superblock": false, 00:24:49.752 "num_base_bdevs": 2, 00:24:49.752 "num_base_bdevs_discovered": 2, 00:24:49.752 "num_base_bdevs_operational": 2, 00:24:49.752 "base_bdevs_list": [ 00:24:49.752 { 00:24:49.752 "name": "spare", 00:24:49.752 "uuid": "608ae529-b818-5fcc-88c6-fdf32a325ce7", 00:24:49.752 "is_configured": true, 00:24:49.752 "data_offset": 0, 00:24:49.752 "data_size": 65536 00:24:49.752 }, 00:24:49.752 { 00:24:49.752 "name": "BaseBdev2", 00:24:49.752 "uuid": "62c89976-002a-59f8-957d-a5d3d373038f", 00:24:49.752 "is_configured": true, 00:24:49.752 "data_offset": 0, 00:24:49.752 "data_size": 65536 00:24:49.752 } 00:24:49.752 ] 00:24:49.752 }' 00:24:49.752 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:49.752 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:49.752 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:49.752 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:49.752 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:24:49.752 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:49.752 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:49.752 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:49.752 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:49.752 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:49.752 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:49.752 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:50.010 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:50.010 "name": "raid_bdev1", 00:24:50.010 "uuid": "f0dd5274-e5fd-4ad1-b432-a9c7b4210044", 00:24:50.010 "strip_size_kb": 0, 00:24:50.010 "state": "online", 00:24:50.010 "raid_level": "raid1", 00:24:50.010 "superblock": false, 00:24:50.011 "num_base_bdevs": 2, 00:24:50.011 "num_base_bdevs_discovered": 2, 00:24:50.011 "num_base_bdevs_operational": 2, 00:24:50.011 "base_bdevs_list": [ 00:24:50.011 { 00:24:50.011 "name": "spare", 00:24:50.011 "uuid": "608ae529-b818-5fcc-88c6-fdf32a325ce7", 00:24:50.011 "is_configured": true, 00:24:50.011 "data_offset": 0, 00:24:50.011 "data_size": 65536 00:24:50.011 }, 00:24:50.011 { 00:24:50.011 "name": "BaseBdev2", 00:24:50.011 "uuid": "62c89976-002a-59f8-957d-a5d3d373038f", 00:24:50.011 "is_configured": true, 00:24:50.011 "data_offset": 0, 00:24:50.011 "data_size": 65536 00:24:50.011 } 00:24:50.011 ] 00:24:50.011 }' 00:24:50.011 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:50.011 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:50.011 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:50.011 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:50.011 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:50.011 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:50.011 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:24:50.011 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:50.011 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:50.011 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:50.011 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:50.011 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:50.011 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:50.011 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:50.011 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:50.011 22:53:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:50.269 22:53:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:50.269 "name": "raid_bdev1", 00:24:50.269 "uuid": "f0dd5274-e5fd-4ad1-b432-a9c7b4210044", 00:24:50.269 "strip_size_kb": 0, 00:24:50.269 "state": "online", 00:24:50.269 "raid_level": "raid1", 00:24:50.269 "superblock": false, 00:24:50.269 "num_base_bdevs": 2, 00:24:50.269 "num_base_bdevs_discovered": 2, 00:24:50.269 "num_base_bdevs_operational": 2, 00:24:50.269 "base_bdevs_list": [ 00:24:50.269 { 00:24:50.269 "name": "spare", 00:24:50.269 "uuid": "608ae529-b818-5fcc-88c6-fdf32a325ce7", 00:24:50.269 "is_configured": true, 00:24:50.269 "data_offset": 0, 00:24:50.269 "data_size": 65536 00:24:50.269 }, 00:24:50.269 { 00:24:50.269 "name": "BaseBdev2", 00:24:50.269 "uuid": "62c89976-002a-59f8-957d-a5d3d373038f", 00:24:50.269 "is_configured": true, 00:24:50.269 "data_offset": 0, 00:24:50.269 
"data_size": 65536 00:24:50.269 } 00:24:50.269 ] 00:24:50.269 }' 00:24:50.269 22:53:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:50.269 22:53:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:51.204 22:53:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:51.205 [2024-07-15 22:53:36.037752] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:51.205 [2024-07-15 22:53:36.037784] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:51.462 00:24:51.462 Latency(us) 00:24:51.462 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:51.462 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:24:51.462 raid_bdev1 : 12.01 95.32 285.97 0.00 0.00 13923.57 290.28 119446.48 00:24:51.462 =================================================================================================================== 00:24:51.462 Total : 95.32 285.97 0.00 0.00 13923.57 290.28 119446.48 00:24:51.462 [2024-07-15 22:53:36.141991] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:51.462 [2024-07-15 22:53:36.142019] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:51.462 [2024-07-15 22:53:36.142093] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:51.462 [2024-07-15 22:53:36.142105] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa7d070 name raid_bdev1, state offline 00:24:51.462 0 00:24:51.462 22:53:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:51.462 22:53:36 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:24:51.719 22:53:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:51.719 22:53:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:51.719 22:53:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:24:51.719 22:53:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:24:51.719 22:53:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:51.719 22:53:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:24:51.719 22:53:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:51.719 22:53:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:51.719 22:53:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:51.719 22:53:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:24:51.719 22:53:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:51.719 22:53:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:51.719 22:53:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:24:52.283 /dev/nbd0 00:24:52.283 22:53:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:52.283 22:53:36 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:52.283 22:53:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:52.283 22:53:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:24:52.283 22:53:36 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:52.283 22:53:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:52.283 22:53:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:52.283 22:53:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:24:52.283 22:53:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:52.283 22:53:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:52.283 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:52.283 1+0 records in 00:24:52.283 1+0 records out 00:24:52.283 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253106 s, 16.2 MB/s 00:24:52.283 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:52.283 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:24:52.283 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:52.283 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:52.283 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:24:52.283 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:52.283 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:52.283 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:24:52.283 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 
']' 00:24:52.283 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:24:52.283 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:52.283 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:24:52.283 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:52.283 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:24:52.283 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:52.283 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:24:52.283 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:52.283 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:52.283 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:24:52.541 /dev/nbd1 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@871 -- # break 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:52.541 1+0 records in 00:24:52.541 1+0 records out 00:24:52.541 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000292393 s, 14.0 MB/s 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 
00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:52.541 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:52.800 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:52.800 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:52.800 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:52.800 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:52.800 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:52.800 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:52.800 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:24:52.800 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:52.800 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:52.800 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:52.800 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:52.800 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:52.800 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:24:52.800 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:52.800 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:53.057 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:53.057 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:53.057 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:53.057 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:53.057 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:53.057 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:53.057 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:24:53.057 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:53.057 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:24:53.057 22:53:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 2816589 00:24:53.057 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 2816589 ']' 00:24:53.057 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 2816589 00:24:53.057 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:24:53.057 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:53.057 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2816589 00:24:53.057 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:53.057 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:53.057 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 
'killing process with pid 2816589' 00:24:53.057 killing process with pid 2816589 00:24:53.057 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 2816589 00:24:53.057 Received shutdown signal, test time was about 13.783812 seconds 00:24:53.057 00:24:53.057 Latency(us) 00:24:53.057 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:53.057 =================================================================================================================== 00:24:53.057 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:53.057 [2024-07-15 22:53:37.914185] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:53.057 22:53:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 2816589 00:24:53.057 [2024-07-15 22:53:37.935632] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:53.315 22:53:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:24:53.315 00:24:53.315 real 0m18.514s 00:24:53.315 user 0m28.322s 00:24:53.315 sys 0m2.854s 00:24:53.315 22:53:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:53.315 22:53:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:53.315 ************************************ 00:24:53.315 END TEST raid_rebuild_test_io 00:24:53.315 ************************************ 00:24:53.315 22:53:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:53.315 22:53:38 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:24:53.315 22:53:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:53.315 22:53:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:53.315 22:53:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:53.574 ************************************ 00:24:53.574 START TEST raid_rebuild_test_sb_io 00:24:53.574 
************************************ 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true true true 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 
00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2819211 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2819211 /var/tmp/spdk-raid.sock 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 2819211 ']' 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:53.574 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:53.574 22:53:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:53.574 [2024-07-15 22:53:38.322714] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:24:53.574 [2024-07-15 22:53:38.322796] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2819211 ] 00:24:53.574 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:53.574 Zero copy mechanism will not be used. 00:24:53.574 [2024-07-15 22:53:38.464986] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:53.834 [2024-07-15 22:53:38.572278] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:53.834 [2024-07-15 22:53:38.636001] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:53.834 [2024-07-15 22:53:38.636039] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:54.402 22:53:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:54.402 22:53:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:24:54.402 22:53:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:54.402 22:53:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:54.995 BaseBdev1_malloc 00:24:54.995 22:53:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p 
BaseBdev1 00:24:55.254 [2024-07-15 22:53:40.098013] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:55.254 [2024-07-15 22:53:40.098066] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:55.254 [2024-07-15 22:53:40.098090] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xde4d40 00:24:55.254 [2024-07-15 22:53:40.098103] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:55.254 [2024-07-15 22:53:40.099947] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:55.254 [2024-07-15 22:53:40.099979] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:55.254 BaseBdev1 00:24:55.254 22:53:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:55.254 22:53:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:55.513 BaseBdev2_malloc 00:24:55.514 22:53:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:55.773 [2024-07-15 22:53:40.641533] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:55.773 [2024-07-15 22:53:40.641582] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:55.773 [2024-07-15 22:53:40.641606] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xde5860 00:24:55.773 [2024-07-15 22:53:40.641618] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:55.773 [2024-07-15 22:53:40.643204] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:55.773 [2024-07-15 
22:53:40.643234] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:55.773 BaseBdev2 00:24:55.773 22:53:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:56.342 spare_malloc 00:24:56.342 22:53:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:56.602 spare_delay 00:24:56.602 22:53:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:56.861 [2024-07-15 22:53:41.696919] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:56.861 [2024-07-15 22:53:41.696968] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:56.861 [2024-07-15 22:53:41.696989] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf93ec0 00:24:56.861 [2024-07-15 22:53:41.697002] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:56.861 [2024-07-15 22:53:41.698569] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:56.861 [2024-07-15 22:53:41.698600] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:56.861 spare 00:24:56.861 22:53:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:57.120 [2024-07-15 22:53:41.985709] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:57.120 
[2024-07-15 22:53:41.987003] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:57.120 [2024-07-15 22:53:41.987172] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf95070 00:24:57.120 [2024-07-15 22:53:41.987185] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:57.120 [2024-07-15 22:53:41.987379] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf8e490 00:24:57.120 [2024-07-15 22:53:41.987522] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf95070 00:24:57.120 [2024-07-15 22:53:41.987532] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf95070 00:24:57.120 [2024-07-15 22:53:41.987630] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:57.120 22:53:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:57.120 22:53:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:57.120 22:53:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:57.120 22:53:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:57.120 22:53:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:57.120 22:53:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:57.120 22:53:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:57.120 22:53:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:57.120 22:53:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:57.120 22:53:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 
00:24:57.120 22:53:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:57.120 22:53:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:57.378 22:53:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:57.378 "name": "raid_bdev1", 00:24:57.378 "uuid": "d1ea1ab6-03a6-442a-a93c-35f46e0c1f74", 00:24:57.378 "strip_size_kb": 0, 00:24:57.378 "state": "online", 00:24:57.378 "raid_level": "raid1", 00:24:57.378 "superblock": true, 00:24:57.378 "num_base_bdevs": 2, 00:24:57.378 "num_base_bdevs_discovered": 2, 00:24:57.378 "num_base_bdevs_operational": 2, 00:24:57.378 "base_bdevs_list": [ 00:24:57.378 { 00:24:57.378 "name": "BaseBdev1", 00:24:57.378 "uuid": "9eae2f40-e2a8-5a90-a270-b412b235d75c", 00:24:57.378 "is_configured": true, 00:24:57.378 "data_offset": 2048, 00:24:57.378 "data_size": 63488 00:24:57.378 }, 00:24:57.378 { 00:24:57.378 "name": "BaseBdev2", 00:24:57.378 "uuid": "dde473bd-cf14-5f60-95f6-6fbd2a1d3502", 00:24:57.378 "is_configured": true, 00:24:57.378 "data_offset": 2048, 00:24:57.378 "data_size": 63488 00:24:57.378 } 00:24:57.378 ] 00:24:57.378 }' 00:24:57.378 22:53:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:57.378 22:53:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:58.311 22:53:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:58.311 22:53:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:58.568 [2024-07-15 22:53:43.345520] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:58.568 22:53:43 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:24:58.568 22:53:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:58.568 22:53:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:58.826 22:53:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:24:58.826 22:53:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:24:58.826 22:53:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:58.826 22:53:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:24:59.085 [2024-07-15 22:53:43.836659] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf95c50 00:24:59.085 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:59.085 Zero copy mechanism will not be used. 00:24:59.085 Running I/O for 60 seconds... 
00:24:59.085 [2024-07-15 22:53:43.858439] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:59.085 [2024-07-15 22:53:43.858635] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xf95c50 00:24:59.085 22:53:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:59.085 22:53:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:59.085 22:53:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:59.085 22:53:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:59.086 22:53:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:59.086 22:53:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:59.086 22:53:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:59.086 22:53:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:59.086 22:53:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:59.086 22:53:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:59.086 22:53:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:59.086 22:53:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:59.407 22:53:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:59.407 "name": "raid_bdev1", 00:24:59.407 "uuid": "d1ea1ab6-03a6-442a-a93c-35f46e0c1f74", 00:24:59.407 "strip_size_kb": 0, 00:24:59.407 "state": "online", 00:24:59.407 "raid_level": "raid1", 
00:24:59.407 "superblock": true, 00:24:59.407 "num_base_bdevs": 2, 00:24:59.407 "num_base_bdevs_discovered": 1, 00:24:59.407 "num_base_bdevs_operational": 1, 00:24:59.407 "base_bdevs_list": [ 00:24:59.407 { 00:24:59.407 "name": null, 00:24:59.407 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:59.407 "is_configured": false, 00:24:59.407 "data_offset": 2048, 00:24:59.407 "data_size": 63488 00:24:59.407 }, 00:24:59.407 { 00:24:59.407 "name": "BaseBdev2", 00:24:59.407 "uuid": "dde473bd-cf14-5f60-95f6-6fbd2a1d3502", 00:24:59.407 "is_configured": true, 00:24:59.407 "data_offset": 2048, 00:24:59.407 "data_size": 63488 00:24:59.407 } 00:24:59.407 ] 00:24:59.407 }' 00:24:59.407 22:53:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:59.407 22:53:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:59.976 22:53:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:00.236 [2024-07-15 22:53:44.932110] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:00.236 22:53:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:00.236 [2024-07-15 22:53:44.991778] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf01230 00:25:00.236 [2024-07-15 22:53:44.994187] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:00.236 [2024-07-15 22:53:45.124402] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:00.495 [2024-07-15 22:53:45.252074] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:00.495 [2024-07-15 22:53:45.252337] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 
4096 offset_begin: 0 offset_end: 6144 00:25:00.753 [2024-07-15 22:53:45.658176] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:01.321 22:53:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:01.321 22:53:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:01.321 22:53:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:01.321 22:53:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:01.321 22:53:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:01.321 [2024-07-15 22:53:45.974055] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:01.321 22:53:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:01.321 22:53:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:01.321 [2024-07-15 22:53:46.093215] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:01.581 22:53:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:01.581 "name": "raid_bdev1", 00:25:01.581 "uuid": "d1ea1ab6-03a6-442a-a93c-35f46e0c1f74", 00:25:01.581 "strip_size_kb": 0, 00:25:01.581 "state": "online", 00:25:01.581 "raid_level": "raid1", 00:25:01.581 "superblock": true, 00:25:01.581 "num_base_bdevs": 2, 00:25:01.581 "num_base_bdevs_discovered": 2, 00:25:01.581 "num_base_bdevs_operational": 2, 00:25:01.581 "process": { 00:25:01.581 "type": "rebuild", 00:25:01.581 "target": "spare", 00:25:01.581 
"progress": { 00:25:01.581 "blocks": 16384, 00:25:01.581 "percent": 25 00:25:01.581 } 00:25:01.581 }, 00:25:01.581 "base_bdevs_list": [ 00:25:01.581 { 00:25:01.581 "name": "spare", 00:25:01.581 "uuid": "efcd2890-11c1-5c41-bd0a-76817361f347", 00:25:01.581 "is_configured": true, 00:25:01.581 "data_offset": 2048, 00:25:01.581 "data_size": 63488 00:25:01.581 }, 00:25:01.581 { 00:25:01.581 "name": "BaseBdev2", 00:25:01.581 "uuid": "dde473bd-cf14-5f60-95f6-6fbd2a1d3502", 00:25:01.581 "is_configured": true, 00:25:01.581 "data_offset": 2048, 00:25:01.581 "data_size": 63488 00:25:01.581 } 00:25:01.581 ] 00:25:01.581 }' 00:25:01.581 22:53:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:01.581 22:53:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:01.581 22:53:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:01.581 22:53:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:01.581 22:53:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:01.581 [2024-07-15 22:53:46.442665] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:25:01.840 [2024-07-15 22:53:46.560763] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:01.840 [2024-07-15 22:53:46.578160] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:25:01.840 [2024-07-15 22:53:46.695887] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:01.840 [2024-07-15 22:53:46.714434] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
00:25:01.840 [2024-07-15 22:53:46.714462] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:01.840 [2024-07-15 22:53:46.714473] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:01.840 [2024-07-15 22:53:46.745172] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xf95c50 00:25:02.097 22:53:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:02.097 22:53:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:02.097 22:53:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:02.097 22:53:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:02.097 22:53:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:02.097 22:53:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:02.097 22:53:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:02.098 22:53:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:02.098 22:53:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:02.098 22:53:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:02.098 22:53:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:02.098 22:53:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:02.098 22:53:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:02.098 "name": "raid_bdev1", 
00:25:02.098 "uuid": "d1ea1ab6-03a6-442a-a93c-35f46e0c1f74", 00:25:02.098 "strip_size_kb": 0, 00:25:02.098 "state": "online", 00:25:02.098 "raid_level": "raid1", 00:25:02.098 "superblock": true, 00:25:02.098 "num_base_bdevs": 2, 00:25:02.098 "num_base_bdevs_discovered": 1, 00:25:02.098 "num_base_bdevs_operational": 1, 00:25:02.098 "base_bdevs_list": [ 00:25:02.098 { 00:25:02.098 "name": null, 00:25:02.098 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:02.098 "is_configured": false, 00:25:02.098 "data_offset": 2048, 00:25:02.098 "data_size": 63488 00:25:02.098 }, 00:25:02.098 { 00:25:02.098 "name": "BaseBdev2", 00:25:02.098 "uuid": "dde473bd-cf14-5f60-95f6-6fbd2a1d3502", 00:25:02.098 "is_configured": true, 00:25:02.098 "data_offset": 2048, 00:25:02.098 "data_size": 63488 00:25:02.098 } 00:25:02.098 ] 00:25:02.098 }' 00:25:02.098 22:53:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:02.098 22:53:46 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:03.034 22:53:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:03.034 22:53:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:03.034 22:53:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:03.034 22:53:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:03.034 22:53:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:03.034 22:53:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:03.034 22:53:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:03.034 22:53:47 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:03.034 "name": "raid_bdev1", 00:25:03.034 "uuid": "d1ea1ab6-03a6-442a-a93c-35f46e0c1f74", 00:25:03.034 "strip_size_kb": 0, 00:25:03.034 "state": "online", 00:25:03.034 "raid_level": "raid1", 00:25:03.034 "superblock": true, 00:25:03.034 "num_base_bdevs": 2, 00:25:03.034 "num_base_bdevs_discovered": 1, 00:25:03.034 "num_base_bdevs_operational": 1, 00:25:03.034 "base_bdevs_list": [ 00:25:03.034 { 00:25:03.034 "name": null, 00:25:03.034 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:03.034 "is_configured": false, 00:25:03.034 "data_offset": 2048, 00:25:03.034 "data_size": 63488 00:25:03.034 }, 00:25:03.034 { 00:25:03.034 "name": "BaseBdev2", 00:25:03.034 "uuid": "dde473bd-cf14-5f60-95f6-6fbd2a1d3502", 00:25:03.034 "is_configured": true, 00:25:03.034 "data_offset": 2048, 00:25:03.034 "data_size": 63488 00:25:03.034 } 00:25:03.034 ] 00:25:03.034 }' 00:25:03.034 22:53:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:03.034 22:53:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:03.034 22:53:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:03.034 22:53:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:03.034 22:53:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:03.293 [2024-07-15 22:53:48.188147] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:03.552 22:53:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:03.552 [2024-07-15 22:53:48.248255] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf95e60 00:25:03.552 [2024-07-15 22:53:48.249802] 
bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:03.552 [2024-07-15 22:53:48.376975] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:03.552 [2024-07-15 22:53:48.377322] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:03.810 [2024-07-15 22:53:48.605332] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:03.810 [2024-07-15 22:53:48.605603] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:04.069 [2024-07-15 22:53:48.948215] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:04.069 [2024-07-15 22:53:48.948526] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:04.328 [2024-07-15 22:53:49.058916] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:04.586 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:04.587 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:04.587 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:04.587 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:04.587 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:04.587 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:25:04.587 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:04.587 [2024-07-15 22:53:49.383580] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:04.846 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:04.846 "name": "raid_bdev1", 00:25:04.846 "uuid": "d1ea1ab6-03a6-442a-a93c-35f46e0c1f74", 00:25:04.846 "strip_size_kb": 0, 00:25:04.846 "state": "online", 00:25:04.846 "raid_level": "raid1", 00:25:04.846 "superblock": true, 00:25:04.846 "num_base_bdevs": 2, 00:25:04.846 "num_base_bdevs_discovered": 2, 00:25:04.846 "num_base_bdevs_operational": 2, 00:25:04.846 "process": { 00:25:04.846 "type": "rebuild", 00:25:04.846 "target": "spare", 00:25:04.846 "progress": { 00:25:04.846 "blocks": 14336, 00:25:04.846 "percent": 22 00:25:04.846 } 00:25:04.846 }, 00:25:04.846 "base_bdevs_list": [ 00:25:04.846 { 00:25:04.846 "name": "spare", 00:25:04.846 "uuid": "efcd2890-11c1-5c41-bd0a-76817361f347", 00:25:04.846 "is_configured": true, 00:25:04.846 "data_offset": 2048, 00:25:04.846 "data_size": 63488 00:25:04.846 }, 00:25:04.846 { 00:25:04.846 "name": "BaseBdev2", 00:25:04.846 "uuid": "dde473bd-cf14-5f60-95f6-6fbd2a1d3502", 00:25:04.846 "is_configured": true, 00:25:04.846 "data_offset": 2048, 00:25:04.846 "data_size": 63488 00:25:04.846 } 00:25:04.846 ] 00:25:04.846 }' 00:25:04.846 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:04.846 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:04.847 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:04.847 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:04.847 22:53:49 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:25:04.847 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:25:04.847 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:25:04.847 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:25:04.847 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:04.847 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:25:04.847 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=877 00:25:04.847 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:04.847 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:04.847 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:04.847 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:04.847 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:04.847 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:04.847 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:04.847 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:04.847 [2024-07-15 22:53:49.604458] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:05.106 [2024-07-15 22:53:49.835046] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: 
process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:25:05.106 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:05.106 "name": "raid_bdev1", 00:25:05.106 "uuid": "d1ea1ab6-03a6-442a-a93c-35f46e0c1f74", 00:25:05.106 "strip_size_kb": 0, 00:25:05.106 "state": "online", 00:25:05.106 "raid_level": "raid1", 00:25:05.106 "superblock": true, 00:25:05.106 "num_base_bdevs": 2, 00:25:05.106 "num_base_bdevs_discovered": 2, 00:25:05.106 "num_base_bdevs_operational": 2, 00:25:05.106 "process": { 00:25:05.106 "type": "rebuild", 00:25:05.106 "target": "spare", 00:25:05.106 "progress": { 00:25:05.106 "blocks": 18432, 00:25:05.106 "percent": 29 00:25:05.106 } 00:25:05.106 }, 00:25:05.106 "base_bdevs_list": [ 00:25:05.106 { 00:25:05.106 "name": "spare", 00:25:05.106 "uuid": "efcd2890-11c1-5c41-bd0a-76817361f347", 00:25:05.106 "is_configured": true, 00:25:05.106 "data_offset": 2048, 00:25:05.106 "data_size": 63488 00:25:05.106 }, 00:25:05.106 { 00:25:05.106 "name": "BaseBdev2", 00:25:05.106 "uuid": "dde473bd-cf14-5f60-95f6-6fbd2a1d3502", 00:25:05.106 "is_configured": true, 00:25:05.106 "data_offset": 2048, 00:25:05.106 "data_size": 63488 00:25:05.106 } 00:25:05.106 ] 00:25:05.106 }' 00:25:05.106 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:05.106 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:05.106 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:05.106 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:05.106 22:53:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:05.365 [2024-07-15 22:53:50.062595] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:25:05.934 [2024-07-15 
22:53:50.788768] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:25:06.192 22:53:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:06.192 22:53:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:06.192 22:53:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:06.192 22:53:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:06.192 22:53:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:06.192 22:53:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:06.192 22:53:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:06.192 22:53:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:06.192 [2024-07-15 22:53:51.007574] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:25:06.451 22:53:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:06.451 "name": "raid_bdev1", 00:25:06.451 "uuid": "d1ea1ab6-03a6-442a-a93c-35f46e0c1f74", 00:25:06.451 "strip_size_kb": 0, 00:25:06.451 "state": "online", 00:25:06.451 "raid_level": "raid1", 00:25:06.451 "superblock": true, 00:25:06.451 "num_base_bdevs": 2, 00:25:06.451 "num_base_bdevs_discovered": 2, 00:25:06.451 "num_base_bdevs_operational": 2, 00:25:06.451 "process": { 00:25:06.451 "type": "rebuild", 00:25:06.451 "target": "spare", 00:25:06.451 "progress": { 00:25:06.451 "blocks": 34816, 00:25:06.451 "percent": 54 00:25:06.452 } 00:25:06.452 }, 00:25:06.452 
"base_bdevs_list": [ 00:25:06.452 { 00:25:06.452 "name": "spare", 00:25:06.452 "uuid": "efcd2890-11c1-5c41-bd0a-76817361f347", 00:25:06.452 "is_configured": true, 00:25:06.452 "data_offset": 2048, 00:25:06.452 "data_size": 63488 00:25:06.452 }, 00:25:06.452 { 00:25:06.452 "name": "BaseBdev2", 00:25:06.452 "uuid": "dde473bd-cf14-5f60-95f6-6fbd2a1d3502", 00:25:06.452 "is_configured": true, 00:25:06.452 "data_offset": 2048, 00:25:06.452 "data_size": 63488 00:25:06.452 } 00:25:06.452 ] 00:25:06.452 }' 00:25:06.452 22:53:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:06.452 22:53:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:06.452 22:53:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:06.452 22:53:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:06.452 22:53:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:06.452 [2024-07-15 22:53:51.344680] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:25:06.710 [2024-07-15 22:53:51.564649] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:25:07.278 [2024-07-15 22:53:51.930095] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:25:07.538 [2024-07-15 22:53:52.269122] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:25:07.538 22:53:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:07.538 22:53:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:07.538 22:53:52 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:07.538 22:53:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:07.538 22:53:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:07.538 22:53:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:07.538 22:53:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:07.538 22:53:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:07.538 [2024-07-15 22:53:52.378884] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:25:07.798 22:53:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:07.798 "name": "raid_bdev1", 00:25:07.798 "uuid": "d1ea1ab6-03a6-442a-a93c-35f46e0c1f74", 00:25:07.798 "strip_size_kb": 0, 00:25:07.798 "state": "online", 00:25:07.798 "raid_level": "raid1", 00:25:07.798 "superblock": true, 00:25:07.798 "num_base_bdevs": 2, 00:25:07.798 "num_base_bdevs_discovered": 2, 00:25:07.798 "num_base_bdevs_operational": 2, 00:25:07.798 "process": { 00:25:07.798 "type": "rebuild", 00:25:07.798 "target": "spare", 00:25:07.798 "progress": { 00:25:07.798 "blocks": 53248, 00:25:07.798 "percent": 83 00:25:07.798 } 00:25:07.798 }, 00:25:07.798 "base_bdevs_list": [ 00:25:07.798 { 00:25:07.798 "name": "spare", 00:25:07.798 "uuid": "efcd2890-11c1-5c41-bd0a-76817361f347", 00:25:07.798 "is_configured": true, 00:25:07.798 "data_offset": 2048, 00:25:07.798 "data_size": 63488 00:25:07.798 }, 00:25:07.798 { 00:25:07.798 "name": "BaseBdev2", 00:25:07.798 "uuid": "dde473bd-cf14-5f60-95f6-6fbd2a1d3502", 00:25:07.798 "is_configured": true, 00:25:07.798 
"data_offset": 2048, 00:25:07.798 "data_size": 63488 00:25:07.798 } 00:25:07.798 ] 00:25:07.798 }' 00:25:07.798 22:53:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:07.798 22:53:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:07.798 22:53:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:07.798 22:53:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:07.798 22:53:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:08.058 [2024-07-15 22:53:52.826312] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:25:08.317 [2024-07-15 22:53:53.155967] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:08.575 [2024-07-15 22:53:53.264220] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:08.575 [2024-07-15 22:53:53.266812] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:08.832 22:53:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:08.832 22:53:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:08.832 22:53:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:08.832 22:53:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:08.832 22:53:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:08.832 22:53:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:08.832 22:53:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:08.832 22:53:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:09.090 22:53:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:09.090 "name": "raid_bdev1", 00:25:09.090 "uuid": "d1ea1ab6-03a6-442a-a93c-35f46e0c1f74", 00:25:09.090 "strip_size_kb": 0, 00:25:09.090 "state": "online", 00:25:09.090 "raid_level": "raid1", 00:25:09.090 "superblock": true, 00:25:09.091 "num_base_bdevs": 2, 00:25:09.091 "num_base_bdevs_discovered": 2, 00:25:09.091 "num_base_bdevs_operational": 2, 00:25:09.091 "base_bdevs_list": [ 00:25:09.091 { 00:25:09.091 "name": "spare", 00:25:09.091 "uuid": "efcd2890-11c1-5c41-bd0a-76817361f347", 00:25:09.091 "is_configured": true, 00:25:09.091 "data_offset": 2048, 00:25:09.091 "data_size": 63488 00:25:09.091 }, 00:25:09.091 { 00:25:09.091 "name": "BaseBdev2", 00:25:09.091 "uuid": "dde473bd-cf14-5f60-95f6-6fbd2a1d3502", 00:25:09.091 "is_configured": true, 00:25:09.091 "data_offset": 2048, 00:25:09.091 "data_size": 63488 00:25:09.091 } 00:25:09.091 ] 00:25:09.091 }' 00:25:09.091 22:53:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:09.091 22:53:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:09.091 22:53:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:09.349 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:09.349 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:25:09.349 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:09.349 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:25:09.349 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:09.349 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:09.349 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:09.349 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:09.349 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:09.607 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:09.607 "name": "raid_bdev1", 00:25:09.607 "uuid": "d1ea1ab6-03a6-442a-a93c-35f46e0c1f74", 00:25:09.607 "strip_size_kb": 0, 00:25:09.607 "state": "online", 00:25:09.607 "raid_level": "raid1", 00:25:09.607 "superblock": true, 00:25:09.607 "num_base_bdevs": 2, 00:25:09.607 "num_base_bdevs_discovered": 2, 00:25:09.607 "num_base_bdevs_operational": 2, 00:25:09.607 "base_bdevs_list": [ 00:25:09.607 { 00:25:09.607 "name": "spare", 00:25:09.607 "uuid": "efcd2890-11c1-5c41-bd0a-76817361f347", 00:25:09.607 "is_configured": true, 00:25:09.607 "data_offset": 2048, 00:25:09.607 "data_size": 63488 00:25:09.607 }, 00:25:09.607 { 00:25:09.607 "name": "BaseBdev2", 00:25:09.607 "uuid": "dde473bd-cf14-5f60-95f6-6fbd2a1d3502", 00:25:09.607 "is_configured": true, 00:25:09.607 "data_offset": 2048, 00:25:09.607 "data_size": 63488 00:25:09.607 } 00:25:09.607 ] 00:25:09.607 }' 00:25:09.607 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:09.607 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:09.607 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 
00:25:09.607 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:09.607 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:09.607 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:09.607 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:09.607 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:09.607 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:09.607 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:09.607 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:09.607 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:09.607 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:09.607 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:09.607 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:09.607 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:09.865 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:09.865 "name": "raid_bdev1", 00:25:09.865 "uuid": "d1ea1ab6-03a6-442a-a93c-35f46e0c1f74", 00:25:09.865 "strip_size_kb": 0, 00:25:09.865 "state": "online", 00:25:09.865 "raid_level": "raid1", 00:25:09.865 "superblock": true, 00:25:09.865 "num_base_bdevs": 2, 00:25:09.865 "num_base_bdevs_discovered": 2, 00:25:09.865 
"num_base_bdevs_operational": 2, 00:25:09.865 "base_bdevs_list": [ 00:25:09.865 { 00:25:09.865 "name": "spare", 00:25:09.865 "uuid": "efcd2890-11c1-5c41-bd0a-76817361f347", 00:25:09.865 "is_configured": true, 00:25:09.865 "data_offset": 2048, 00:25:09.865 "data_size": 63488 00:25:09.865 }, 00:25:09.865 { 00:25:09.865 "name": "BaseBdev2", 00:25:09.865 "uuid": "dde473bd-cf14-5f60-95f6-6fbd2a1d3502", 00:25:09.865 "is_configured": true, 00:25:09.865 "data_offset": 2048, 00:25:09.865 "data_size": 63488 00:25:09.865 } 00:25:09.865 ] 00:25:09.865 }' 00:25:09.865 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:09.865 22:53:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:10.431 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:10.689 [2024-07-15 22:53:55.448536] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:10.689 [2024-07-15 22:53:55.448566] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:10.689 00:25:10.689 Latency(us) 00:25:10.689 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:10.689 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:25:10.689 raid_bdev1 : 11.60 89.85 269.55 0.00 0.00 14711.93 299.19 119446.48 00:25:10.689 =================================================================================================================== 00:25:10.689 Total : 89.85 269.55 0.00 0.00 14711.93 299.19 119446.48 00:25:10.689 [2024-07-15 22:53:55.468506] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:10.689 [2024-07-15 22:53:55.468533] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:10.689 [2024-07-15 22:53:55.468606] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:10.690 [2024-07-15 22:53:55.468618] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf95070 name raid_bdev1, state offline 00:25:10.690 0 00:25:10.690 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:10.690 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:25:10.947 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:10.947 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:10.947 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:25:10.947 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:25:10.947 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:10.947 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:25:10.947 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:10.947 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:10.947 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:10.947 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:25:10.947 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:10.947 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:10.947 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:25:11.205 /dev/nbd0 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:11.205 1+0 records in 00:25:11.205 1+0 records out 00:25:11.205 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000230479 s, 17.8 MB/s 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:11.205 22:53:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:25:11.463 /dev/nbd1 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd 
nbd1 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:11.463 1+0 records in 00:25:11.463 1+0 records out 00:25:11.463 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000296414 s, 13.8 MB/s 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:11.463 22:53:56 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:11.463 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:11.721 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:11.721 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:11.721 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:11.721 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:11.721 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:11.721 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:11.721 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:25:11.721 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:11.721 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks 
/var/tmp/spdk-raid.sock /dev/nbd0 00:25:11.721 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:11.721 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:11.721 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:11.721 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:25:11.721 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:11.721 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:12.286 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:12.286 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:12.286 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:12.286 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:12.286 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:12.286 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:12.286 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:25:12.286 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:12.286 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:25:12.286 22:53:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:12.286 22:53:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:12.544 [2024-07-15 22:53:57.303250] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:12.544 [2024-07-15 22:53:57.303294] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:12.544 [2024-07-15 22:53:57.303317] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf92090 00:25:12.544 [2024-07-15 22:53:57.303330] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:12.544 [2024-07-15 22:53:57.304959] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:12.544 [2024-07-15 22:53:57.304990] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:12.544 [2024-07-15 22:53:57.305068] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:12.544 [2024-07-15 22:53:57.305094] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:12.544 [2024-07-15 22:53:57.305192] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:12.544 spare 00:25:12.544 22:53:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:12.544 22:53:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:12.544 22:53:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:12.544 22:53:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:12.544 22:53:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:12.544 22:53:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:12.544 22:53:57 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:12.544 22:53:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:12.544 22:53:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:12.544 22:53:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:12.544 22:53:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:12.544 22:53:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:12.544 [2024-07-15 22:53:57.405519] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xde3f70 00:25:12.544 [2024-07-15 22:53:57.405536] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:12.544 [2024-07-15 22:53:57.405719] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf8df50 00:25:12.544 [2024-07-15 22:53:57.405861] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xde3f70 00:25:12.544 [2024-07-15 22:53:57.405872] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xde3f70 00:25:12.544 [2024-07-15 22:53:57.405987] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:12.802 22:53:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:12.802 "name": "raid_bdev1", 00:25:12.802 "uuid": "d1ea1ab6-03a6-442a-a93c-35f46e0c1f74", 00:25:12.802 "strip_size_kb": 0, 00:25:12.802 "state": "online", 00:25:12.802 "raid_level": "raid1", 00:25:12.802 "superblock": true, 00:25:12.802 "num_base_bdevs": 2, 00:25:12.802 "num_base_bdevs_discovered": 2, 00:25:12.802 "num_base_bdevs_operational": 2, 00:25:12.802 "base_bdevs_list": [ 00:25:12.802 { 
00:25:12.802 "name": "spare", 00:25:12.802 "uuid": "efcd2890-11c1-5c41-bd0a-76817361f347", 00:25:12.802 "is_configured": true, 00:25:12.802 "data_offset": 2048, 00:25:12.802 "data_size": 63488 00:25:12.802 }, 00:25:12.802 { 00:25:12.802 "name": "BaseBdev2", 00:25:12.802 "uuid": "dde473bd-cf14-5f60-95f6-6fbd2a1d3502", 00:25:12.802 "is_configured": true, 00:25:12.802 "data_offset": 2048, 00:25:12.802 "data_size": 63488 00:25:12.802 } 00:25:12.802 ] 00:25:12.802 }' 00:25:12.802 22:53:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:12.802 22:53:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:13.369 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:13.369 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:13.369 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:13.369 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:13.369 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:13.369 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.369 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:13.664 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:13.664 "name": "raid_bdev1", 00:25:13.664 "uuid": "d1ea1ab6-03a6-442a-a93c-35f46e0c1f74", 00:25:13.664 "strip_size_kb": 0, 00:25:13.664 "state": "online", 00:25:13.664 "raid_level": "raid1", 00:25:13.664 "superblock": true, 00:25:13.664 "num_base_bdevs": 2, 00:25:13.664 "num_base_bdevs_discovered": 2, 00:25:13.664 
"num_base_bdevs_operational": 2, 00:25:13.664 "base_bdevs_list": [ 00:25:13.664 { 00:25:13.664 "name": "spare", 00:25:13.664 "uuid": "efcd2890-11c1-5c41-bd0a-76817361f347", 00:25:13.664 "is_configured": true, 00:25:13.664 "data_offset": 2048, 00:25:13.664 "data_size": 63488 00:25:13.664 }, 00:25:13.664 { 00:25:13.664 "name": "BaseBdev2", 00:25:13.664 "uuid": "dde473bd-cf14-5f60-95f6-6fbd2a1d3502", 00:25:13.664 "is_configured": true, 00:25:13.664 "data_offset": 2048, 00:25:13.664 "data_size": 63488 00:25:13.664 } 00:25:13.664 ] 00:25:13.664 }' 00:25:13.664 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:13.664 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:13.664 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:13.664 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:13.664 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.664 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:13.953 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:25:13.953 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:13.953 [2024-07-15 22:53:58.851688] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:14.213 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:14.213 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:25:14.213 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:14.213 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:14.213 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:14.213 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:14.213 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:14.213 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:14.213 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:14.213 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:14.213 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:14.213 22:53:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:14.472 22:53:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:14.472 "name": "raid_bdev1", 00:25:14.472 "uuid": "d1ea1ab6-03a6-442a-a93c-35f46e0c1f74", 00:25:14.472 "strip_size_kb": 0, 00:25:14.472 "state": "online", 00:25:14.472 "raid_level": "raid1", 00:25:14.472 "superblock": true, 00:25:14.472 "num_base_bdevs": 2, 00:25:14.472 "num_base_bdevs_discovered": 1, 00:25:14.472 "num_base_bdevs_operational": 1, 00:25:14.472 "base_bdevs_list": [ 00:25:14.472 { 00:25:14.472 "name": null, 00:25:14.472 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:14.472 "is_configured": false, 00:25:14.472 "data_offset": 2048, 00:25:14.472 "data_size": 63488 00:25:14.472 }, 00:25:14.472 { 00:25:14.472 "name": "BaseBdev2", 
00:25:14.472 "uuid": "dde473bd-cf14-5f60-95f6-6fbd2a1d3502", 00:25:14.472 "is_configured": true, 00:25:14.472 "data_offset": 2048, 00:25:14.472 "data_size": 63488 00:25:14.472 } 00:25:14.472 ] 00:25:14.472 }' 00:25:14.472 22:53:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:14.472 22:53:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:15.039 22:53:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:15.039 [2024-07-15 22:53:59.946764] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:15.039 [2024-07-15 22:53:59.946916] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:15.039 [2024-07-15 22:53:59.946947] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:25:15.039 [2024-07-15 22:53:59.946973] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:15.298 [2024-07-15 22:53:59.952199] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf8df50 00:25:15.298 [2024-07-15 22:53:59.954647] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:15.298 22:53:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:25:16.233 22:54:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:16.233 22:54:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:16.233 22:54:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:16.233 22:54:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:16.233 22:54:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:16.233 22:54:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:16.233 22:54:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:16.492 22:54:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:16.492 "name": "raid_bdev1", 00:25:16.492 "uuid": "d1ea1ab6-03a6-442a-a93c-35f46e0c1f74", 00:25:16.492 "strip_size_kb": 0, 00:25:16.492 "state": "online", 00:25:16.492 "raid_level": "raid1", 00:25:16.492 "superblock": true, 00:25:16.492 "num_base_bdevs": 2, 00:25:16.492 "num_base_bdevs_discovered": 2, 00:25:16.492 "num_base_bdevs_operational": 2, 00:25:16.492 "process": { 00:25:16.492 "type": "rebuild", 00:25:16.492 "target": "spare", 00:25:16.492 "progress": { 00:25:16.493 "blocks": 22528, 
00:25:16.493 "percent": 35 00:25:16.493 } 00:25:16.493 }, 00:25:16.493 "base_bdevs_list": [ 00:25:16.493 { 00:25:16.493 "name": "spare", 00:25:16.493 "uuid": "efcd2890-11c1-5c41-bd0a-76817361f347", 00:25:16.493 "is_configured": true, 00:25:16.493 "data_offset": 2048, 00:25:16.493 "data_size": 63488 00:25:16.493 }, 00:25:16.493 { 00:25:16.493 "name": "BaseBdev2", 00:25:16.493 "uuid": "dde473bd-cf14-5f60-95f6-6fbd2a1d3502", 00:25:16.493 "is_configured": true, 00:25:16.493 "data_offset": 2048, 00:25:16.493 "data_size": 63488 00:25:16.493 } 00:25:16.493 ] 00:25:16.493 }' 00:25:16.493 22:54:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:16.493 22:54:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:16.493 22:54:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:16.493 22:54:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:16.493 22:54:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:16.751 [2024-07-15 22:54:01.493827] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:16.751 [2024-07-15 22:54:01.567383] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:16.751 [2024-07-15 22:54:01.567440] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:16.751 [2024-07-15 22:54:01.567456] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:16.751 [2024-07-15 22:54:01.567465] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:16.751 22:54:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 
online raid1 0 1 00:25:16.751 22:54:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:16.751 22:54:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:16.751 22:54:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:16.751 22:54:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:16.751 22:54:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:16.751 22:54:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:16.751 22:54:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:16.751 22:54:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:16.751 22:54:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:16.751 22:54:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:16.751 22:54:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:17.009 22:54:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:17.009 "name": "raid_bdev1", 00:25:17.009 "uuid": "d1ea1ab6-03a6-442a-a93c-35f46e0c1f74", 00:25:17.009 "strip_size_kb": 0, 00:25:17.009 "state": "online", 00:25:17.009 "raid_level": "raid1", 00:25:17.009 "superblock": true, 00:25:17.009 "num_base_bdevs": 2, 00:25:17.009 "num_base_bdevs_discovered": 1, 00:25:17.009 "num_base_bdevs_operational": 1, 00:25:17.009 "base_bdevs_list": [ 00:25:17.009 { 00:25:17.009 "name": null, 00:25:17.009 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:17.009 "is_configured": false, 00:25:17.009 
"data_offset": 2048, 00:25:17.009 "data_size": 63488 00:25:17.009 }, 00:25:17.009 { 00:25:17.009 "name": "BaseBdev2", 00:25:17.009 "uuid": "dde473bd-cf14-5f60-95f6-6fbd2a1d3502", 00:25:17.009 "is_configured": true, 00:25:17.009 "data_offset": 2048, 00:25:17.009 "data_size": 63488 00:25:17.009 } 00:25:17.009 ] 00:25:17.009 }' 00:25:17.009 22:54:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:17.009 22:54:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:17.575 22:54:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:18.141 [2024-07-15 22:54:02.952319] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:18.141 [2024-07-15 22:54:02.952372] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:18.141 [2024-07-15 22:54:02.952396] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdf7c50 00:25:18.141 [2024-07-15 22:54:02.952408] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:18.141 [2024-07-15 22:54:02.952798] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:18.141 [2024-07-15 22:54:02.952819] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:18.141 [2024-07-15 22:54:02.952910] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:18.141 [2024-07-15 22:54:02.952924] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:18.141 [2024-07-15 22:54:02.952949] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:25:18.141 [2024-07-15 22:54:02.952969] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:18.141 [2024-07-15 22:54:02.959005] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf973d0 00:25:18.141 spare 00:25:18.141 [2024-07-15 22:54:02.960551] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:18.141 22:54:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:25:19.515 22:54:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:19.515 22:54:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:19.515 22:54:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:19.515 22:54:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:19.515 22:54:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:19.516 22:54:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:19.516 22:54:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:19.516 22:54:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:19.516 "name": "raid_bdev1", 00:25:19.516 "uuid": "d1ea1ab6-03a6-442a-a93c-35f46e0c1f74", 00:25:19.516 "strip_size_kb": 0, 00:25:19.516 "state": "online", 00:25:19.516 "raid_level": "raid1", 00:25:19.516 "superblock": true, 00:25:19.516 "num_base_bdevs": 2, 00:25:19.516 "num_base_bdevs_discovered": 2, 00:25:19.516 "num_base_bdevs_operational": 2, 00:25:19.516 "process": { 00:25:19.516 "type": "rebuild", 00:25:19.516 "target": "spare", 00:25:19.516 "progress": { 00:25:19.516 
"blocks": 24576, 00:25:19.516 "percent": 38 00:25:19.516 } 00:25:19.516 }, 00:25:19.516 "base_bdevs_list": [ 00:25:19.516 { 00:25:19.516 "name": "spare", 00:25:19.516 "uuid": "efcd2890-11c1-5c41-bd0a-76817361f347", 00:25:19.516 "is_configured": true, 00:25:19.516 "data_offset": 2048, 00:25:19.516 "data_size": 63488 00:25:19.516 }, 00:25:19.516 { 00:25:19.516 "name": "BaseBdev2", 00:25:19.516 "uuid": "dde473bd-cf14-5f60-95f6-6fbd2a1d3502", 00:25:19.516 "is_configured": true, 00:25:19.516 "data_offset": 2048, 00:25:19.516 "data_size": 63488 00:25:19.516 } 00:25:19.516 ] 00:25:19.516 }' 00:25:19.516 22:54:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:19.516 22:54:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:19.516 22:54:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:19.516 22:54:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:19.516 22:54:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:19.774 [2024-07-15 22:54:04.568207] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:19.774 [2024-07-15 22:54:04.573567] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:19.774 [2024-07-15 22:54:04.573614] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:19.774 [2024-07-15 22:54:04.573629] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:19.774 [2024-07-15 22:54:04.573638] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:19.774 22:54:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:25:19.774 22:54:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:19.774 22:54:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:19.774 22:54:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:19.774 22:54:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:19.774 22:54:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:19.774 22:54:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:19.774 22:54:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:19.774 22:54:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:19.774 22:54:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:19.774 22:54:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:19.774 22:54:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:20.032 22:54:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:20.032 "name": "raid_bdev1", 00:25:20.032 "uuid": "d1ea1ab6-03a6-442a-a93c-35f46e0c1f74", 00:25:20.032 "strip_size_kb": 0, 00:25:20.032 "state": "online", 00:25:20.032 "raid_level": "raid1", 00:25:20.032 "superblock": true, 00:25:20.032 "num_base_bdevs": 2, 00:25:20.032 "num_base_bdevs_discovered": 1, 00:25:20.032 "num_base_bdevs_operational": 1, 00:25:20.032 "base_bdevs_list": [ 00:25:20.032 { 00:25:20.032 "name": null, 00:25:20.032 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:20.032 "is_configured": false, 00:25:20.032 
"data_offset": 2048, 00:25:20.032 "data_size": 63488 00:25:20.032 }, 00:25:20.032 { 00:25:20.032 "name": "BaseBdev2", 00:25:20.032 "uuid": "dde473bd-cf14-5f60-95f6-6fbd2a1d3502", 00:25:20.032 "is_configured": true, 00:25:20.032 "data_offset": 2048, 00:25:20.032 "data_size": 63488 00:25:20.032 } 00:25:20.032 ] 00:25:20.032 }' 00:25:20.032 22:54:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:20.032 22:54:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:20.598 22:54:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:20.598 22:54:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:20.598 22:54:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:20.598 22:54:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:20.598 22:54:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:20.598 22:54:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:20.598 22:54:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:20.857 22:54:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:20.857 "name": "raid_bdev1", 00:25:20.857 "uuid": "d1ea1ab6-03a6-442a-a93c-35f46e0c1f74", 00:25:20.857 "strip_size_kb": 0, 00:25:20.857 "state": "online", 00:25:20.857 "raid_level": "raid1", 00:25:20.857 "superblock": true, 00:25:20.857 "num_base_bdevs": 2, 00:25:20.857 "num_base_bdevs_discovered": 1, 00:25:20.857 "num_base_bdevs_operational": 1, 00:25:20.857 "base_bdevs_list": [ 00:25:20.857 { 00:25:20.857 "name": null, 00:25:20.857 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:25:20.857 "is_configured": false, 00:25:20.857 "data_offset": 2048, 00:25:20.857 "data_size": 63488 00:25:20.857 }, 00:25:20.857 { 00:25:20.857 "name": "BaseBdev2", 00:25:20.857 "uuid": "dde473bd-cf14-5f60-95f6-6fbd2a1d3502", 00:25:20.857 "is_configured": true, 00:25:20.857 "data_offset": 2048, 00:25:20.857 "data_size": 63488 00:25:20.857 } 00:25:20.857 ] 00:25:20.857 }' 00:25:20.857 22:54:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:21.115 22:54:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:21.115 22:54:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:21.115 22:54:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:21.115 22:54:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:21.372 22:54:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:21.628 [2024-07-15 22:54:06.535743] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:21.628 [2024-07-15 22:54:06.535788] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:21.628 [2024-07-15 22:54:06.535809] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdf67b0 00:25:21.628 [2024-07-15 22:54:06.535821] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:21.628 [2024-07-15 22:54:06.536167] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:21.628 [2024-07-15 22:54:06.536188] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:21.628 [2024-07-15 22:54:06.536253] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:21.628 [2024-07-15 22:54:06.536266] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:21.628 [2024-07-15 22:54:06.536277] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:21.886 BaseBdev1 00:25:21.886 22:54:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:25:22.824 22:54:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:22.824 22:54:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:22.824 22:54:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:22.824 22:54:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:22.824 22:54:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:22.824 22:54:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:22.824 22:54:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:22.824 22:54:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:22.824 22:54:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:22.824 22:54:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:22.824 22:54:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:22.824 22:54:07 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:23.084 22:54:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:23.084 "name": "raid_bdev1", 00:25:23.084 "uuid": "d1ea1ab6-03a6-442a-a93c-35f46e0c1f74", 00:25:23.084 "strip_size_kb": 0, 00:25:23.084 "state": "online", 00:25:23.084 "raid_level": "raid1", 00:25:23.084 "superblock": true, 00:25:23.084 "num_base_bdevs": 2, 00:25:23.084 "num_base_bdevs_discovered": 1, 00:25:23.084 "num_base_bdevs_operational": 1, 00:25:23.084 "base_bdevs_list": [ 00:25:23.084 { 00:25:23.084 "name": null, 00:25:23.084 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:23.084 "is_configured": false, 00:25:23.084 "data_offset": 2048, 00:25:23.084 "data_size": 63488 00:25:23.084 }, 00:25:23.084 { 00:25:23.084 "name": "BaseBdev2", 00:25:23.084 "uuid": "dde473bd-cf14-5f60-95f6-6fbd2a1d3502", 00:25:23.084 "is_configured": true, 00:25:23.084 "data_offset": 2048, 00:25:23.084 "data_size": 63488 00:25:23.084 } 00:25:23.084 ] 00:25:23.084 }' 00:25:23.084 22:54:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:23.084 22:54:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:23.651 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:23.651 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:23.651 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:23.651 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:23.651 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:23.651 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:23.651 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:23.910 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:23.910 "name": "raid_bdev1", 00:25:23.910 "uuid": "d1ea1ab6-03a6-442a-a93c-35f46e0c1f74", 00:25:23.910 "strip_size_kb": 0, 00:25:23.910 "state": "online", 00:25:23.910 "raid_level": "raid1", 00:25:23.910 "superblock": true, 00:25:23.910 "num_base_bdevs": 2, 00:25:23.910 "num_base_bdevs_discovered": 1, 00:25:23.910 "num_base_bdevs_operational": 1, 00:25:23.910 "base_bdevs_list": [ 00:25:23.910 { 00:25:23.910 "name": null, 00:25:23.910 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:23.910 "is_configured": false, 00:25:23.910 "data_offset": 2048, 00:25:23.910 "data_size": 63488 00:25:23.910 }, 00:25:23.910 { 00:25:23.910 "name": "BaseBdev2", 00:25:23.910 "uuid": "dde473bd-cf14-5f60-95f6-6fbd2a1d3502", 00:25:23.910 "is_configured": true, 00:25:23.910 "data_offset": 2048, 00:25:23.910 "data_size": 63488 00:25:23.910 } 00:25:23.910 ] 00:25:23.910 }' 00:25:23.910 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:23.910 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:23.910 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:23.910 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:23.910 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:23.910 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local 
es=0 00:25:23.910 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:23.910 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:23.910 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:23.910 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:23.910 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:23.910 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:23.910 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:23.910 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:23.910 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:23.910 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:24.167 [2024-07-15 22:54:08.914376] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:24.168 [2024-07-15 22:54:08.914515] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:24.168 
[2024-07-15 22:54:08.914531] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:24.168 request: 00:25:24.168 { 00:25:24.168 "base_bdev": "BaseBdev1", 00:25:24.168 "raid_bdev": "raid_bdev1", 00:25:24.168 "method": "bdev_raid_add_base_bdev", 00:25:24.168 "req_id": 1 00:25:24.168 } 00:25:24.168 Got JSON-RPC error response 00:25:24.168 response: 00:25:24.168 { 00:25:24.168 "code": -22, 00:25:24.168 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:24.168 } 00:25:24.168 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:25:24.168 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:24.168 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:24.168 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:24.168 22:54:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:25:25.099 22:54:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:25.099 22:54:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:25.099 22:54:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:25.099 22:54:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:25.099 22:54:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:25.099 22:54:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:25.099 22:54:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:25.099 22:54:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:25.099 22:54:09 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:25.099 22:54:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:25.099 22:54:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:25.099 22:54:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:25.357 22:54:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:25.357 "name": "raid_bdev1", 00:25:25.357 "uuid": "d1ea1ab6-03a6-442a-a93c-35f46e0c1f74", 00:25:25.357 "strip_size_kb": 0, 00:25:25.357 "state": "online", 00:25:25.357 "raid_level": "raid1", 00:25:25.357 "superblock": true, 00:25:25.357 "num_base_bdevs": 2, 00:25:25.357 "num_base_bdevs_discovered": 1, 00:25:25.357 "num_base_bdevs_operational": 1, 00:25:25.357 "base_bdevs_list": [ 00:25:25.357 { 00:25:25.357 "name": null, 00:25:25.357 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:25.357 "is_configured": false, 00:25:25.357 "data_offset": 2048, 00:25:25.357 "data_size": 63488 00:25:25.357 }, 00:25:25.357 { 00:25:25.357 "name": "BaseBdev2", 00:25:25.357 "uuid": "dde473bd-cf14-5f60-95f6-6fbd2a1d3502", 00:25:25.357 "is_configured": true, 00:25:25.357 "data_offset": 2048, 00:25:25.357 "data_size": 63488 00:25:25.357 } 00:25:25.357 ] 00:25:25.357 }' 00:25:25.357 22:54:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:25.357 22:54:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:25.925 22:54:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:25.925 22:54:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:25.925 22:54:10 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:25.925 22:54:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:25.925 22:54:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:25.925 22:54:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:25.925 22:54:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:26.184 22:54:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:26.184 "name": "raid_bdev1", 00:25:26.184 "uuid": "d1ea1ab6-03a6-442a-a93c-35f46e0c1f74", 00:25:26.184 "strip_size_kb": 0, 00:25:26.184 "state": "online", 00:25:26.184 "raid_level": "raid1", 00:25:26.184 "superblock": true, 00:25:26.184 "num_base_bdevs": 2, 00:25:26.184 "num_base_bdevs_discovered": 1, 00:25:26.184 "num_base_bdevs_operational": 1, 00:25:26.184 "base_bdevs_list": [ 00:25:26.184 { 00:25:26.184 "name": null, 00:25:26.184 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:26.184 "is_configured": false, 00:25:26.184 "data_offset": 2048, 00:25:26.184 "data_size": 63488 00:25:26.184 }, 00:25:26.184 { 00:25:26.184 "name": "BaseBdev2", 00:25:26.184 "uuid": "dde473bd-cf14-5f60-95f6-6fbd2a1d3502", 00:25:26.184 "is_configured": true, 00:25:26.184 "data_offset": 2048, 00:25:26.184 "data_size": 63488 00:25:26.184 } 00:25:26.184 ] 00:25:26.184 }' 00:25:26.184 22:54:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:26.442 22:54:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:26.442 22:54:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:26.442 22:54:11 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:26.442 22:54:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 2819211 00:25:26.442 22:54:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 2819211 ']' 00:25:26.442 22:54:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 2819211 00:25:26.442 22:54:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:25:26.442 22:54:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:26.442 22:54:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2819211 00:25:26.442 22:54:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:26.442 22:54:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:26.442 22:54:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2819211' 00:25:26.442 killing process with pid 2819211 00:25:26.442 22:54:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 2819211 00:25:26.442 Received shutdown signal, test time was about 27.298048 seconds 00:25:26.442 00:25:26.442 Latency(us) 00:25:26.442 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:26.442 =================================================================================================================== 00:25:26.442 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:26.442 [2024-07-15 22:54:11.203579] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:26.442 [2024-07-15 22:54:11.203680] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:26.442 [2024-07-15 22:54:11.203729] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev 
base bdevs is 0, going to free all in destruct 00:25:26.442 22:54:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 2819211 00:25:26.442 [2024-07-15 22:54:11.203740] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xde3f70 name raid_bdev1, state offline 00:25:26.442 [2024-07-15 22:54:11.227200] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:26.700 22:54:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:25:26.700 00:25:26.700 real 0m33.212s 00:25:26.700 user 0m52.416s 00:25:26.701 sys 0m4.778s 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:26.701 ************************************ 00:25:26.701 END TEST raid_rebuild_test_sb_io 00:25:26.701 ************************************ 00:25:26.701 22:54:11 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:26.701 22:54:11 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:25:26.701 22:54:11 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:25:26.701 22:54:11 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:26.701 22:54:11 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:26.701 22:54:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:26.701 ************************************ 00:25:26.701 START TEST raid_rebuild_test 00:25:26.701 ************************************ 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false false true 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:25:26.701 22:54:11 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 
00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=2824407 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 2824407 /var/tmp/spdk-raid.sock 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 2824407 ']' 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:26.701 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:26.701 22:54:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:26.959 [2024-07-15 22:54:11.629441] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:25:26.959 [2024-07-15 22:54:11.629509] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2824407 ] 00:25:26.959 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:26.959 Zero copy mechanism will not be used. 00:25:26.959 [2024-07-15 22:54:11.758418] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:26.959 [2024-07-15 22:54:11.863992] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:27.222 [2024-07-15 22:54:11.934227] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:27.222 [2024-07-15 22:54:11.934263] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:27.785 22:54:12 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:27.785 22:54:12 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:25:27.785 22:54:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:27.785 22:54:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:28.043 BaseBdev1_malloc 00:25:28.043 22:54:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:28.300 [2024-07-15 
22:54:13.036931] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:28.300 [2024-07-15 22:54:13.036990] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:28.300 [2024-07-15 22:54:13.037017] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xae2d40 00:25:28.300 [2024-07-15 22:54:13.037031] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:28.300 [2024-07-15 22:54:13.038863] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:28.300 [2024-07-15 22:54:13.038897] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:28.300 BaseBdev1 00:25:28.300 22:54:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:28.300 22:54:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:28.559 BaseBdev2_malloc 00:25:28.559 22:54:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:28.908 [2024-07-15 22:54:13.532377] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:28.908 [2024-07-15 22:54:13.532433] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:28.908 [2024-07-15 22:54:13.532461] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xae3860 00:25:28.908 [2024-07-15 22:54:13.532475] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:28.908 [2024-07-15 22:54:13.534104] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:28.909 [2024-07-15 22:54:13.534136] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:28.909 BaseBdev2 00:25:28.909 22:54:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:28.909 22:54:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:28.909 BaseBdev3_malloc 00:25:28.909 22:54:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:25:29.166 [2024-07-15 22:54:14.027086] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:25:29.166 [2024-07-15 22:54:14.027138] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:29.166 [2024-07-15 22:54:14.027159] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc908f0 00:25:29.166 [2024-07-15 22:54:14.027172] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:29.166 [2024-07-15 22:54:14.028729] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:29.166 [2024-07-15 22:54:14.028759] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:29.166 BaseBdev3 00:25:29.166 22:54:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:29.166 22:54:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:29.425 BaseBdev4_malloc 00:25:29.425 22:54:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc 
-p BaseBdev4 00:25:29.683 [2024-07-15 22:54:14.521022] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:25:29.683 [2024-07-15 22:54:14.521074] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:29.683 [2024-07-15 22:54:14.521095] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc8fad0 00:25:29.683 [2024-07-15 22:54:14.521108] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:29.683 [2024-07-15 22:54:14.522656] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:29.683 [2024-07-15 22:54:14.522689] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:25:29.683 BaseBdev4 00:25:29.683 22:54:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:29.942 spare_malloc 00:25:29.942 22:54:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:30.199 spare_delay 00:25:30.199 22:54:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:30.458 [2024-07-15 22:54:15.252841] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:30.458 [2024-07-15 22:54:15.252901] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:30.458 [2024-07-15 22:54:15.252923] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc945b0 00:25:30.458 [2024-07-15 22:54:15.252941] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:30.458 
[2024-07-15 22:54:15.254604] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:30.458 [2024-07-15 22:54:15.254637] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:30.458 spare 00:25:30.458 22:54:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:25:30.717 [2024-07-15 22:54:15.497530] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:30.717 [2024-07-15 22:54:15.498909] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:30.717 [2024-07-15 22:54:15.498978] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:30.717 [2024-07-15 22:54:15.499025] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:30.717 [2024-07-15 22:54:15.499115] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc138a0 00:25:30.717 [2024-07-15 22:54:15.499125] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:25:30.717 [2024-07-15 22:54:15.499361] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc8de10 00:25:30.717 [2024-07-15 22:54:15.499523] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc138a0 00:25:30.717 [2024-07-15 22:54:15.499533] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc138a0 00:25:30.717 [2024-07-15 22:54:15.499657] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:30.718 22:54:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:25:30.718 22:54:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:25:30.718 22:54:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:30.718 22:54:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:30.718 22:54:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:30.718 22:54:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:30.718 22:54:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:30.718 22:54:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:30.718 22:54:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:30.718 22:54:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:30.718 22:54:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:30.718 22:54:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:30.976 22:54:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:30.976 "name": "raid_bdev1", 00:25:30.976 "uuid": "5532a059-2a3f-4092-a0e0-bf068f0b0d1d", 00:25:30.976 "strip_size_kb": 0, 00:25:30.976 "state": "online", 00:25:30.976 "raid_level": "raid1", 00:25:30.977 "superblock": false, 00:25:30.977 "num_base_bdevs": 4, 00:25:30.977 "num_base_bdevs_discovered": 4, 00:25:30.977 "num_base_bdevs_operational": 4, 00:25:30.977 "base_bdevs_list": [ 00:25:30.977 { 00:25:30.977 "name": "BaseBdev1", 00:25:30.977 "uuid": "fb537a4b-0c4a-5daf-963d-27bd2c51d6e6", 00:25:30.977 "is_configured": true, 00:25:30.977 "data_offset": 0, 00:25:30.977 "data_size": 65536 00:25:30.977 }, 00:25:30.977 { 00:25:30.977 "name": "BaseBdev2", 00:25:30.977 "uuid": "6f5ada48-fb3d-5953-a9c6-2498bd1ce5ec", 00:25:30.977 "is_configured": 
true, 00:25:30.977 "data_offset": 0, 00:25:30.977 "data_size": 65536 00:25:30.977 }, 00:25:30.977 { 00:25:30.977 "name": "BaseBdev3", 00:25:30.977 "uuid": "8220e16b-ac93-529d-b9cb-f8d27df19cec", 00:25:30.977 "is_configured": true, 00:25:30.977 "data_offset": 0, 00:25:30.977 "data_size": 65536 00:25:30.977 }, 00:25:30.977 { 00:25:30.977 "name": "BaseBdev4", 00:25:30.977 "uuid": "8f7f8762-b279-5b2f-9d75-dec090d998dc", 00:25:30.977 "is_configured": true, 00:25:30.977 "data_offset": 0, 00:25:30.977 "data_size": 65536 00:25:30.977 } 00:25:30.977 ] 00:25:30.977 }' 00:25:30.977 22:54:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:30.977 22:54:15 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:31.544 22:54:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:31.544 22:54:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:31.804 [2024-07-15 22:54:16.592750] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:31.804 22:54:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:25:31.804 22:54:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:31.804 22:54:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:32.063 22:54:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:25:32.063 22:54:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:25:32.063 22:54:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:25:32.063 22:54:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 
00:25:32.063 22:54:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:25:32.063 22:54:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:32.063 22:54:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:25:32.063 22:54:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:32.063 22:54:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:32.063 22:54:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:32.063 22:54:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:25:32.063 22:54:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:32.063 22:54:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:32.063 22:54:16 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:25:32.321 [2024-07-15 22:54:17.089814] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc8de10 00:25:32.321 /dev/nbd0 00:25:32.321 22:54:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:32.322 22:54:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:32.322 22:54:17 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:32.322 22:54:17 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:25:32.322 22:54:17 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:32.322 22:54:17 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:32.322 22:54:17 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 
/proc/partitions 00:25:32.322 22:54:17 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:25:32.322 22:54:17 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:32.322 22:54:17 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:32.322 22:54:17 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:32.322 1+0 records in 00:25:32.322 1+0 records out 00:25:32.322 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000247389 s, 16.6 MB/s 00:25:32.322 22:54:17 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:32.322 22:54:17 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:25:32.322 22:54:17 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:32.322 22:54:17 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:32.322 22:54:17 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:25:32.322 22:54:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:32.322 22:54:17 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:32.322 22:54:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:25:32.322 22:54:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:25:32.322 22:54:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:25:40.439 65536+0 records in 00:25:40.439 65536+0 records out 00:25:40.439 33554432 bytes (34 MB, 32 MiB) copied, 7.66859 s, 4.4 MB/s 00:25:40.439 22:54:24 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:40.439 22:54:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:40.439 22:54:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:40.439 22:54:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:40.439 22:54:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:25:40.439 22:54:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:40.439 22:54:24 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:40.439 22:54:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:40.439 [2024-07-15 22:54:25.099366] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:40.439 22:54:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:40.439 22:54:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:40.439 22:54:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:40.439 22:54:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:40.439 22:54:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:40.439 22:54:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:25:40.439 22:54:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:25:40.439 22:54:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:40.439 [2024-07-15 22:54:25.332040] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:40.696 
22:54:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:40.696 22:54:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:40.696 22:54:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:40.696 22:54:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:40.696 22:54:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:40.696 22:54:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:40.696 22:54:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:40.696 22:54:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:40.696 22:54:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:40.696 22:54:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:40.696 22:54:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:40.696 22:54:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:40.953 22:54:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:40.953 "name": "raid_bdev1", 00:25:40.953 "uuid": "5532a059-2a3f-4092-a0e0-bf068f0b0d1d", 00:25:40.953 "strip_size_kb": 0, 00:25:40.953 "state": "online", 00:25:40.953 "raid_level": "raid1", 00:25:40.953 "superblock": false, 00:25:40.953 "num_base_bdevs": 4, 00:25:40.953 "num_base_bdevs_discovered": 3, 00:25:40.953 "num_base_bdevs_operational": 3, 00:25:40.953 "base_bdevs_list": [ 00:25:40.953 { 00:25:40.953 "name": null, 00:25:40.953 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:40.953 "is_configured": 
false, 00:25:40.953 "data_offset": 0, 00:25:40.953 "data_size": 65536 00:25:40.953 }, 00:25:40.953 { 00:25:40.953 "name": "BaseBdev2", 00:25:40.953 "uuid": "6f5ada48-fb3d-5953-a9c6-2498bd1ce5ec", 00:25:40.953 "is_configured": true, 00:25:40.953 "data_offset": 0, 00:25:40.953 "data_size": 65536 00:25:40.953 }, 00:25:40.953 { 00:25:40.953 "name": "BaseBdev3", 00:25:40.953 "uuid": "8220e16b-ac93-529d-b9cb-f8d27df19cec", 00:25:40.954 "is_configured": true, 00:25:40.954 "data_offset": 0, 00:25:40.954 "data_size": 65536 00:25:40.954 }, 00:25:40.954 { 00:25:40.954 "name": "BaseBdev4", 00:25:40.954 "uuid": "8f7f8762-b279-5b2f-9d75-dec090d998dc", 00:25:40.954 "is_configured": true, 00:25:40.954 "data_offset": 0, 00:25:40.954 "data_size": 65536 00:25:40.954 } 00:25:40.954 ] 00:25:40.954 }' 00:25:40.954 22:54:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:40.954 22:54:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:41.518 22:54:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:41.777 [2024-07-15 22:54:26.438988] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:41.777 [2024-07-15 22:54:26.443145] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc196b0 00:25:41.777 [2024-07-15 22:54:26.445511] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:41.777 22:54:26 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:42.712 22:54:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:42.712 22:54:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:42.712 22:54:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:25:42.712 22:54:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:42.713 22:54:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:42.713 22:54:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:42.713 22:54:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:42.972 22:54:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:42.972 "name": "raid_bdev1", 00:25:42.972 "uuid": "5532a059-2a3f-4092-a0e0-bf068f0b0d1d", 00:25:42.972 "strip_size_kb": 0, 00:25:42.972 "state": "online", 00:25:42.972 "raid_level": "raid1", 00:25:42.972 "superblock": false, 00:25:42.972 "num_base_bdevs": 4, 00:25:42.972 "num_base_bdevs_discovered": 4, 00:25:42.972 "num_base_bdevs_operational": 4, 00:25:42.972 "process": { 00:25:42.972 "type": "rebuild", 00:25:42.972 "target": "spare", 00:25:42.972 "progress": { 00:25:42.972 "blocks": 24576, 00:25:42.972 "percent": 37 00:25:42.972 } 00:25:42.972 }, 00:25:42.972 "base_bdevs_list": [ 00:25:42.972 { 00:25:42.972 "name": "spare", 00:25:42.972 "uuid": "4efcfc75-3d8f-5b1a-90e1-4e26f2edf541", 00:25:42.972 "is_configured": true, 00:25:42.972 "data_offset": 0, 00:25:42.972 "data_size": 65536 00:25:42.972 }, 00:25:42.972 { 00:25:42.972 "name": "BaseBdev2", 00:25:42.972 "uuid": "6f5ada48-fb3d-5953-a9c6-2498bd1ce5ec", 00:25:42.972 "is_configured": true, 00:25:42.972 "data_offset": 0, 00:25:42.972 "data_size": 65536 00:25:42.972 }, 00:25:42.972 { 00:25:42.972 "name": "BaseBdev3", 00:25:42.972 "uuid": "8220e16b-ac93-529d-b9cb-f8d27df19cec", 00:25:42.972 "is_configured": true, 00:25:42.972 "data_offset": 0, 00:25:42.972 "data_size": 65536 00:25:42.972 }, 00:25:42.972 { 00:25:42.972 "name": "BaseBdev4", 00:25:42.972 "uuid": 
"8f7f8762-b279-5b2f-9d75-dec090d998dc", 00:25:42.972 "is_configured": true, 00:25:42.972 "data_offset": 0, 00:25:42.972 "data_size": 65536 00:25:42.972 } 00:25:42.972 ] 00:25:42.972 }' 00:25:42.972 22:54:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:42.972 22:54:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:42.972 22:54:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:42.972 22:54:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:42.972 22:54:27 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:43.231 [2024-07-15 22:54:28.029652] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:43.231 [2024-07-15 22:54:28.058226] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:43.231 [2024-07-15 22:54:28.058274] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:43.231 [2024-07-15 22:54:28.058292] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:43.231 [2024-07-15 22:54:28.058301] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:43.231 22:54:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:43.231 22:54:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:43.231 22:54:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:43.231 22:54:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:43.231 22:54:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:25:43.231 22:54:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:43.231 22:54:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:43.231 22:54:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:43.231 22:54:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:43.231 22:54:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:43.231 22:54:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:43.231 22:54:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:43.490 22:54:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:43.491 "name": "raid_bdev1", 00:25:43.491 "uuid": "5532a059-2a3f-4092-a0e0-bf068f0b0d1d", 00:25:43.491 "strip_size_kb": 0, 00:25:43.491 "state": "online", 00:25:43.491 "raid_level": "raid1", 00:25:43.491 "superblock": false, 00:25:43.491 "num_base_bdevs": 4, 00:25:43.491 "num_base_bdevs_discovered": 3, 00:25:43.491 "num_base_bdevs_operational": 3, 00:25:43.491 "base_bdevs_list": [ 00:25:43.491 { 00:25:43.491 "name": null, 00:25:43.491 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:43.491 "is_configured": false, 00:25:43.491 "data_offset": 0, 00:25:43.491 "data_size": 65536 00:25:43.491 }, 00:25:43.491 { 00:25:43.491 "name": "BaseBdev2", 00:25:43.491 "uuid": "6f5ada48-fb3d-5953-a9c6-2498bd1ce5ec", 00:25:43.491 "is_configured": true, 00:25:43.491 "data_offset": 0, 00:25:43.491 "data_size": 65536 00:25:43.491 }, 00:25:43.491 { 00:25:43.491 "name": "BaseBdev3", 00:25:43.491 "uuid": "8220e16b-ac93-529d-b9cb-f8d27df19cec", 00:25:43.491 "is_configured": true, 00:25:43.491 "data_offset": 0, 00:25:43.491 "data_size": 65536 
00:25:43.491 }, 00:25:43.491 { 00:25:43.491 "name": "BaseBdev4", 00:25:43.491 "uuid": "8f7f8762-b279-5b2f-9d75-dec090d998dc", 00:25:43.491 "is_configured": true, 00:25:43.491 "data_offset": 0, 00:25:43.491 "data_size": 65536 00:25:43.491 } 00:25:43.491 ] 00:25:43.491 }' 00:25:43.491 22:54:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:43.491 22:54:28 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:44.059 22:54:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:44.059 22:54:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:44.059 22:54:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:44.059 22:54:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:44.059 22:54:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:44.059 22:54:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:44.059 22:54:28 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:44.318 22:54:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:44.318 "name": "raid_bdev1", 00:25:44.318 "uuid": "5532a059-2a3f-4092-a0e0-bf068f0b0d1d", 00:25:44.318 "strip_size_kb": 0, 00:25:44.318 "state": "online", 00:25:44.318 "raid_level": "raid1", 00:25:44.318 "superblock": false, 00:25:44.318 "num_base_bdevs": 4, 00:25:44.318 "num_base_bdevs_discovered": 3, 00:25:44.318 "num_base_bdevs_operational": 3, 00:25:44.318 "base_bdevs_list": [ 00:25:44.318 { 00:25:44.318 "name": null, 00:25:44.318 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:44.318 "is_configured": false, 00:25:44.318 "data_offset": 0, 00:25:44.318 
"data_size": 65536 00:25:44.318 }, 00:25:44.318 { 00:25:44.318 "name": "BaseBdev2", 00:25:44.318 "uuid": "6f5ada48-fb3d-5953-a9c6-2498bd1ce5ec", 00:25:44.318 "is_configured": true, 00:25:44.318 "data_offset": 0, 00:25:44.318 "data_size": 65536 00:25:44.318 }, 00:25:44.319 { 00:25:44.319 "name": "BaseBdev3", 00:25:44.319 "uuid": "8220e16b-ac93-529d-b9cb-f8d27df19cec", 00:25:44.319 "is_configured": true, 00:25:44.319 "data_offset": 0, 00:25:44.319 "data_size": 65536 00:25:44.319 }, 00:25:44.319 { 00:25:44.319 "name": "BaseBdev4", 00:25:44.319 "uuid": "8f7f8762-b279-5b2f-9d75-dec090d998dc", 00:25:44.319 "is_configured": true, 00:25:44.319 "data_offset": 0, 00:25:44.319 "data_size": 65536 00:25:44.319 } 00:25:44.319 ] 00:25:44.319 }' 00:25:44.319 22:54:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:44.319 22:54:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:44.319 22:54:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:44.319 22:54:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:44.319 22:54:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:44.578 [2024-07-15 22:54:29.326276] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:44.578 [2024-07-15 22:54:29.330391] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc196b0 00:25:44.578 [2024-07-15 22:54:29.331901] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:44.578 22:54:29 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:45.515 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:45.515 
22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:45.515 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:45.515 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:45.515 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:45.515 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:45.515 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:45.774 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:45.774 "name": "raid_bdev1", 00:25:45.774 "uuid": "5532a059-2a3f-4092-a0e0-bf068f0b0d1d", 00:25:45.774 "strip_size_kb": 0, 00:25:45.774 "state": "online", 00:25:45.774 "raid_level": "raid1", 00:25:45.774 "superblock": false, 00:25:45.774 "num_base_bdevs": 4, 00:25:45.774 "num_base_bdevs_discovered": 4, 00:25:45.774 "num_base_bdevs_operational": 4, 00:25:45.774 "process": { 00:25:45.774 "type": "rebuild", 00:25:45.774 "target": "spare", 00:25:45.774 "progress": { 00:25:45.774 "blocks": 22528, 00:25:45.775 "percent": 34 00:25:45.775 } 00:25:45.775 }, 00:25:45.775 "base_bdevs_list": [ 00:25:45.775 { 00:25:45.775 "name": "spare", 00:25:45.775 "uuid": "4efcfc75-3d8f-5b1a-90e1-4e26f2edf541", 00:25:45.775 "is_configured": true, 00:25:45.775 "data_offset": 0, 00:25:45.775 "data_size": 65536 00:25:45.775 }, 00:25:45.775 { 00:25:45.775 "name": "BaseBdev2", 00:25:45.775 "uuid": "6f5ada48-fb3d-5953-a9c6-2498bd1ce5ec", 00:25:45.775 "is_configured": true, 00:25:45.775 "data_offset": 0, 00:25:45.775 "data_size": 65536 00:25:45.775 }, 00:25:45.775 { 00:25:45.775 "name": "BaseBdev3", 00:25:45.775 "uuid": "8220e16b-ac93-529d-b9cb-f8d27df19cec", 00:25:45.775 
"is_configured": true, 00:25:45.775 "data_offset": 0, 00:25:45.775 "data_size": 65536 00:25:45.775 }, 00:25:45.775 { 00:25:45.775 "name": "BaseBdev4", 00:25:45.775 "uuid": "8f7f8762-b279-5b2f-9d75-dec090d998dc", 00:25:45.775 "is_configured": true, 00:25:45.775 "data_offset": 0, 00:25:45.775 "data_size": 65536 00:25:45.775 } 00:25:45.775 ] 00:25:45.775 }' 00:25:45.775 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:45.775 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:45.775 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:45.775 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:45.775 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:25:45.775 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:25:45.775 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:45.775 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:25:45.775 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:46.033 [2024-07-15 22:54:30.871609] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:46.292 [2024-07-15 22:54:30.944795] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xc196b0 00:25:46.292 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:25:46.292 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:25:46.292 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:25:46.292 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:46.292 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:46.292 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:46.292 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:46.292 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:46.292 22:54:30 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:46.551 22:54:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:46.551 "name": "raid_bdev1", 00:25:46.551 "uuid": "5532a059-2a3f-4092-a0e0-bf068f0b0d1d", 00:25:46.551 "strip_size_kb": 0, 00:25:46.551 "state": "online", 00:25:46.551 "raid_level": "raid1", 00:25:46.551 "superblock": false, 00:25:46.551 "num_base_bdevs": 4, 00:25:46.551 "num_base_bdevs_discovered": 3, 00:25:46.551 "num_base_bdevs_operational": 3, 00:25:46.551 "process": { 00:25:46.551 "type": "rebuild", 00:25:46.551 "target": "spare", 00:25:46.551 "progress": { 00:25:46.551 "blocks": 36864, 00:25:46.551 "percent": 56 00:25:46.551 } 00:25:46.551 }, 00:25:46.551 "base_bdevs_list": [ 00:25:46.551 { 00:25:46.552 "name": "spare", 00:25:46.552 "uuid": "4efcfc75-3d8f-5b1a-90e1-4e26f2edf541", 00:25:46.552 "is_configured": true, 00:25:46.552 "data_offset": 0, 00:25:46.552 "data_size": 65536 00:25:46.552 }, 00:25:46.552 { 00:25:46.552 "name": null, 00:25:46.552 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:46.552 "is_configured": false, 00:25:46.552 "data_offset": 0, 00:25:46.552 "data_size": 65536 00:25:46.552 }, 00:25:46.552 { 00:25:46.552 "name": "BaseBdev3", 00:25:46.552 "uuid": "8220e16b-ac93-529d-b9cb-f8d27df19cec", 00:25:46.552 
"is_configured": true, 00:25:46.552 "data_offset": 0, 00:25:46.552 "data_size": 65536 00:25:46.552 }, 00:25:46.552 { 00:25:46.552 "name": "BaseBdev4", 00:25:46.552 "uuid": "8f7f8762-b279-5b2f-9d75-dec090d998dc", 00:25:46.552 "is_configured": true, 00:25:46.552 "data_offset": 0, 00:25:46.552 "data_size": 65536 00:25:46.552 } 00:25:46.552 ] 00:25:46.552 }' 00:25:46.552 22:54:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:46.552 22:54:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:46.552 22:54:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:46.552 22:54:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:46.552 22:54:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=919 00:25:46.552 22:54:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:46.552 22:54:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:46.552 22:54:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:46.552 22:54:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:46.552 22:54:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:46.552 22:54:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:46.552 22:54:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:46.552 22:54:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:46.810 22:54:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:46.810 "name": 
"raid_bdev1", 00:25:46.810 "uuid": "5532a059-2a3f-4092-a0e0-bf068f0b0d1d", 00:25:46.810 "strip_size_kb": 0, 00:25:46.810 "state": "online", 00:25:46.810 "raid_level": "raid1", 00:25:46.811 "superblock": false, 00:25:46.811 "num_base_bdevs": 4, 00:25:46.811 "num_base_bdevs_discovered": 3, 00:25:46.811 "num_base_bdevs_operational": 3, 00:25:46.811 "process": { 00:25:46.811 "type": "rebuild", 00:25:46.811 "target": "spare", 00:25:46.811 "progress": { 00:25:46.811 "blocks": 43008, 00:25:46.811 "percent": 65 00:25:46.811 } 00:25:46.811 }, 00:25:46.811 "base_bdevs_list": [ 00:25:46.811 { 00:25:46.811 "name": "spare", 00:25:46.811 "uuid": "4efcfc75-3d8f-5b1a-90e1-4e26f2edf541", 00:25:46.811 "is_configured": true, 00:25:46.811 "data_offset": 0, 00:25:46.811 "data_size": 65536 00:25:46.811 }, 00:25:46.811 { 00:25:46.811 "name": null, 00:25:46.811 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:46.811 "is_configured": false, 00:25:46.811 "data_offset": 0, 00:25:46.811 "data_size": 65536 00:25:46.811 }, 00:25:46.811 { 00:25:46.811 "name": "BaseBdev3", 00:25:46.811 "uuid": "8220e16b-ac93-529d-b9cb-f8d27df19cec", 00:25:46.811 "is_configured": true, 00:25:46.811 "data_offset": 0, 00:25:46.811 "data_size": 65536 00:25:46.811 }, 00:25:46.811 { 00:25:46.811 "name": "BaseBdev4", 00:25:46.811 "uuid": "8f7f8762-b279-5b2f-9d75-dec090d998dc", 00:25:46.811 "is_configured": true, 00:25:46.811 "data_offset": 0, 00:25:46.811 "data_size": 65536 00:25:46.811 } 00:25:46.811 ] 00:25:46.811 }' 00:25:46.811 22:54:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:46.811 22:54:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:46.811 22:54:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:46.811 22:54:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:46.811 22:54:31 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@710 -- # sleep 1 00:25:47.748 [2024-07-15 22:54:32.557378] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:47.748 [2024-07-15 22:54:32.557448] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:47.748 [2024-07-15 22:54:32.557490] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:47.748 22:54:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:47.748 22:54:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:47.748 22:54:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:47.748 22:54:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:47.748 22:54:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:47.748 22:54:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:47.748 22:54:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:47.748 22:54:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:48.007 22:54:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:48.007 "name": "raid_bdev1", 00:25:48.007 "uuid": "5532a059-2a3f-4092-a0e0-bf068f0b0d1d", 00:25:48.007 "strip_size_kb": 0, 00:25:48.007 "state": "online", 00:25:48.007 "raid_level": "raid1", 00:25:48.007 "superblock": false, 00:25:48.007 "num_base_bdevs": 4, 00:25:48.007 "num_base_bdevs_discovered": 3, 00:25:48.007 "num_base_bdevs_operational": 3, 00:25:48.007 "base_bdevs_list": [ 00:25:48.007 { 00:25:48.007 "name": "spare", 00:25:48.007 "uuid": "4efcfc75-3d8f-5b1a-90e1-4e26f2edf541", 00:25:48.007 
"is_configured": true, 00:25:48.007 "data_offset": 0, 00:25:48.007 "data_size": 65536 00:25:48.007 }, 00:25:48.007 { 00:25:48.007 "name": null, 00:25:48.007 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:48.007 "is_configured": false, 00:25:48.007 "data_offset": 0, 00:25:48.007 "data_size": 65536 00:25:48.007 }, 00:25:48.007 { 00:25:48.007 "name": "BaseBdev3", 00:25:48.007 "uuid": "8220e16b-ac93-529d-b9cb-f8d27df19cec", 00:25:48.007 "is_configured": true, 00:25:48.007 "data_offset": 0, 00:25:48.007 "data_size": 65536 00:25:48.007 }, 00:25:48.007 { 00:25:48.007 "name": "BaseBdev4", 00:25:48.007 "uuid": "8f7f8762-b279-5b2f-9d75-dec090d998dc", 00:25:48.007 "is_configured": true, 00:25:48.007 "data_offset": 0, 00:25:48.007 "data_size": 65536 00:25:48.007 } 00:25:48.007 ] 00:25:48.007 }' 00:25:48.007 22:54:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:48.298 22:54:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:48.298 22:54:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:48.298 22:54:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:48.298 22:54:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:25:48.298 22:54:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:48.298 22:54:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:48.298 22:54:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:48.298 22:54:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:48.298 22:54:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:48.298 22:54:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:48.298 22:54:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:48.565 22:54:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:48.565 "name": "raid_bdev1", 00:25:48.565 "uuid": "5532a059-2a3f-4092-a0e0-bf068f0b0d1d", 00:25:48.565 "strip_size_kb": 0, 00:25:48.565 "state": "online", 00:25:48.565 "raid_level": "raid1", 00:25:48.565 "superblock": false, 00:25:48.565 "num_base_bdevs": 4, 00:25:48.565 "num_base_bdevs_discovered": 3, 00:25:48.565 "num_base_bdevs_operational": 3, 00:25:48.565 "base_bdevs_list": [ 00:25:48.565 { 00:25:48.565 "name": "spare", 00:25:48.565 "uuid": "4efcfc75-3d8f-5b1a-90e1-4e26f2edf541", 00:25:48.565 "is_configured": true, 00:25:48.565 "data_offset": 0, 00:25:48.565 "data_size": 65536 00:25:48.565 }, 00:25:48.565 { 00:25:48.565 "name": null, 00:25:48.565 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:48.565 "is_configured": false, 00:25:48.565 "data_offset": 0, 00:25:48.565 "data_size": 65536 00:25:48.565 }, 00:25:48.565 { 00:25:48.565 "name": "BaseBdev3", 00:25:48.565 "uuid": "8220e16b-ac93-529d-b9cb-f8d27df19cec", 00:25:48.565 "is_configured": true, 00:25:48.565 "data_offset": 0, 00:25:48.565 "data_size": 65536 00:25:48.565 }, 00:25:48.565 { 00:25:48.565 "name": "BaseBdev4", 00:25:48.565 "uuid": "8f7f8762-b279-5b2f-9d75-dec090d998dc", 00:25:48.565 "is_configured": true, 00:25:48.565 "data_offset": 0, 00:25:48.565 "data_size": 65536 00:25:48.565 } 00:25:48.565 ] 00:25:48.565 }' 00:25:48.565 22:54:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:48.565 22:54:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:48.565 22:54:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:48.565 22:54:33 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:48.565 22:54:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:48.565 22:54:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:48.565 22:54:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:48.565 22:54:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:48.565 22:54:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:48.565 22:54:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:48.565 22:54:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:48.565 22:54:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:48.565 22:54:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:48.565 22:54:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:48.565 22:54:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:48.565 22:54:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:48.823 22:54:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:48.823 "name": "raid_bdev1", 00:25:48.823 "uuid": "5532a059-2a3f-4092-a0e0-bf068f0b0d1d", 00:25:48.823 "strip_size_kb": 0, 00:25:48.823 "state": "online", 00:25:48.823 "raid_level": "raid1", 00:25:48.823 "superblock": false, 00:25:48.823 "num_base_bdevs": 4, 00:25:48.823 "num_base_bdevs_discovered": 3, 00:25:48.823 "num_base_bdevs_operational": 3, 00:25:48.823 "base_bdevs_list": [ 00:25:48.823 { 00:25:48.823 "name": 
"spare", 00:25:48.823 "uuid": "4efcfc75-3d8f-5b1a-90e1-4e26f2edf541", 00:25:48.823 "is_configured": true, 00:25:48.823 "data_offset": 0, 00:25:48.823 "data_size": 65536 00:25:48.823 }, 00:25:48.823 { 00:25:48.823 "name": null, 00:25:48.823 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:48.823 "is_configured": false, 00:25:48.823 "data_offset": 0, 00:25:48.823 "data_size": 65536 00:25:48.823 }, 00:25:48.823 { 00:25:48.823 "name": "BaseBdev3", 00:25:48.823 "uuid": "8220e16b-ac93-529d-b9cb-f8d27df19cec", 00:25:48.823 "is_configured": true, 00:25:48.823 "data_offset": 0, 00:25:48.823 "data_size": 65536 00:25:48.823 }, 00:25:48.823 { 00:25:48.823 "name": "BaseBdev4", 00:25:48.823 "uuid": "8f7f8762-b279-5b2f-9d75-dec090d998dc", 00:25:48.823 "is_configured": true, 00:25:48.823 "data_offset": 0, 00:25:48.823 "data_size": 65536 00:25:48.823 } 00:25:48.823 ] 00:25:48.823 }' 00:25:48.823 22:54:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:48.823 22:54:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:49.388 22:54:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:49.389 [2024-07-15 22:54:34.282188] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:49.389 [2024-07-15 22:54:34.282220] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:49.389 [2024-07-15 22:54:34.282286] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:49.389 [2024-07-15 22:54:34.282355] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:49.389 [2024-07-15 22:54:34.282368] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc138a0 name raid_bdev1, state offline 00:25:49.646 22:54:34 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:49.646 22:54:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:25:49.646 22:54:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:49.646 22:54:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:49.646 22:54:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:25:49.646 22:54:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:25:49.646 22:54:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:49.646 22:54:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:25:49.646 22:54:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:49.646 22:54:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:49.646 22:54:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:49.646 22:54:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:25:49.646 22:54:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:49.646 22:54:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:49.646 22:54:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:25:49.904 /dev/nbd0 00:25:49.904 22:54:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:49.904 22:54:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:49.904 22:54:34 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:49.904 22:54:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:25:49.904 22:54:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:49.904 22:54:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:49.904 22:54:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:49.904 22:54:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:25:49.904 22:54:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:49.904 22:54:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:49.904 22:54:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:49.904 1+0 records in 00:25:49.904 1+0 records out 00:25:49.904 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000699878 s, 5.9 MB/s 00:25:49.904 22:54:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:49.904 22:54:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:25:49.904 22:54:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:49.904 22:54:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:49.904 22:54:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:25:49.904 22:54:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:49.904 22:54:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:49.904 22:54:34 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:25:50.162 /dev/nbd1 00:25:50.162 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:50.162 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:50.162 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:50.162 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:25:50.162 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:50.162 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:50.162 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:50.162 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:25:50.162 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:50.162 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:50.162 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:50.162 1+0 records in 00:25:50.162 1+0 records out 00:25:50.162 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000315867 s, 13.0 MB/s 00:25:50.162 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:50.162 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:25:50.162 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:50.163 22:54:35 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:50.163 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:25:50.163 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:50.163 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:50.163 22:54:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:25:50.420 22:54:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:25:50.420 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:50.420 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:50.420 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:50.420 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:25:50.420 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:50.420 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:50.678 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:50.678 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:50.678 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:50.678 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:50.678 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:50.678 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:50.678 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 
00:25:50.678 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:25:50.678 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:50.678 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:50.937 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:50.937 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:50.937 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:50.937 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:50.937 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:50.937 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:50.937 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:25:50.937 22:54:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:25:50.937 22:54:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:25:50.937 22:54:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 2824407 00:25:50.937 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 2824407 ']' 00:25:50.937 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 2824407 00:25:50.937 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:25:50.937 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:50.937 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2824407 00:25:50.937 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:25:50.937 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:50.937 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2824407' 00:25:50.937 killing process with pid 2824407 00:25:50.937 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 2824407 00:25:50.937 Received shutdown signal, test time was about 60.000000 seconds 00:25:50.937 00:25:50.937 Latency(us) 00:25:50.937 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:50.937 =================================================================================================================== 00:25:50.937 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:50.937 [2024-07-15 22:54:35.695758] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:50.937 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 2824407 00:25:50.937 [2024-07-15 22:54:35.744439] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:51.196 22:54:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:25:51.196 00:25:51.196 real 0m24.411s 00:25:51.196 user 0m32.326s 00:25:51.196 sys 0m5.512s 00:25:51.196 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:51.196 22:54:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:51.196 ************************************ 00:25:51.196 END TEST raid_rebuild_test 00:25:51.196 ************************************ 00:25:51.196 22:54:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:51.196 22:54:36 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:25:51.196 22:54:36 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:51.196 22:54:36 bdev_raid -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:25:51.196 22:54:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:51.196 ************************************ 00:25:51.196 START TEST raid_rebuild_test_sb 00:25:51.196 ************************************ 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true false true 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=2827671 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 2827671 /var/tmp/spdk-raid.sock 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L 
bdev_raid 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2827671 ']' 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:51.196 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:51.196 22:54:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:51.455 [2024-07-15 22:54:36.126912] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:25:51.455 [2024-07-15 22:54:36.126985] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2827671 ] 00:25:51.455 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:51.455 Zero copy mechanism will not be used. 
00:25:51.455 [2024-07-15 22:54:36.242800] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:51.455 [2024-07-15 22:54:36.340577] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:51.714 [2024-07-15 22:54:36.409544] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:51.714 [2024-07-15 22:54:36.409601] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:52.281 22:54:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:52.281 22:54:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:25:52.281 22:54:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:52.281 22:54:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:52.540 BaseBdev1_malloc 00:25:52.541 22:54:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:52.800 [2024-07-15 22:54:37.503907] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:52.800 [2024-07-15 22:54:37.503971] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:52.800 [2024-07-15 22:54:37.503996] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fe3d40 00:25:52.800 [2024-07-15 22:54:37.504009] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:52.800 [2024-07-15 22:54:37.505659] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:52.800 [2024-07-15 22:54:37.505691] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:52.800 BaseBdev1 
00:25:52.800 22:54:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:52.800 22:54:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:53.059 BaseBdev2_malloc 00:25:53.059 22:54:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:53.319 [2024-07-15 22:54:37.970115] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:53.319 [2024-07-15 22:54:37.970163] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:53.319 [2024-07-15 22:54:37.970185] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fe4860 00:25:53.319 [2024-07-15 22:54:37.970198] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:53.319 [2024-07-15 22:54:37.971657] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:53.319 [2024-07-15 22:54:37.971685] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:53.319 BaseBdev2 00:25:53.319 22:54:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:53.319 22:54:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:53.578 BaseBdev3_malloc 00:25:53.578 22:54:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:25:53.837 [2024-07-15 22:54:38.500205] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:25:53.837 [2024-07-15 22:54:38.500253] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:53.837 [2024-07-15 22:54:38.500273] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21918f0 00:25:53.837 [2024-07-15 22:54:38.500285] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:53.837 [2024-07-15 22:54:38.501737] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:53.837 [2024-07-15 22:54:38.501767] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:53.837 BaseBdev3 00:25:53.837 22:54:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:53.837 22:54:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:54.096 BaseBdev4_malloc 00:25:54.096 22:54:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:25:54.355 [2024-07-15 22:54:39.006066] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:25:54.355 [2024-07-15 22:54:39.006112] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:54.355 [2024-07-15 22:54:39.006131] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2190ad0 00:25:54.355 [2024-07-15 22:54:39.006144] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:54.355 [2024-07-15 22:54:39.007520] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:54.355 [2024-07-15 22:54:39.007549] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:25:54.355 BaseBdev4 00:25:54.355 22:54:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:54.355 spare_malloc 00:25:54.355 22:54:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:54.614 spare_delay 00:25:54.614 22:54:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:54.873 [2024-07-15 22:54:39.624278] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:54.873 [2024-07-15 22:54:39.624325] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:54.873 [2024-07-15 22:54:39.624344] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21955b0 00:25:54.873 [2024-07-15 22:54:39.624357] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:54.873 [2024-07-15 22:54:39.625887] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:54.873 [2024-07-15 22:54:39.625916] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:54.873 spare 00:25:54.873 22:54:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:25:55.131 [2024-07-15 22:54:39.885002] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:55.131 [2024-07-15 22:54:39.886237] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:55.131 [2024-07-15 22:54:39.886293] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:55.132 [2024-07-15 22:54:39.886339] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:55.132 [2024-07-15 22:54:39.886538] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21148a0 00:25:55.132 [2024-07-15 22:54:39.886550] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:55.132 [2024-07-15 22:54:39.886750] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x218ee10 00:25:55.132 [2024-07-15 22:54:39.886900] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21148a0 00:25:55.132 [2024-07-15 22:54:39.886910] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21148a0 00:25:55.132 [2024-07-15 22:54:39.887013] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:55.132 22:54:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:25:55.132 22:54:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:55.132 22:54:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:55.132 22:54:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:55.132 22:54:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:55.132 22:54:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:55.132 22:54:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:55.132 22:54:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:55.132 
22:54:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:55.132 22:54:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:55.132 22:54:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:55.132 22:54:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:55.391 22:54:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:55.391 "name": "raid_bdev1", 00:25:55.391 "uuid": "85e3f3ac-2c97-4c19-95f7-60d665a43718", 00:25:55.391 "strip_size_kb": 0, 00:25:55.391 "state": "online", 00:25:55.391 "raid_level": "raid1", 00:25:55.391 "superblock": true, 00:25:55.391 "num_base_bdevs": 4, 00:25:55.391 "num_base_bdevs_discovered": 4, 00:25:55.391 "num_base_bdevs_operational": 4, 00:25:55.391 "base_bdevs_list": [ 00:25:55.391 { 00:25:55.391 "name": "BaseBdev1", 00:25:55.391 "uuid": "47dae797-13c6-5fe2-bfba-255581cb134c", 00:25:55.391 "is_configured": true, 00:25:55.391 "data_offset": 2048, 00:25:55.391 "data_size": 63488 00:25:55.391 }, 00:25:55.391 { 00:25:55.391 "name": "BaseBdev2", 00:25:55.391 "uuid": "4ffa8805-2359-5cf5-8a8d-9ca6d801b467", 00:25:55.391 "is_configured": true, 00:25:55.391 "data_offset": 2048, 00:25:55.391 "data_size": 63488 00:25:55.391 }, 00:25:55.391 { 00:25:55.391 "name": "BaseBdev3", 00:25:55.391 "uuid": "661561ed-8fda-5c81-8f79-484183e259c4", 00:25:55.391 "is_configured": true, 00:25:55.391 "data_offset": 2048, 00:25:55.391 "data_size": 63488 00:25:55.391 }, 00:25:55.391 { 00:25:55.391 "name": "BaseBdev4", 00:25:55.391 "uuid": "024f2f03-1b65-5a83-b4bc-c3f1e1a5a1d8", 00:25:55.391 "is_configured": true, 00:25:55.391 "data_offset": 2048, 00:25:55.391 "data_size": 63488 00:25:55.391 } 00:25:55.391 ] 00:25:55.391 }' 00:25:55.391 22:54:40 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:55.391 22:54:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:55.959 22:54:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:55.959 22:54:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:56.218 [2024-07-15 22:54:40.972182] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:56.218 22:54:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:25:56.218 22:54:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:56.218 22:54:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:56.477 22:54:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:25:56.477 22:54:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:25:56.477 22:54:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:25:56.477 22:54:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:25:56.477 22:54:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:25:56.477 22:54:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:56.477 22:54:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:25:56.477 22:54:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:56.477 22:54:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # 
nbd_list=('/dev/nbd0') 00:25:56.477 22:54:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:56.477 22:54:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:25:56.477 22:54:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:56.477 22:54:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:56.477 22:54:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:25:56.736 [2024-07-15 22:54:41.469239] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x218ee10 00:25:56.736 /dev/nbd0 00:25:56.736 22:54:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:56.736 22:54:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:56.736 22:54:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:56.736 22:54:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:25:56.736 22:54:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:56.736 22:54:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:56.736 22:54:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:56.736 22:54:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:25:56.736 22:54:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:56.736 22:54:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:56.736 22:54:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:25:56.736 1+0 records in 00:25:56.736 1+0 records out 00:25:56.736 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000263104 s, 15.6 MB/s 00:25:56.736 22:54:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:56.736 22:54:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:25:56.736 22:54:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:56.736 22:54:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:56.736 22:54:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:25:56.736 22:54:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:56.736 22:54:41 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:56.736 22:54:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:25:56.736 22:54:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:25:56.736 22:54:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:26:04.851 63488+0 records in 00:26:04.851 63488+0 records out 00:26:04.851 32505856 bytes (33 MB, 31 MiB) copied, 8.00714 s, 4.1 MB/s 00:26:04.851 22:54:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:04.851 22:54:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:04.851 22:54:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:04.851 22:54:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:04.851 22:54:49 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@51 -- # local i 00:26:04.851 22:54:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:04.851 22:54:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:05.111 22:54:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:05.111 [2024-07-15 22:54:49.825046] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:05.111 22:54:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:05.111 22:54:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:05.111 22:54:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:05.111 22:54:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:05.111 22:54:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:05.111 22:54:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:26:05.111 22:54:49 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:26:05.111 22:54:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:05.111 [2024-07-15 22:54:49.993546] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:05.111 22:54:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:05.111 22:54:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:05.111 22:54:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:05.111 22:54:50 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:05.370 22:54:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:05.370 22:54:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:05.370 22:54:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:05.370 22:54:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:05.370 22:54:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:05.370 22:54:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:05.370 22:54:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:05.370 22:54:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:05.370 22:54:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:05.370 "name": "raid_bdev1", 00:26:05.370 "uuid": "85e3f3ac-2c97-4c19-95f7-60d665a43718", 00:26:05.370 "strip_size_kb": 0, 00:26:05.370 "state": "online", 00:26:05.370 "raid_level": "raid1", 00:26:05.370 "superblock": true, 00:26:05.370 "num_base_bdevs": 4, 00:26:05.370 "num_base_bdevs_discovered": 3, 00:26:05.370 "num_base_bdevs_operational": 3, 00:26:05.370 "base_bdevs_list": [ 00:26:05.370 { 00:26:05.370 "name": null, 00:26:05.370 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:05.370 "is_configured": false, 00:26:05.370 "data_offset": 2048, 00:26:05.370 "data_size": 63488 00:26:05.370 }, 00:26:05.370 { 00:26:05.370 "name": "BaseBdev2", 00:26:05.370 "uuid": "4ffa8805-2359-5cf5-8a8d-9ca6d801b467", 00:26:05.370 "is_configured": true, 00:26:05.370 "data_offset": 2048, 00:26:05.370 "data_size": 63488 00:26:05.370 }, 00:26:05.370 { 00:26:05.370 "name": "BaseBdev3", 
00:26:05.370 "uuid": "661561ed-8fda-5c81-8f79-484183e259c4", 00:26:05.370 "is_configured": true, 00:26:05.370 "data_offset": 2048, 00:26:05.370 "data_size": 63488 00:26:05.370 }, 00:26:05.370 { 00:26:05.370 "name": "BaseBdev4", 00:26:05.370 "uuid": "024f2f03-1b65-5a83-b4bc-c3f1e1a5a1d8", 00:26:05.370 "is_configured": true, 00:26:05.370 "data_offset": 2048, 00:26:05.370 "data_size": 63488 00:26:05.370 } 00:26:05.370 ] 00:26:05.370 }' 00:26:05.370 22:54:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:05.370 22:54:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:05.939 22:54:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:06.198 [2024-07-15 22:54:51.064418] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:06.198 [2024-07-15 22:54:51.068577] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x218ee10 00:26:06.198 [2024-07-15 22:54:51.070960] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:06.198 22:54:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:26:07.577 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:07.577 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:07.577 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:07.577 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:07.577 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:07.577 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:07.577 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:07.577 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:07.577 "name": "raid_bdev1", 00:26:07.577 "uuid": "85e3f3ac-2c97-4c19-95f7-60d665a43718", 00:26:07.577 "strip_size_kb": 0, 00:26:07.577 "state": "online", 00:26:07.577 "raid_level": "raid1", 00:26:07.577 "superblock": true, 00:26:07.577 "num_base_bdevs": 4, 00:26:07.577 "num_base_bdevs_discovered": 4, 00:26:07.577 "num_base_bdevs_operational": 4, 00:26:07.577 "process": { 00:26:07.577 "type": "rebuild", 00:26:07.577 "target": "spare", 00:26:07.577 "progress": { 00:26:07.577 "blocks": 22528, 00:26:07.577 "percent": 35 00:26:07.577 } 00:26:07.577 }, 00:26:07.577 "base_bdevs_list": [ 00:26:07.577 { 00:26:07.577 "name": "spare", 00:26:07.577 "uuid": "70c99416-7945-5c14-9743-97f8ea6c8373", 00:26:07.577 "is_configured": true, 00:26:07.577 "data_offset": 2048, 00:26:07.577 "data_size": 63488 00:26:07.577 }, 00:26:07.577 { 00:26:07.577 "name": "BaseBdev2", 00:26:07.577 "uuid": "4ffa8805-2359-5cf5-8a8d-9ca6d801b467", 00:26:07.577 "is_configured": true, 00:26:07.577 "data_offset": 2048, 00:26:07.577 "data_size": 63488 00:26:07.577 }, 00:26:07.577 { 00:26:07.577 "name": "BaseBdev3", 00:26:07.577 "uuid": "661561ed-8fda-5c81-8f79-484183e259c4", 00:26:07.577 "is_configured": true, 00:26:07.577 "data_offset": 2048, 00:26:07.577 "data_size": 63488 00:26:07.577 }, 00:26:07.577 { 00:26:07.577 "name": "BaseBdev4", 00:26:07.577 "uuid": "024f2f03-1b65-5a83-b4bc-c3f1e1a5a1d8", 00:26:07.577 "is_configured": true, 00:26:07.577 "data_offset": 2048, 00:26:07.577 "data_size": 63488 00:26:07.577 } 00:26:07.577 ] 00:26:07.577 }' 00:26:07.577 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:26:07.577 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:07.577 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:07.577 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:07.577 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:07.837 [2024-07-15 22:54:52.601638] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:07.837 [2024-07-15 22:54:52.683818] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:07.837 [2024-07-15 22:54:52.683865] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:07.837 [2024-07-15 22:54:52.683883] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:07.837 [2024-07-15 22:54:52.683891] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:07.837 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:07.837 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:07.837 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:07.837 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:07.837 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:07.837 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:07.837 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:07.837 22:54:52 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:07.837 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:07.837 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:07.837 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:07.837 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:08.165 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:08.165 "name": "raid_bdev1", 00:26:08.165 "uuid": "85e3f3ac-2c97-4c19-95f7-60d665a43718", 00:26:08.165 "strip_size_kb": 0, 00:26:08.165 "state": "online", 00:26:08.165 "raid_level": "raid1", 00:26:08.165 "superblock": true, 00:26:08.165 "num_base_bdevs": 4, 00:26:08.165 "num_base_bdevs_discovered": 3, 00:26:08.165 "num_base_bdevs_operational": 3, 00:26:08.165 "base_bdevs_list": [ 00:26:08.165 { 00:26:08.165 "name": null, 00:26:08.165 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:08.165 "is_configured": false, 00:26:08.165 "data_offset": 2048, 00:26:08.165 "data_size": 63488 00:26:08.165 }, 00:26:08.165 { 00:26:08.165 "name": "BaseBdev2", 00:26:08.165 "uuid": "4ffa8805-2359-5cf5-8a8d-9ca6d801b467", 00:26:08.165 "is_configured": true, 00:26:08.165 "data_offset": 2048, 00:26:08.165 "data_size": 63488 00:26:08.165 }, 00:26:08.165 { 00:26:08.165 "name": "BaseBdev3", 00:26:08.165 "uuid": "661561ed-8fda-5c81-8f79-484183e259c4", 00:26:08.165 "is_configured": true, 00:26:08.165 "data_offset": 2048, 00:26:08.165 "data_size": 63488 00:26:08.165 }, 00:26:08.165 { 00:26:08.165 "name": "BaseBdev4", 00:26:08.165 "uuid": "024f2f03-1b65-5a83-b4bc-c3f1e1a5a1d8", 00:26:08.165 "is_configured": true, 00:26:08.165 "data_offset": 2048, 00:26:08.165 "data_size": 63488 
00:26:08.165 } 00:26:08.165 ] 00:26:08.165 }' 00:26:08.165 22:54:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:08.165 22:54:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:08.734 22:54:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:08.734 22:54:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:08.734 22:54:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:08.734 22:54:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:08.734 22:54:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:08.734 22:54:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:08.734 22:54:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:08.993 22:54:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:08.993 "name": "raid_bdev1", 00:26:08.993 "uuid": "85e3f3ac-2c97-4c19-95f7-60d665a43718", 00:26:08.993 "strip_size_kb": 0, 00:26:08.993 "state": "online", 00:26:08.993 "raid_level": "raid1", 00:26:08.993 "superblock": true, 00:26:08.993 "num_base_bdevs": 4, 00:26:08.993 "num_base_bdevs_discovered": 3, 00:26:08.993 "num_base_bdevs_operational": 3, 00:26:08.993 "base_bdevs_list": [ 00:26:08.993 { 00:26:08.993 "name": null, 00:26:08.993 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:08.993 "is_configured": false, 00:26:08.993 "data_offset": 2048, 00:26:08.993 "data_size": 63488 00:26:08.993 }, 00:26:08.993 { 00:26:08.993 "name": "BaseBdev2", 00:26:08.993 "uuid": "4ffa8805-2359-5cf5-8a8d-9ca6d801b467", 00:26:08.993 "is_configured": true, 00:26:08.993 
"data_offset": 2048, 00:26:08.993 "data_size": 63488 00:26:08.993 }, 00:26:08.993 { 00:26:08.993 "name": "BaseBdev3", 00:26:08.993 "uuid": "661561ed-8fda-5c81-8f79-484183e259c4", 00:26:08.993 "is_configured": true, 00:26:08.994 "data_offset": 2048, 00:26:08.994 "data_size": 63488 00:26:08.994 }, 00:26:08.994 { 00:26:08.994 "name": "BaseBdev4", 00:26:08.994 "uuid": "024f2f03-1b65-5a83-b4bc-c3f1e1a5a1d8", 00:26:08.994 "is_configured": true, 00:26:08.994 "data_offset": 2048, 00:26:08.994 "data_size": 63488 00:26:08.994 } 00:26:08.994 ] 00:26:08.994 }' 00:26:08.994 22:54:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:08.994 22:54:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:08.994 22:54:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:09.252 22:54:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:09.252 22:54:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:09.252 [2024-07-15 22:54:54.152097] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:09.252 [2024-07-15 22:54:54.156788] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ceafa0 00:26:09.252 [2024-07-15 22:54:54.158347] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:09.509 22:54:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:10.445 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:10.445 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:10.445 22:54:55 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:10.445 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:10.445 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:10.445 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:10.445 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:10.704 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:10.704 "name": "raid_bdev1", 00:26:10.704 "uuid": "85e3f3ac-2c97-4c19-95f7-60d665a43718", 00:26:10.704 "strip_size_kb": 0, 00:26:10.704 "state": "online", 00:26:10.704 "raid_level": "raid1", 00:26:10.704 "superblock": true, 00:26:10.704 "num_base_bdevs": 4, 00:26:10.704 "num_base_bdevs_discovered": 4, 00:26:10.704 "num_base_bdevs_operational": 4, 00:26:10.704 "process": { 00:26:10.704 "type": "rebuild", 00:26:10.704 "target": "spare", 00:26:10.704 "progress": { 00:26:10.704 "blocks": 24576, 00:26:10.704 "percent": 38 00:26:10.704 } 00:26:10.704 }, 00:26:10.704 "base_bdevs_list": [ 00:26:10.704 { 00:26:10.704 "name": "spare", 00:26:10.704 "uuid": "70c99416-7945-5c14-9743-97f8ea6c8373", 00:26:10.704 "is_configured": true, 00:26:10.704 "data_offset": 2048, 00:26:10.704 "data_size": 63488 00:26:10.704 }, 00:26:10.704 { 00:26:10.704 "name": "BaseBdev2", 00:26:10.704 "uuid": "4ffa8805-2359-5cf5-8a8d-9ca6d801b467", 00:26:10.704 "is_configured": true, 00:26:10.704 "data_offset": 2048, 00:26:10.704 "data_size": 63488 00:26:10.704 }, 00:26:10.704 { 00:26:10.704 "name": "BaseBdev3", 00:26:10.704 "uuid": "661561ed-8fda-5c81-8f79-484183e259c4", 00:26:10.704 "is_configured": true, 00:26:10.704 "data_offset": 2048, 00:26:10.704 "data_size": 63488 00:26:10.704 }, 00:26:10.704 { 00:26:10.704 "name": 
"BaseBdev4", 00:26:10.704 "uuid": "024f2f03-1b65-5a83-b4bc-c3f1e1a5a1d8", 00:26:10.704 "is_configured": true, 00:26:10.704 "data_offset": 2048, 00:26:10.704 "data_size": 63488 00:26:10.704 } 00:26:10.704 ] 00:26:10.704 }' 00:26:10.704 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:10.704 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:10.704 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:10.704 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:10.704 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:26:10.704 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:26:10.704 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:26:10.704 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:26:10.704 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:26:10.704 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:26:10.704 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:26:10.963 [2024-07-15 22:54:55.737907] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:10.963 [2024-07-15 22:54:55.871203] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1ceafa0 00:26:11.221 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:26:11.221 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 
00:26:11.221 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:11.221 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:11.221 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:11.222 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:11.222 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:11.222 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:11.222 22:54:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:11.480 22:54:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:11.480 "name": "raid_bdev1", 00:26:11.480 "uuid": "85e3f3ac-2c97-4c19-95f7-60d665a43718", 00:26:11.480 "strip_size_kb": 0, 00:26:11.480 "state": "online", 00:26:11.480 "raid_level": "raid1", 00:26:11.480 "superblock": true, 00:26:11.480 "num_base_bdevs": 4, 00:26:11.480 "num_base_bdevs_discovered": 3, 00:26:11.480 "num_base_bdevs_operational": 3, 00:26:11.480 "process": { 00:26:11.480 "type": "rebuild", 00:26:11.480 "target": "spare", 00:26:11.480 "progress": { 00:26:11.480 "blocks": 36864, 00:26:11.480 "percent": 58 00:26:11.480 } 00:26:11.480 }, 00:26:11.480 "base_bdevs_list": [ 00:26:11.480 { 00:26:11.480 "name": "spare", 00:26:11.480 "uuid": "70c99416-7945-5c14-9743-97f8ea6c8373", 00:26:11.480 "is_configured": true, 00:26:11.480 "data_offset": 2048, 00:26:11.480 "data_size": 63488 00:26:11.480 }, 00:26:11.480 { 00:26:11.480 "name": null, 00:26:11.480 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:11.480 "is_configured": false, 00:26:11.480 "data_offset": 2048, 00:26:11.480 
"data_size": 63488 00:26:11.480 }, 00:26:11.480 { 00:26:11.480 "name": "BaseBdev3", 00:26:11.480 "uuid": "661561ed-8fda-5c81-8f79-484183e259c4", 00:26:11.480 "is_configured": true, 00:26:11.480 "data_offset": 2048, 00:26:11.480 "data_size": 63488 00:26:11.480 }, 00:26:11.480 { 00:26:11.480 "name": "BaseBdev4", 00:26:11.480 "uuid": "024f2f03-1b65-5a83-b4bc-c3f1e1a5a1d8", 00:26:11.480 "is_configured": true, 00:26:11.480 "data_offset": 2048, 00:26:11.480 "data_size": 63488 00:26:11.480 } 00:26:11.480 ] 00:26:11.480 }' 00:26:11.480 22:54:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:11.480 22:54:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:11.480 22:54:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:11.480 22:54:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:11.480 22:54:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=944 00:26:11.480 22:54:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:11.480 22:54:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:11.480 22:54:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:11.480 22:54:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:11.480 22:54:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:11.480 22:54:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:11.480 22:54:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:11.480 22:54:56 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:11.738 22:54:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:11.739 "name": "raid_bdev1", 00:26:11.739 "uuid": "85e3f3ac-2c97-4c19-95f7-60d665a43718", 00:26:11.739 "strip_size_kb": 0, 00:26:11.739 "state": "online", 00:26:11.739 "raid_level": "raid1", 00:26:11.739 "superblock": true, 00:26:11.739 "num_base_bdevs": 4, 00:26:11.739 "num_base_bdevs_discovered": 3, 00:26:11.739 "num_base_bdevs_operational": 3, 00:26:11.739 "process": { 00:26:11.739 "type": "rebuild", 00:26:11.739 "target": "spare", 00:26:11.739 "progress": { 00:26:11.739 "blocks": 43008, 00:26:11.739 "percent": 67 00:26:11.739 } 00:26:11.739 }, 00:26:11.739 "base_bdevs_list": [ 00:26:11.739 { 00:26:11.739 "name": "spare", 00:26:11.739 "uuid": "70c99416-7945-5c14-9743-97f8ea6c8373", 00:26:11.739 "is_configured": true, 00:26:11.739 "data_offset": 2048, 00:26:11.739 "data_size": 63488 00:26:11.739 }, 00:26:11.739 { 00:26:11.739 "name": null, 00:26:11.739 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:11.739 "is_configured": false, 00:26:11.739 "data_offset": 2048, 00:26:11.739 "data_size": 63488 00:26:11.739 }, 00:26:11.739 { 00:26:11.739 "name": "BaseBdev3", 00:26:11.739 "uuid": "661561ed-8fda-5c81-8f79-484183e259c4", 00:26:11.739 "is_configured": true, 00:26:11.739 "data_offset": 2048, 00:26:11.739 "data_size": 63488 00:26:11.739 }, 00:26:11.739 { 00:26:11.739 "name": "BaseBdev4", 00:26:11.739 "uuid": "024f2f03-1b65-5a83-b4bc-c3f1e1a5a1d8", 00:26:11.739 "is_configured": true, 00:26:11.739 "data_offset": 2048, 00:26:11.739 "data_size": 63488 00:26:11.739 } 00:26:11.739 ] 00:26:11.739 }' 00:26:11.739 22:54:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:11.739 22:54:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:11.739 22:54:56 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:11.739 22:54:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:11.739 22:54:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:12.673 [2024-07-15 22:54:57.383048] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:12.673 [2024-07-15 22:54:57.383114] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:12.673 [2024-07-15 22:54:57.383212] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:12.931 22:54:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:12.931 22:54:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:12.931 22:54:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:12.931 22:54:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:12.931 22:54:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:12.931 22:54:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:12.931 22:54:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:12.931 22:54:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:12.931 22:54:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:12.931 "name": "raid_bdev1", 00:26:12.931 "uuid": "85e3f3ac-2c97-4c19-95f7-60d665a43718", 00:26:12.931 "strip_size_kb": 0, 00:26:12.931 "state": "online", 00:26:12.931 "raid_level": "raid1", 00:26:12.931 "superblock": true, 00:26:12.931 "num_base_bdevs": 
4, 00:26:12.931 "num_base_bdevs_discovered": 3, 00:26:12.931 "num_base_bdevs_operational": 3, 00:26:12.931 "base_bdevs_list": [ 00:26:12.931 { 00:26:12.931 "name": "spare", 00:26:12.931 "uuid": "70c99416-7945-5c14-9743-97f8ea6c8373", 00:26:12.931 "is_configured": true, 00:26:12.931 "data_offset": 2048, 00:26:12.931 "data_size": 63488 00:26:12.931 }, 00:26:12.931 { 00:26:12.931 "name": null, 00:26:12.931 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:12.931 "is_configured": false, 00:26:12.931 "data_offset": 2048, 00:26:12.931 "data_size": 63488 00:26:12.931 }, 00:26:12.931 { 00:26:12.931 "name": "BaseBdev3", 00:26:12.931 "uuid": "661561ed-8fda-5c81-8f79-484183e259c4", 00:26:12.931 "is_configured": true, 00:26:12.931 "data_offset": 2048, 00:26:12.931 "data_size": 63488 00:26:12.931 }, 00:26:12.931 { 00:26:12.931 "name": "BaseBdev4", 00:26:12.931 "uuid": "024f2f03-1b65-5a83-b4bc-c3f1e1a5a1d8", 00:26:12.931 "is_configured": true, 00:26:12.931 "data_offset": 2048, 00:26:12.931 "data_size": 63488 00:26:12.931 } 00:26:12.931 ] 00:26:12.931 }' 00:26:12.932 22:54:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:13.190 22:54:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:13.190 22:54:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:13.190 22:54:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:13.190 22:54:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:26:13.190 22:54:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:13.190 22:54:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:13.190 22:54:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:13.190 22:54:57 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:13.190 22:54:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:13.190 22:54:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:13.190 22:54:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:13.448 22:54:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:13.448 "name": "raid_bdev1", 00:26:13.448 "uuid": "85e3f3ac-2c97-4c19-95f7-60d665a43718", 00:26:13.448 "strip_size_kb": 0, 00:26:13.448 "state": "online", 00:26:13.448 "raid_level": "raid1", 00:26:13.448 "superblock": true, 00:26:13.448 "num_base_bdevs": 4, 00:26:13.448 "num_base_bdevs_discovered": 3, 00:26:13.448 "num_base_bdevs_operational": 3, 00:26:13.448 "base_bdevs_list": [ 00:26:13.448 { 00:26:13.449 "name": "spare", 00:26:13.449 "uuid": "70c99416-7945-5c14-9743-97f8ea6c8373", 00:26:13.449 "is_configured": true, 00:26:13.449 "data_offset": 2048, 00:26:13.449 "data_size": 63488 00:26:13.449 }, 00:26:13.449 { 00:26:13.449 "name": null, 00:26:13.449 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:13.449 "is_configured": false, 00:26:13.449 "data_offset": 2048, 00:26:13.449 "data_size": 63488 00:26:13.449 }, 00:26:13.449 { 00:26:13.449 "name": "BaseBdev3", 00:26:13.449 "uuid": "661561ed-8fda-5c81-8f79-484183e259c4", 00:26:13.449 "is_configured": true, 00:26:13.449 "data_offset": 2048, 00:26:13.449 "data_size": 63488 00:26:13.449 }, 00:26:13.449 { 00:26:13.449 "name": "BaseBdev4", 00:26:13.449 "uuid": "024f2f03-1b65-5a83-b4bc-c3f1e1a5a1d8", 00:26:13.449 "is_configured": true, 00:26:13.449 "data_offset": 2048, 00:26:13.449 "data_size": 63488 00:26:13.449 } 00:26:13.449 ] 00:26:13.449 }' 00:26:13.449 22:54:58 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:13.449 22:54:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:13.449 22:54:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:13.449 22:54:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:13.449 22:54:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:13.449 22:54:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:13.449 22:54:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:13.449 22:54:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:13.449 22:54:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:13.449 22:54:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:13.449 22:54:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:13.449 22:54:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:13.449 22:54:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:13.449 22:54:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:13.449 22:54:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:13.449 22:54:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:13.707 22:54:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:13.707 "name": "raid_bdev1", 00:26:13.707 "uuid": 
"85e3f3ac-2c97-4c19-95f7-60d665a43718", 00:26:13.707 "strip_size_kb": 0, 00:26:13.707 "state": "online", 00:26:13.707 "raid_level": "raid1", 00:26:13.707 "superblock": true, 00:26:13.707 "num_base_bdevs": 4, 00:26:13.707 "num_base_bdevs_discovered": 3, 00:26:13.707 "num_base_bdevs_operational": 3, 00:26:13.707 "base_bdevs_list": [ 00:26:13.707 { 00:26:13.707 "name": "spare", 00:26:13.707 "uuid": "70c99416-7945-5c14-9743-97f8ea6c8373", 00:26:13.707 "is_configured": true, 00:26:13.707 "data_offset": 2048, 00:26:13.707 "data_size": 63488 00:26:13.707 }, 00:26:13.707 { 00:26:13.707 "name": null, 00:26:13.707 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:13.707 "is_configured": false, 00:26:13.707 "data_offset": 2048, 00:26:13.707 "data_size": 63488 00:26:13.707 }, 00:26:13.707 { 00:26:13.707 "name": "BaseBdev3", 00:26:13.707 "uuid": "661561ed-8fda-5c81-8f79-484183e259c4", 00:26:13.707 "is_configured": true, 00:26:13.707 "data_offset": 2048, 00:26:13.707 "data_size": 63488 00:26:13.707 }, 00:26:13.707 { 00:26:13.707 "name": "BaseBdev4", 00:26:13.707 "uuid": "024f2f03-1b65-5a83-b4bc-c3f1e1a5a1d8", 00:26:13.707 "is_configured": true, 00:26:13.707 "data_offset": 2048, 00:26:13.707 "data_size": 63488 00:26:13.707 } 00:26:13.707 ] 00:26:13.707 }' 00:26:13.707 22:54:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:13.707 22:54:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:14.274 22:54:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:14.533 [2024-07-15 22:54:59.260946] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:14.533 [2024-07-15 22:54:59.260975] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:14.533 [2024-07-15 22:54:59.261035] bdev_raid.c: 474:_raid_bdev_destruct: 
*DEBUG*: raid_bdev_destruct 00:26:14.533 [2024-07-15 22:54:59.261105] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:14.533 [2024-07-15 22:54:59.261117] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21148a0 name raid_bdev1, state offline 00:26:14.533 22:54:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:14.533 22:54:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:26:14.792 22:54:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:14.792 22:54:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:26:14.792 22:54:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:26:14.792 22:54:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:26:14.792 22:54:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:14.792 22:54:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:26:14.792 22:54:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:14.792 22:54:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:14.792 22:54:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:14.792 22:54:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:26:14.792 22:54:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:14.792 22:54:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:14.792 22:54:59 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:26:15.051 /dev/nbd0 00:26:15.051 22:54:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:15.051 22:54:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:15.051 22:54:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:15.051 22:54:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:26:15.051 22:54:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:15.051 22:54:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:15.051 22:54:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:15.051 22:54:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:26:15.051 22:54:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:15.051 22:54:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:15.051 22:54:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:15.051 1+0 records in 00:26:15.051 1+0 records out 00:26:15.051 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000380175 s, 10.8 MB/s 00:26:15.051 22:54:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:15.051 22:54:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:26:15.051 22:54:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:26:15.051 22:54:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:15.051 22:54:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:26:15.051 22:54:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:15.051 22:54:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:15.051 22:54:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:26:15.308 /dev/nbd1 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:15.309 1+0 records in 00:26:15.309 1+0 records out 00:26:15.309 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000263317 
s, 15.6 MB/s 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:15.309 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:15.566 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:15.566 22:55:00 bdev_raid.raid_rebuild_test_sb 
-- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:15.566 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:15.566 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:15.566 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:15.566 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:15.566 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:26:15.566 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:26:15.566 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:15.566 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:15.824 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:15.824 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:15.824 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:15.824 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:15.824 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:15.824 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:15.824 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:26:15.824 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:26:15.824 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:26:15.824 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:16.082 22:55:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:16.340 [2024-07-15 22:55:01.133082] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:16.340 [2024-07-15 22:55:01.133130] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:16.340 [2024-07-15 22:55:01.133151] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2119930 00:26:16.340 [2024-07-15 22:55:01.133164] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:16.341 [2024-07-15 22:55:01.134825] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:16.341 [2024-07-15 22:55:01.134857] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:16.341 [2024-07-15 22:55:01.134954] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:16.341 [2024-07-15 22:55:01.134982] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:16.341 [2024-07-15 22:55:01.135089] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:16.341 [2024-07-15 22:55:01.135162] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:16.341 spare 00:26:16.341 22:55:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:16.341 22:55:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:16.341 22:55:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:16.341 22:55:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:16.341 22:55:01 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:16.341 22:55:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:16.341 22:55:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:16.341 22:55:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:16.341 22:55:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:16.341 22:55:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:16.341 22:55:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:16.341 22:55:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:16.341 [2024-07-15 22:55:01.235484] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2115b10 00:26:16.341 [2024-07-15 22:55:01.235501] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:16.341 [2024-07-15 22:55:01.235716] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x218e7e0 00:26:16.341 [2024-07-15 22:55:01.235873] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2115b10 00:26:16.341 [2024-07-15 22:55:01.235883] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2115b10 00:26:16.341 [2024-07-15 22:55:01.236007] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:16.599 22:55:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:16.599 "name": "raid_bdev1", 00:26:16.599 "uuid": "85e3f3ac-2c97-4c19-95f7-60d665a43718", 00:26:16.599 "strip_size_kb": 0, 00:26:16.599 "state": "online", 00:26:16.599 "raid_level": "raid1", 
00:26:16.599 "superblock": true, 00:26:16.599 "num_base_bdevs": 4, 00:26:16.599 "num_base_bdevs_discovered": 3, 00:26:16.599 "num_base_bdevs_operational": 3, 00:26:16.599 "base_bdevs_list": [ 00:26:16.599 { 00:26:16.599 "name": "spare", 00:26:16.599 "uuid": "70c99416-7945-5c14-9743-97f8ea6c8373", 00:26:16.599 "is_configured": true, 00:26:16.599 "data_offset": 2048, 00:26:16.599 "data_size": 63488 00:26:16.599 }, 00:26:16.599 { 00:26:16.599 "name": null, 00:26:16.599 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:16.599 "is_configured": false, 00:26:16.599 "data_offset": 2048, 00:26:16.599 "data_size": 63488 00:26:16.599 }, 00:26:16.599 { 00:26:16.599 "name": "BaseBdev3", 00:26:16.599 "uuid": "661561ed-8fda-5c81-8f79-484183e259c4", 00:26:16.599 "is_configured": true, 00:26:16.599 "data_offset": 2048, 00:26:16.599 "data_size": 63488 00:26:16.599 }, 00:26:16.599 { 00:26:16.599 "name": "BaseBdev4", 00:26:16.599 "uuid": "024f2f03-1b65-5a83-b4bc-c3f1e1a5a1d8", 00:26:16.599 "is_configured": true, 00:26:16.599 "data_offset": 2048, 00:26:16.599 "data_size": 63488 00:26:16.599 } 00:26:16.599 ] 00:26:16.599 }' 00:26:16.599 22:55:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:16.599 22:55:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:17.534 22:55:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:17.534 22:55:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:17.534 22:55:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:17.534 22:55:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:17.534 22:55:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:17.534 22:55:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:17.534 22:55:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:17.792 22:55:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:17.792 "name": "raid_bdev1", 00:26:17.792 "uuid": "85e3f3ac-2c97-4c19-95f7-60d665a43718", 00:26:17.792 "strip_size_kb": 0, 00:26:17.792 "state": "online", 00:26:17.792 "raid_level": "raid1", 00:26:17.792 "superblock": true, 00:26:17.792 "num_base_bdevs": 4, 00:26:17.792 "num_base_bdevs_discovered": 3, 00:26:17.792 "num_base_bdevs_operational": 3, 00:26:17.792 "base_bdevs_list": [ 00:26:17.792 { 00:26:17.792 "name": "spare", 00:26:17.792 "uuid": "70c99416-7945-5c14-9743-97f8ea6c8373", 00:26:17.792 "is_configured": true, 00:26:17.792 "data_offset": 2048, 00:26:17.792 "data_size": 63488 00:26:17.792 }, 00:26:17.792 { 00:26:17.792 "name": null, 00:26:17.792 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:17.792 "is_configured": false, 00:26:17.792 "data_offset": 2048, 00:26:17.792 "data_size": 63488 00:26:17.792 }, 00:26:17.792 { 00:26:17.792 "name": "BaseBdev3", 00:26:17.792 "uuid": "661561ed-8fda-5c81-8f79-484183e259c4", 00:26:17.792 "is_configured": true, 00:26:17.792 "data_offset": 2048, 00:26:17.792 "data_size": 63488 00:26:17.792 }, 00:26:17.792 { 00:26:17.792 "name": "BaseBdev4", 00:26:17.792 "uuid": "024f2f03-1b65-5a83-b4bc-c3f1e1a5a1d8", 00:26:17.792 "is_configured": true, 00:26:17.792 "data_offset": 2048, 00:26:17.792 "data_size": 63488 00:26:17.792 } 00:26:17.792 ] 00:26:17.792 }' 00:26:17.792 22:55:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:17.792 22:55:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:17.792 22:55:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:26:17.792 22:55:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:17.792 22:55:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:17.792 22:55:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:26:18.050 22:55:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:26:18.050 22:55:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:18.309 [2024-07-15 22:55:03.130513] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:18.309 22:55:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:18.309 22:55:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:18.309 22:55:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:18.309 22:55:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:18.309 22:55:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:18.309 22:55:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:18.309 22:55:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:18.309 22:55:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:18.309 22:55:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:18.309 22:55:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:18.309 22:55:03 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:18.309 22:55:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:18.568 22:55:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:18.568 "name": "raid_bdev1", 00:26:18.568 "uuid": "85e3f3ac-2c97-4c19-95f7-60d665a43718", 00:26:18.568 "strip_size_kb": 0, 00:26:18.568 "state": "online", 00:26:18.568 "raid_level": "raid1", 00:26:18.568 "superblock": true, 00:26:18.568 "num_base_bdevs": 4, 00:26:18.568 "num_base_bdevs_discovered": 2, 00:26:18.568 "num_base_bdevs_operational": 2, 00:26:18.568 "base_bdevs_list": [ 00:26:18.568 { 00:26:18.568 "name": null, 00:26:18.568 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:18.568 "is_configured": false, 00:26:18.568 "data_offset": 2048, 00:26:18.568 "data_size": 63488 00:26:18.568 }, 00:26:18.568 { 00:26:18.568 "name": null, 00:26:18.568 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:18.568 "is_configured": false, 00:26:18.568 "data_offset": 2048, 00:26:18.568 "data_size": 63488 00:26:18.568 }, 00:26:18.568 { 00:26:18.568 "name": "BaseBdev3", 00:26:18.568 "uuid": "661561ed-8fda-5c81-8f79-484183e259c4", 00:26:18.568 "is_configured": true, 00:26:18.568 "data_offset": 2048, 00:26:18.568 "data_size": 63488 00:26:18.568 }, 00:26:18.568 { 00:26:18.568 "name": "BaseBdev4", 00:26:18.568 "uuid": "024f2f03-1b65-5a83-b4bc-c3f1e1a5a1d8", 00:26:18.568 "is_configured": true, 00:26:18.568 "data_offset": 2048, 00:26:18.568 "data_size": 63488 00:26:18.568 } 00:26:18.568 ] 00:26:18.568 }' 00:26:18.568 22:55:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:18.568 22:55:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:19.132 22:55:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:19.391 [2024-07-15 22:55:04.213398] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:19.391 [2024-07-15 22:55:04.213566] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:19.391 [2024-07-15 22:55:04.213582] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:26:19.391 [2024-07-15 22:55:04.213611] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:19.391 [2024-07-15 22:55:04.218137] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x218edf0 00:26:19.391 [2024-07-15 22:55:04.220549] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:19.391 22:55:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:26:20.769 22:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:20.769 22:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:20.769 22:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:20.769 22:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:20.769 22:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:20.769 22:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:20.769 22:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:20.769 22:55:05 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:20.769 "name": "raid_bdev1", 00:26:20.769 "uuid": "85e3f3ac-2c97-4c19-95f7-60d665a43718", 00:26:20.769 "strip_size_kb": 0, 00:26:20.769 "state": "online", 00:26:20.769 "raid_level": "raid1", 00:26:20.769 "superblock": true, 00:26:20.769 "num_base_bdevs": 4, 00:26:20.769 "num_base_bdevs_discovered": 3, 00:26:20.769 "num_base_bdevs_operational": 3, 00:26:20.769 "process": { 00:26:20.769 "type": "rebuild", 00:26:20.769 "target": "spare", 00:26:20.769 "progress": { 00:26:20.769 "blocks": 24576, 00:26:20.769 "percent": 38 00:26:20.769 } 00:26:20.769 }, 00:26:20.769 "base_bdevs_list": [ 00:26:20.769 { 00:26:20.769 "name": "spare", 00:26:20.769 "uuid": "70c99416-7945-5c14-9743-97f8ea6c8373", 00:26:20.769 "is_configured": true, 00:26:20.769 "data_offset": 2048, 00:26:20.769 "data_size": 63488 00:26:20.769 }, 00:26:20.769 { 00:26:20.769 "name": null, 00:26:20.769 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:20.769 "is_configured": false, 00:26:20.769 "data_offset": 2048, 00:26:20.769 "data_size": 63488 00:26:20.769 }, 00:26:20.769 { 00:26:20.769 "name": "BaseBdev3", 00:26:20.769 "uuid": "661561ed-8fda-5c81-8f79-484183e259c4", 00:26:20.769 "is_configured": true, 00:26:20.769 "data_offset": 2048, 00:26:20.769 "data_size": 63488 00:26:20.769 }, 00:26:20.769 { 00:26:20.769 "name": "BaseBdev4", 00:26:20.769 "uuid": "024f2f03-1b65-5a83-b4bc-c3f1e1a5a1d8", 00:26:20.769 "is_configured": true, 00:26:20.769 "data_offset": 2048, 00:26:20.769 "data_size": 63488 00:26:20.769 } 00:26:20.769 ] 00:26:20.769 }' 00:26:20.769 22:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:20.769 22:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:20.769 22:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:20.769 22:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- 
# [[ spare == \s\p\a\r\e ]] 00:26:20.769 22:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:21.028 [2024-07-15 22:55:05.794961] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:21.028 [2024-07-15 22:55:05.832938] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:21.028 [2024-07-15 22:55:05.832980] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:21.028 [2024-07-15 22:55:05.833003] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:21.028 [2024-07-15 22:55:05.833012] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:21.028 22:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:21.028 22:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:21.028 22:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:21.028 22:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:21.028 22:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:21.028 22:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:21.028 22:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:21.028 22:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:21.028 22:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:21.028 22:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:21.028 22:55:05 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:21.028 22:55:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:21.286 22:55:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:21.286 "name": "raid_bdev1", 00:26:21.286 "uuid": "85e3f3ac-2c97-4c19-95f7-60d665a43718", 00:26:21.286 "strip_size_kb": 0, 00:26:21.286 "state": "online", 00:26:21.286 "raid_level": "raid1", 00:26:21.286 "superblock": true, 00:26:21.286 "num_base_bdevs": 4, 00:26:21.286 "num_base_bdevs_discovered": 2, 00:26:21.286 "num_base_bdevs_operational": 2, 00:26:21.286 "base_bdevs_list": [ 00:26:21.286 { 00:26:21.286 "name": null, 00:26:21.286 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:21.286 "is_configured": false, 00:26:21.286 "data_offset": 2048, 00:26:21.287 "data_size": 63488 00:26:21.287 }, 00:26:21.287 { 00:26:21.287 "name": null, 00:26:21.287 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:21.287 "is_configured": false, 00:26:21.287 "data_offset": 2048, 00:26:21.287 "data_size": 63488 00:26:21.287 }, 00:26:21.287 { 00:26:21.287 "name": "BaseBdev3", 00:26:21.287 "uuid": "661561ed-8fda-5c81-8f79-484183e259c4", 00:26:21.287 "is_configured": true, 00:26:21.287 "data_offset": 2048, 00:26:21.287 "data_size": 63488 00:26:21.287 }, 00:26:21.287 { 00:26:21.287 "name": "BaseBdev4", 00:26:21.287 "uuid": "024f2f03-1b65-5a83-b4bc-c3f1e1a5a1d8", 00:26:21.287 "is_configured": true, 00:26:21.287 "data_offset": 2048, 00:26:21.287 "data_size": 63488 00:26:21.287 } 00:26:21.287 ] 00:26:21.287 }' 00:26:21.287 22:55:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:21.287 22:55:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:21.854 22:55:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:22.113 [2024-07-15 22:55:06.975976] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:22.113 [2024-07-15 22:55:06.976034] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:22.113 [2024-07-15 22:55:06.976058] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2116870 00:26:22.113 [2024-07-15 22:55:06.976071] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:22.113 [2024-07-15 22:55:06.976486] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:22.113 [2024-07-15 22:55:06.976507] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:22.113 [2024-07-15 22:55:06.976600] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:22.113 [2024-07-15 22:55:06.976614] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:22.113 [2024-07-15 22:55:06.976626] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:26:22.113 [2024-07-15 22:55:06.976646] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:22.113 [2024-07-15 22:55:06.981225] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fe3320 00:26:22.113 spare 00:26:22.113 [2024-07-15 22:55:06.982671] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:22.113 22:55:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:26:23.524 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:23.524 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:23.524 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:23.524 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:23.524 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:23.524 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:23.524 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:23.524 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:23.524 "name": "raid_bdev1", 00:26:23.524 "uuid": "85e3f3ac-2c97-4c19-95f7-60d665a43718", 00:26:23.524 "strip_size_kb": 0, 00:26:23.524 "state": "online", 00:26:23.524 "raid_level": "raid1", 00:26:23.524 "superblock": true, 00:26:23.524 "num_base_bdevs": 4, 00:26:23.524 "num_base_bdevs_discovered": 3, 00:26:23.524 "num_base_bdevs_operational": 3, 00:26:23.524 "process": { 00:26:23.524 "type": "rebuild", 00:26:23.524 "target": "spare", 00:26:23.524 "progress": { 00:26:23.524 "blocks": 24576, 00:26:23.524 
"percent": 38 00:26:23.524 } 00:26:23.524 }, 00:26:23.524 "base_bdevs_list": [ 00:26:23.524 { 00:26:23.524 "name": "spare", 00:26:23.524 "uuid": "70c99416-7945-5c14-9743-97f8ea6c8373", 00:26:23.524 "is_configured": true, 00:26:23.524 "data_offset": 2048, 00:26:23.524 "data_size": 63488 00:26:23.524 }, 00:26:23.524 { 00:26:23.524 "name": null, 00:26:23.524 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:23.524 "is_configured": false, 00:26:23.525 "data_offset": 2048, 00:26:23.525 "data_size": 63488 00:26:23.525 }, 00:26:23.525 { 00:26:23.525 "name": "BaseBdev3", 00:26:23.525 "uuid": "661561ed-8fda-5c81-8f79-484183e259c4", 00:26:23.525 "is_configured": true, 00:26:23.525 "data_offset": 2048, 00:26:23.525 "data_size": 63488 00:26:23.525 }, 00:26:23.525 { 00:26:23.525 "name": "BaseBdev4", 00:26:23.525 "uuid": "024f2f03-1b65-5a83-b4bc-c3f1e1a5a1d8", 00:26:23.525 "is_configured": true, 00:26:23.525 "data_offset": 2048, 00:26:23.525 "data_size": 63488 00:26:23.525 } 00:26:23.525 ] 00:26:23.525 }' 00:26:23.525 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:23.525 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:23.525 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:23.525 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:23.525 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:23.783 [2024-07-15 22:55:08.557603] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:23.783 [2024-07-15 22:55:08.595729] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:23.783 [2024-07-15 22:55:08.595772] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:23.783 [2024-07-15 22:55:08.595789] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:23.783 [2024-07-15 22:55:08.595797] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:23.783 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:23.783 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:23.783 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:23.783 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:23.783 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:23.783 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:23.783 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:23.783 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:23.783 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:23.783 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:23.783 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:23.783 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:24.041 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:24.041 "name": "raid_bdev1", 00:26:24.041 "uuid": "85e3f3ac-2c97-4c19-95f7-60d665a43718", 00:26:24.041 "strip_size_kb": 0, 00:26:24.041 "state": 
"online", 00:26:24.041 "raid_level": "raid1", 00:26:24.041 "superblock": true, 00:26:24.041 "num_base_bdevs": 4, 00:26:24.041 "num_base_bdevs_discovered": 2, 00:26:24.041 "num_base_bdevs_operational": 2, 00:26:24.041 "base_bdevs_list": [ 00:26:24.041 { 00:26:24.041 "name": null, 00:26:24.041 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:24.041 "is_configured": false, 00:26:24.041 "data_offset": 2048, 00:26:24.041 "data_size": 63488 00:26:24.041 }, 00:26:24.041 { 00:26:24.041 "name": null, 00:26:24.041 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:24.041 "is_configured": false, 00:26:24.041 "data_offset": 2048, 00:26:24.041 "data_size": 63488 00:26:24.041 }, 00:26:24.041 { 00:26:24.041 "name": "BaseBdev3", 00:26:24.041 "uuid": "661561ed-8fda-5c81-8f79-484183e259c4", 00:26:24.041 "is_configured": true, 00:26:24.041 "data_offset": 2048, 00:26:24.041 "data_size": 63488 00:26:24.041 }, 00:26:24.041 { 00:26:24.041 "name": "BaseBdev4", 00:26:24.041 "uuid": "024f2f03-1b65-5a83-b4bc-c3f1e1a5a1d8", 00:26:24.041 "is_configured": true, 00:26:24.041 "data_offset": 2048, 00:26:24.041 "data_size": 63488 00:26:24.041 } 00:26:24.041 ] 00:26:24.041 }' 00:26:24.041 22:55:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:24.041 22:55:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:24.607 22:55:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:24.607 22:55:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:24.607 22:55:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:24.607 22:55:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:24.607 22:55:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:24.607 22:55:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:24.607 22:55:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:24.865 22:55:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:24.865 "name": "raid_bdev1", 00:26:24.865 "uuid": "85e3f3ac-2c97-4c19-95f7-60d665a43718", 00:26:24.865 "strip_size_kb": 0, 00:26:24.865 "state": "online", 00:26:24.865 "raid_level": "raid1", 00:26:24.865 "superblock": true, 00:26:24.865 "num_base_bdevs": 4, 00:26:24.865 "num_base_bdevs_discovered": 2, 00:26:24.865 "num_base_bdevs_operational": 2, 00:26:24.865 "base_bdevs_list": [ 00:26:24.865 { 00:26:24.865 "name": null, 00:26:24.865 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:24.865 "is_configured": false, 00:26:24.865 "data_offset": 2048, 00:26:24.865 "data_size": 63488 00:26:24.865 }, 00:26:24.865 { 00:26:24.865 "name": null, 00:26:24.865 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:24.865 "is_configured": false, 00:26:24.865 "data_offset": 2048, 00:26:24.865 "data_size": 63488 00:26:24.865 }, 00:26:24.865 { 00:26:24.865 "name": "BaseBdev3", 00:26:24.865 "uuid": "661561ed-8fda-5c81-8f79-484183e259c4", 00:26:24.865 "is_configured": true, 00:26:24.865 "data_offset": 2048, 00:26:24.865 "data_size": 63488 00:26:24.865 }, 00:26:24.865 { 00:26:24.865 "name": "BaseBdev4", 00:26:24.865 "uuid": "024f2f03-1b65-5a83-b4bc-c3f1e1a5a1d8", 00:26:24.865 "is_configured": true, 00:26:24.865 "data_offset": 2048, 00:26:24.865 "data_size": 63488 00:26:24.865 } 00:26:24.865 ] 00:26:24.865 }' 00:26:24.865 22:55:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:24.865 22:55:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:24.865 22:55:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:26:25.124 22:55:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:25.124 22:55:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:25.382 22:55:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:25.382 [2024-07-15 22:55:10.264802] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:25.382 [2024-07-15 22:55:10.264863] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:25.382 [2024-07-15 22:55:10.264887] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2119b60 00:26:25.382 [2024-07-15 22:55:10.264900] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:25.382 [2024-07-15 22:55:10.265297] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:25.382 [2024-07-15 22:55:10.265320] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:25.382 [2024-07-15 22:55:10.265398] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:25.382 [2024-07-15 22:55:10.265411] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:25.382 [2024-07-15 22:55:10.265423] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:25.382 BaseBdev1 00:26:25.382 22:55:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:26:26.752 22:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:26.752 
22:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:26.752 22:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:26.752 22:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:26.752 22:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:26.752 22:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:26.752 22:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:26.752 22:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:26.752 22:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:26.752 22:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:26.752 22:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:26.752 22:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:26.752 22:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:26.752 "name": "raid_bdev1", 00:26:26.752 "uuid": "85e3f3ac-2c97-4c19-95f7-60d665a43718", 00:26:26.752 "strip_size_kb": 0, 00:26:26.752 "state": "online", 00:26:26.752 "raid_level": "raid1", 00:26:26.752 "superblock": true, 00:26:26.752 "num_base_bdevs": 4, 00:26:26.752 "num_base_bdevs_discovered": 2, 00:26:26.752 "num_base_bdevs_operational": 2, 00:26:26.752 "base_bdevs_list": [ 00:26:26.752 { 00:26:26.752 "name": null, 00:26:26.752 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:26.752 "is_configured": false, 00:26:26.752 "data_offset": 2048, 00:26:26.752 "data_size": 63488 00:26:26.752 }, 
00:26:26.752 { 00:26:26.752 "name": null, 00:26:26.752 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:26.752 "is_configured": false, 00:26:26.752 "data_offset": 2048, 00:26:26.752 "data_size": 63488 00:26:26.752 }, 00:26:26.752 { 00:26:26.752 "name": "BaseBdev3", 00:26:26.752 "uuid": "661561ed-8fda-5c81-8f79-484183e259c4", 00:26:26.752 "is_configured": true, 00:26:26.752 "data_offset": 2048, 00:26:26.752 "data_size": 63488 00:26:26.752 }, 00:26:26.752 { 00:26:26.752 "name": "BaseBdev4", 00:26:26.752 "uuid": "024f2f03-1b65-5a83-b4bc-c3f1e1a5a1d8", 00:26:26.752 "is_configured": true, 00:26:26.752 "data_offset": 2048, 00:26:26.752 "data_size": 63488 00:26:26.752 } 00:26:26.752 ] 00:26:26.752 }' 00:26:26.752 22:55:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:26.752 22:55:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:27.318 22:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:27.318 22:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:27.318 22:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:27.318 22:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:27.318 22:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:27.318 22:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:27.318 22:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:27.577 22:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:27.577 "name": "raid_bdev1", 00:26:27.577 "uuid": "85e3f3ac-2c97-4c19-95f7-60d665a43718", 00:26:27.577 
"strip_size_kb": 0, 00:26:27.577 "state": "online", 00:26:27.577 "raid_level": "raid1", 00:26:27.577 "superblock": true, 00:26:27.577 "num_base_bdevs": 4, 00:26:27.577 "num_base_bdevs_discovered": 2, 00:26:27.577 "num_base_bdevs_operational": 2, 00:26:27.577 "base_bdevs_list": [ 00:26:27.577 { 00:26:27.577 "name": null, 00:26:27.577 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:27.577 "is_configured": false, 00:26:27.577 "data_offset": 2048, 00:26:27.577 "data_size": 63488 00:26:27.577 }, 00:26:27.577 { 00:26:27.577 "name": null, 00:26:27.577 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:27.577 "is_configured": false, 00:26:27.577 "data_offset": 2048, 00:26:27.577 "data_size": 63488 00:26:27.577 }, 00:26:27.577 { 00:26:27.577 "name": "BaseBdev3", 00:26:27.577 "uuid": "661561ed-8fda-5c81-8f79-484183e259c4", 00:26:27.577 "is_configured": true, 00:26:27.577 "data_offset": 2048, 00:26:27.577 "data_size": 63488 00:26:27.577 }, 00:26:27.577 { 00:26:27.577 "name": "BaseBdev4", 00:26:27.577 "uuid": "024f2f03-1b65-5a83-b4bc-c3f1e1a5a1d8", 00:26:27.577 "is_configured": true, 00:26:27.577 "data_offset": 2048, 00:26:27.577 "data_size": 63488 00:26:27.577 } 00:26:27.577 ] 00:26:27.577 }' 00:26:27.577 22:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:27.577 22:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:27.577 22:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:27.577 22:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:27.577 22:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:27.577 22:55:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:26:27.577 22:55:12 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:27.577 22:55:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:27.577 22:55:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:27.577 22:55:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:27.836 22:55:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:27.836 22:55:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:27.836 22:55:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:27.836 22:55:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:27.836 22:55:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:27.836 22:55:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:28.095 [2024-07-15 22:55:12.763623] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:28.096 [2024-07-15 22:55:12.763773] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:28.096 [2024-07-15 22:55:12.763789] 
bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:28.096 request: 00:26:28.096 { 00:26:28.096 "base_bdev": "BaseBdev1", 00:26:28.096 "raid_bdev": "raid_bdev1", 00:26:28.096 "method": "bdev_raid_add_base_bdev", 00:26:28.096 "req_id": 1 00:26:28.096 } 00:26:28.096 Got JSON-RPC error response 00:26:28.096 response: 00:26:28.096 { 00:26:28.096 "code": -22, 00:26:28.096 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:26:28.096 } 00:26:28.096 22:55:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:26:28.096 22:55:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:28.096 22:55:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:28.096 22:55:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:28.096 22:55:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:26:29.031 22:55:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:29.031 22:55:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:29.031 22:55:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:29.031 22:55:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:29.031 22:55:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:29.031 22:55:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:29.031 22:55:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:29.031 22:55:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:29.031 22:55:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:26:29.031 22:55:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:29.031 22:55:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:29.031 22:55:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:29.289 22:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:29.289 "name": "raid_bdev1", 00:26:29.289 "uuid": "85e3f3ac-2c97-4c19-95f7-60d665a43718", 00:26:29.289 "strip_size_kb": 0, 00:26:29.289 "state": "online", 00:26:29.289 "raid_level": "raid1", 00:26:29.289 "superblock": true, 00:26:29.289 "num_base_bdevs": 4, 00:26:29.289 "num_base_bdevs_discovered": 2, 00:26:29.289 "num_base_bdevs_operational": 2, 00:26:29.289 "base_bdevs_list": [ 00:26:29.289 { 00:26:29.289 "name": null, 00:26:29.289 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:29.289 "is_configured": false, 00:26:29.289 "data_offset": 2048, 00:26:29.289 "data_size": 63488 00:26:29.289 }, 00:26:29.289 { 00:26:29.289 "name": null, 00:26:29.289 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:29.289 "is_configured": false, 00:26:29.289 "data_offset": 2048, 00:26:29.289 "data_size": 63488 00:26:29.289 }, 00:26:29.289 { 00:26:29.289 "name": "BaseBdev3", 00:26:29.289 "uuid": "661561ed-8fda-5c81-8f79-484183e259c4", 00:26:29.289 "is_configured": true, 00:26:29.289 "data_offset": 2048, 00:26:29.289 "data_size": 63488 00:26:29.289 }, 00:26:29.289 { 00:26:29.289 "name": "BaseBdev4", 00:26:29.289 "uuid": "024f2f03-1b65-5a83-b4bc-c3f1e1a5a1d8", 00:26:29.289 "is_configured": true, 00:26:29.289 "data_offset": 2048, 00:26:29.289 "data_size": 63488 00:26:29.289 } 00:26:29.289 ] 00:26:29.289 }' 00:26:29.289 22:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:29.289 22:55:14 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:29.857 22:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:29.857 22:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:29.857 22:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:29.857 22:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:29.857 22:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:29.857 22:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:29.857 22:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:30.116 22:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:30.116 "name": "raid_bdev1", 00:26:30.116 "uuid": "85e3f3ac-2c97-4c19-95f7-60d665a43718", 00:26:30.116 "strip_size_kb": 0, 00:26:30.116 "state": "online", 00:26:30.116 "raid_level": "raid1", 00:26:30.116 "superblock": true, 00:26:30.116 "num_base_bdevs": 4, 00:26:30.116 "num_base_bdevs_discovered": 2, 00:26:30.116 "num_base_bdevs_operational": 2, 00:26:30.116 "base_bdevs_list": [ 00:26:30.116 { 00:26:30.116 "name": null, 00:26:30.116 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:30.116 "is_configured": false, 00:26:30.116 "data_offset": 2048, 00:26:30.116 "data_size": 63488 00:26:30.116 }, 00:26:30.116 { 00:26:30.116 "name": null, 00:26:30.116 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:30.116 "is_configured": false, 00:26:30.116 "data_offset": 2048, 00:26:30.116 "data_size": 63488 00:26:30.116 }, 00:26:30.116 { 00:26:30.116 "name": "BaseBdev3", 00:26:30.116 "uuid": "661561ed-8fda-5c81-8f79-484183e259c4", 
00:26:30.116 "is_configured": true, 00:26:30.116 "data_offset": 2048, 00:26:30.116 "data_size": 63488 00:26:30.116 }, 00:26:30.116 { 00:26:30.116 "name": "BaseBdev4", 00:26:30.116 "uuid": "024f2f03-1b65-5a83-b4bc-c3f1e1a5a1d8", 00:26:30.116 "is_configured": true, 00:26:30.116 "data_offset": 2048, 00:26:30.116 "data_size": 63488 00:26:30.116 } 00:26:30.116 ] 00:26:30.116 }' 00:26:30.116 22:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:30.116 22:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:30.116 22:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:30.116 22:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:30.116 22:55:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 2827671 00:26:30.116 22:55:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2827671 ']' 00:26:30.116 22:55:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 2827671 00:26:30.116 22:55:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:26:30.116 22:55:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:30.116 22:55:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2827671 00:26:30.116 22:55:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:30.116 22:55:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:30.116 22:55:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2827671' 00:26:30.116 killing process with pid 2827671 00:26:30.116 22:55:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 2827671 00:26:30.116 
Received shutdown signal, test time was about 60.000000 seconds 00:26:30.116 00:26:30.116 Latency(us) 00:26:30.116 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:30.116 =================================================================================================================== 00:26:30.116 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:26:30.116 [2024-07-15 22:55:14.950969] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:30.116 [2024-07-15 22:55:14.951074] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:30.116 22:55:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 2827671 00:26:30.116 [2024-07-15 22:55:14.951135] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:30.116 [2024-07-15 22:55:14.951148] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2115b10 name raid_bdev1, state offline 00:26:30.116 [2024-07-15 22:55:15.004829] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:30.376 22:55:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:26:30.376 00:26:30.376 real 0m39.171s 00:26:30.376 user 0m55.712s 00:26:30.376 sys 0m7.702s 00:26:30.376 22:55:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:30.376 22:55:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:30.376 ************************************ 00:26:30.376 END TEST raid_rebuild_test_sb 00:26:30.376 ************************************ 00:26:30.376 22:55:15 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:30.376 22:55:15 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:26:30.376 22:55:15 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:26:30.376 22:55:15 bdev_raid -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:26:30.376 22:55:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:30.635 ************************************ 00:26:30.635 START TEST raid_rebuild_test_io 00:26:30.635 ************************************ 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false true true 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2833194 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2833194 /var/tmp/spdk-raid.sock 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 
2833194 ']' 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:30.635 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:30.635 22:55:15 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:30.635 [2024-07-15 22:55:15.391144] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:26:30.635 [2024-07-15 22:55:15.391202] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2833194 ] 00:26:30.635 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:30.635 Zero copy mechanism will not be used. 
00:26:30.635 [2024-07-15 22:55:15.503875] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:30.894 [2024-07-15 22:55:15.603332] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:30.894 [2024-07-15 22:55:15.670294] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:30.894 [2024-07-15 22:55:15.670335] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:31.461 22:55:16 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:31.461 22:55:16 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:26:31.461 22:55:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:31.461 22:55:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:26:31.719 BaseBdev1_malloc 00:26:31.720 22:55:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:31.978 [2024-07-15 22:55:16.744458] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:31.978 [2024-07-15 22:55:16.744511] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:31.978 [2024-07-15 22:55:16.744533] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16ded40 00:26:31.978 [2024-07-15 22:55:16.744546] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:31.978 [2024-07-15 22:55:16.746128] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:31.978 [2024-07-15 22:55:16.746159] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:31.978 BaseBdev1 
00:26:31.978 22:55:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:31.978 22:55:16 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:26:32.237 BaseBdev2_malloc 00:26:32.237 22:55:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:32.495 [2024-07-15 22:55:17.242596] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:32.495 [2024-07-15 22:55:17.242643] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:32.495 [2024-07-15 22:55:17.242665] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16df860 00:26:32.495 [2024-07-15 22:55:17.242677] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:32.495 [2024-07-15 22:55:17.244058] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:32.495 [2024-07-15 22:55:17.244094] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:32.495 BaseBdev2 00:26:32.495 22:55:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:32.495 22:55:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:26:32.754 BaseBdev3_malloc 00:26:32.754 22:55:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:26:33.013 [2024-07-15 22:55:17.728567] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:26:33.013 [2024-07-15 22:55:17.728615] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:33.013 [2024-07-15 22:55:17.728635] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x188c8f0 00:26:33.013 [2024-07-15 22:55:17.728647] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:33.013 [2024-07-15 22:55:17.730053] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:33.013 [2024-07-15 22:55:17.730083] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:26:33.013 BaseBdev3 00:26:33.013 22:55:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:33.013 22:55:17 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:26:33.271 BaseBdev4_malloc 00:26:33.271 22:55:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:26:33.530 [2024-07-15 22:55:18.230518] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:26:33.530 [2024-07-15 22:55:18.230564] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:33.530 [2024-07-15 22:55:18.230583] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x188bad0 00:26:33.530 [2024-07-15 22:55:18.230596] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:33.530 [2024-07-15 22:55:18.231979] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:33.530 [2024-07-15 22:55:18.232009] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:26:33.530 BaseBdev4 00:26:33.530 22:55:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:26:33.790 spare_malloc 00:26:33.790 22:55:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:34.048 spare_delay 00:26:34.048 22:55:18 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:34.308 [2024-07-15 22:55:18.980975] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:34.308 [2024-07-15 22:55:18.981023] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:34.308 [2024-07-15 22:55:18.981044] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18905b0 00:26:34.308 [2024-07-15 22:55:18.981056] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:34.308 [2024-07-15 22:55:18.982579] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:34.308 [2024-07-15 22:55:18.982607] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:34.308 spare 00:26:34.308 22:55:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:26:34.567 [2024-07-15 22:55:19.233654] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:34.567 [2024-07-15 22:55:19.234904] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev2 is claimed 00:26:34.567 [2024-07-15 22:55:19.234968] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:34.567 [2024-07-15 22:55:19.235015] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:34.567 [2024-07-15 22:55:19.235097] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x180f8a0 00:26:34.567 [2024-07-15 22:55:19.235107] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:26:34.567 [2024-07-15 22:55:19.235326] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1889e10 00:26:34.567 [2024-07-15 22:55:19.235477] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x180f8a0 00:26:34.567 [2024-07-15 22:55:19.235487] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x180f8a0 00:26:34.567 [2024-07-15 22:55:19.235596] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:34.567 22:55:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:26:34.567 22:55:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:34.567 22:55:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:34.567 22:55:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:34.567 22:55:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:34.567 22:55:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:34.567 22:55:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:34.567 22:55:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:34.567 22:55:19 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:34.567 22:55:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:34.567 22:55:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:34.567 22:55:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:34.826 22:55:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:34.826 "name": "raid_bdev1", 00:26:34.826 "uuid": "c16648a2-bb6e-40f6-baa6-7e564dd232c0", 00:26:34.826 "strip_size_kb": 0, 00:26:34.826 "state": "online", 00:26:34.826 "raid_level": "raid1", 00:26:34.826 "superblock": false, 00:26:34.826 "num_base_bdevs": 4, 00:26:34.826 "num_base_bdevs_discovered": 4, 00:26:34.826 "num_base_bdevs_operational": 4, 00:26:34.826 "base_bdevs_list": [ 00:26:34.826 { 00:26:34.826 "name": "BaseBdev1", 00:26:34.826 "uuid": "e6c14f33-2852-57ff-9a8a-4eade11bdc3f", 00:26:34.826 "is_configured": true, 00:26:34.826 "data_offset": 0, 00:26:34.826 "data_size": 65536 00:26:34.826 }, 00:26:34.826 { 00:26:34.826 "name": "BaseBdev2", 00:26:34.826 "uuid": "eefe5e3c-46be-5efe-8db3-fa000c478c62", 00:26:34.826 "is_configured": true, 00:26:34.826 "data_offset": 0, 00:26:34.826 "data_size": 65536 00:26:34.826 }, 00:26:34.826 { 00:26:34.826 "name": "BaseBdev3", 00:26:34.827 "uuid": "5de50c2f-4e46-5bba-944b-bd515c9c3adc", 00:26:34.827 "is_configured": true, 00:26:34.827 "data_offset": 0, 00:26:34.827 "data_size": 65536 00:26:34.827 }, 00:26:34.827 { 00:26:34.827 "name": "BaseBdev4", 00:26:34.827 "uuid": "747a0705-788d-53ac-9be4-7f9163265839", 00:26:34.827 "is_configured": true, 00:26:34.827 "data_offset": 0, 00:26:34.827 "data_size": 65536 00:26:34.827 } 00:26:34.827 ] 00:26:34.827 }' 00:26:34.827 22:55:19 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:26:34.827 22:55:19 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:35.394 22:55:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:35.395 22:55:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:26:35.395 [2024-07-15 22:55:20.272708] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:35.395 22:55:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:26:35.395 22:55:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:35.395 22:55:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:35.654 22:55:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:26:35.654 22:55:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:26:35.654 22:55:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:35.654 22:55:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:26:35.913 [2024-07-15 22:55:20.663532] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1815970 00:26:35.913 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:35.913 Zero copy mechanism will not be used. 00:26:35.913 Running I/O for 60 seconds... 
00:26:35.913 [2024-07-15 22:55:20.782065] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:35.913 [2024-07-15 22:55:20.782251] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1815970 00:26:36.172 22:55:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:36.172 22:55:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:36.172 22:55:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:36.172 22:55:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:36.172 22:55:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:36.172 22:55:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:36.172 22:55:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:36.172 22:55:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:36.172 22:55:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:36.172 22:55:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:36.172 22:55:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:36.172 22:55:20 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:36.432 22:55:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:36.432 "name": "raid_bdev1", 00:26:36.432 "uuid": "c16648a2-bb6e-40f6-baa6-7e564dd232c0", 00:26:36.432 "strip_size_kb": 0, 00:26:36.432 "state": "online", 00:26:36.432 "raid_level": "raid1", 00:26:36.432 "superblock": false, 
00:26:36.432 "num_base_bdevs": 4, 00:26:36.432 "num_base_bdevs_discovered": 3, 00:26:36.432 "num_base_bdevs_operational": 3, 00:26:36.432 "base_bdevs_list": [ 00:26:36.432 { 00:26:36.432 "name": null, 00:26:36.432 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:36.432 "is_configured": false, 00:26:36.432 "data_offset": 0, 00:26:36.432 "data_size": 65536 00:26:36.432 }, 00:26:36.432 { 00:26:36.432 "name": "BaseBdev2", 00:26:36.432 "uuid": "eefe5e3c-46be-5efe-8db3-fa000c478c62", 00:26:36.432 "is_configured": true, 00:26:36.432 "data_offset": 0, 00:26:36.432 "data_size": 65536 00:26:36.432 }, 00:26:36.432 { 00:26:36.432 "name": "BaseBdev3", 00:26:36.432 "uuid": "5de50c2f-4e46-5bba-944b-bd515c9c3adc", 00:26:36.432 "is_configured": true, 00:26:36.432 "data_offset": 0, 00:26:36.432 "data_size": 65536 00:26:36.432 }, 00:26:36.432 { 00:26:36.432 "name": "BaseBdev4", 00:26:36.432 "uuid": "747a0705-788d-53ac-9be4-7f9163265839", 00:26:36.432 "is_configured": true, 00:26:36.432 "data_offset": 0, 00:26:36.432 "data_size": 65536 00:26:36.432 } 00:26:36.432 ] 00:26:36.432 }' 00:26:36.432 22:55:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:36.432 22:55:21 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:36.999 22:55:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:36.999 [2024-07-15 22:55:21.904424] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:37.284 [2024-07-15 22:55:21.961069] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13e5fa0 00:26:37.284 [2024-07-15 22:55:21.963491] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:37.284 22:55:21 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:26:37.284 [2024-07-15 
22:55:22.085011] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:37.284 [2024-07-15 22:55:22.086290] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:37.544 [2024-07-15 22:55:22.341387] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:37.803 [2024-07-15 22:55:22.576046] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:37.803 [2024-07-15 22:55:22.709713] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:38.372 22:55:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:38.372 22:55:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:38.372 22:55:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:38.372 22:55:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:38.373 22:55:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:38.373 22:55:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:38.373 22:55:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:38.373 [2024-07-15 22:55:23.199524] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:38.373 22:55:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:38.373 "name": "raid_bdev1", 00:26:38.373 "uuid": 
"c16648a2-bb6e-40f6-baa6-7e564dd232c0", 00:26:38.373 "strip_size_kb": 0, 00:26:38.373 "state": "online", 00:26:38.373 "raid_level": "raid1", 00:26:38.373 "superblock": false, 00:26:38.373 "num_base_bdevs": 4, 00:26:38.373 "num_base_bdevs_discovered": 4, 00:26:38.373 "num_base_bdevs_operational": 4, 00:26:38.373 "process": { 00:26:38.373 "type": "rebuild", 00:26:38.373 "target": "spare", 00:26:38.373 "progress": { 00:26:38.373 "blocks": 16384, 00:26:38.373 "percent": 25 00:26:38.373 } 00:26:38.373 }, 00:26:38.373 "base_bdevs_list": [ 00:26:38.373 { 00:26:38.373 "name": "spare", 00:26:38.373 "uuid": "b32d6987-75e3-55c3-9077-5015db04f834", 00:26:38.373 "is_configured": true, 00:26:38.373 "data_offset": 0, 00:26:38.373 "data_size": 65536 00:26:38.373 }, 00:26:38.373 { 00:26:38.373 "name": "BaseBdev2", 00:26:38.373 "uuid": "eefe5e3c-46be-5efe-8db3-fa000c478c62", 00:26:38.373 "is_configured": true, 00:26:38.373 "data_offset": 0, 00:26:38.373 "data_size": 65536 00:26:38.373 }, 00:26:38.373 { 00:26:38.373 "name": "BaseBdev3", 00:26:38.373 "uuid": "5de50c2f-4e46-5bba-944b-bd515c9c3adc", 00:26:38.373 "is_configured": true, 00:26:38.373 "data_offset": 0, 00:26:38.373 "data_size": 65536 00:26:38.373 }, 00:26:38.373 { 00:26:38.373 "name": "BaseBdev4", 00:26:38.373 "uuid": "747a0705-788d-53ac-9be4-7f9163265839", 00:26:38.373 "is_configured": true, 00:26:38.373 "data_offset": 0, 00:26:38.373 "data_size": 65536 00:26:38.373 } 00:26:38.373 ] 00:26:38.373 }' 00:26:38.373 22:55:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:38.631 22:55:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:38.631 22:55:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:38.631 22:55:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:38.631 22:55:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:38.890 [2024-07-15 22:55:23.545380] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:38.890 [2024-07-15 22:55:23.665166] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:38.890 [2024-07-15 22:55:23.686849] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:38.890 [2024-07-15 22:55:23.686887] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:38.890 [2024-07-15 22:55:23.686899] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:38.890 [2024-07-15 22:55:23.712061] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1815970 00:26:38.890 22:55:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:38.890 22:55:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:38.890 22:55:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:38.890 22:55:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:38.890 22:55:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:38.890 22:55:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:38.890 22:55:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:38.890 22:55:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:38.890 22:55:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:38.890 22:55:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 
00:26:38.890 22:55:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:38.890 22:55:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:39.149 22:55:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:39.149 "name": "raid_bdev1", 00:26:39.149 "uuid": "c16648a2-bb6e-40f6-baa6-7e564dd232c0", 00:26:39.149 "strip_size_kb": 0, 00:26:39.149 "state": "online", 00:26:39.149 "raid_level": "raid1", 00:26:39.149 "superblock": false, 00:26:39.149 "num_base_bdevs": 4, 00:26:39.149 "num_base_bdevs_discovered": 3, 00:26:39.149 "num_base_bdevs_operational": 3, 00:26:39.149 "base_bdevs_list": [ 00:26:39.149 { 00:26:39.149 "name": null, 00:26:39.149 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:39.149 "is_configured": false, 00:26:39.149 "data_offset": 0, 00:26:39.149 "data_size": 65536 00:26:39.149 }, 00:26:39.149 { 00:26:39.149 "name": "BaseBdev2", 00:26:39.149 "uuid": "eefe5e3c-46be-5efe-8db3-fa000c478c62", 00:26:39.149 "is_configured": true, 00:26:39.149 "data_offset": 0, 00:26:39.149 "data_size": 65536 00:26:39.149 }, 00:26:39.149 { 00:26:39.149 "name": "BaseBdev3", 00:26:39.149 "uuid": "5de50c2f-4e46-5bba-944b-bd515c9c3adc", 00:26:39.149 "is_configured": true, 00:26:39.149 "data_offset": 0, 00:26:39.149 "data_size": 65536 00:26:39.149 }, 00:26:39.149 { 00:26:39.149 "name": "BaseBdev4", 00:26:39.149 "uuid": "747a0705-788d-53ac-9be4-7f9163265839", 00:26:39.149 "is_configured": true, 00:26:39.149 "data_offset": 0, 00:26:39.149 "data_size": 65536 00:26:39.149 } 00:26:39.149 ] 00:26:39.149 }' 00:26:39.149 22:55:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:39.149 22:55:24 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:40.084 22:55:24 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:40.084 22:55:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:40.084 22:55:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:40.084 22:55:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:40.084 22:55:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:40.084 22:55:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:40.084 22:55:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:40.084 22:55:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:40.084 "name": "raid_bdev1", 00:26:40.084 "uuid": "c16648a2-bb6e-40f6-baa6-7e564dd232c0", 00:26:40.084 "strip_size_kb": 0, 00:26:40.084 "state": "online", 00:26:40.084 "raid_level": "raid1", 00:26:40.084 "superblock": false, 00:26:40.084 "num_base_bdevs": 4, 00:26:40.084 "num_base_bdevs_discovered": 3, 00:26:40.084 "num_base_bdevs_operational": 3, 00:26:40.084 "base_bdevs_list": [ 00:26:40.084 { 00:26:40.084 "name": null, 00:26:40.084 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:40.085 "is_configured": false, 00:26:40.085 "data_offset": 0, 00:26:40.085 "data_size": 65536 00:26:40.085 }, 00:26:40.085 { 00:26:40.085 "name": "BaseBdev2", 00:26:40.085 "uuid": "eefe5e3c-46be-5efe-8db3-fa000c478c62", 00:26:40.085 "is_configured": true, 00:26:40.085 "data_offset": 0, 00:26:40.085 "data_size": 65536 00:26:40.085 }, 00:26:40.085 { 00:26:40.085 "name": "BaseBdev3", 00:26:40.085 "uuid": "5de50c2f-4e46-5bba-944b-bd515c9c3adc", 00:26:40.085 "is_configured": true, 00:26:40.085 "data_offset": 0, 00:26:40.085 "data_size": 65536 00:26:40.085 }, 00:26:40.085 { 
00:26:40.085 "name": "BaseBdev4", 00:26:40.085 "uuid": "747a0705-788d-53ac-9be4-7f9163265839", 00:26:40.085 "is_configured": true, 00:26:40.085 "data_offset": 0, 00:26:40.085 "data_size": 65536 00:26:40.085 } 00:26:40.085 ] 00:26:40.085 }' 00:26:40.085 22:55:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:40.085 22:55:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:40.085 22:55:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:40.085 22:55:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:40.085 22:55:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:40.357 [2024-07-15 22:55:25.219042] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:40.616 22:55:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:40.616 [2024-07-15 22:55:25.284569] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18838f0 00:26:40.616 [2024-07-15 22:55:25.286114] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:40.616 [2024-07-15 22:55:25.414622] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:40.874 [2024-07-15 22:55:25.556292] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:40.874 [2024-07-15 22:55:25.556476] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:41.132 [2024-07-15 22:55:25.883705] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 
12288 00:26:41.391 [2024-07-15 22:55:26.044150] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:41.391 [2024-07-15 22:55:26.044720] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:41.391 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:41.391 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:41.391 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:41.391 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:41.391 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:41.391 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:41.391 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:41.651 [2024-07-15 22:55:26.454282] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:41.651 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:41.651 "name": "raid_bdev1", 00:26:41.651 "uuid": "c16648a2-bb6e-40f6-baa6-7e564dd232c0", 00:26:41.651 "strip_size_kb": 0, 00:26:41.651 "state": "online", 00:26:41.651 "raid_level": "raid1", 00:26:41.651 "superblock": false, 00:26:41.651 "num_base_bdevs": 4, 00:26:41.651 "num_base_bdevs_discovered": 4, 00:26:41.651 "num_base_bdevs_operational": 4, 00:26:41.651 "process": { 00:26:41.651 "type": "rebuild", 00:26:41.651 "target": "spare", 00:26:41.651 "progress": { 00:26:41.651 "blocks": 14336, 
00:26:41.651 "percent": 21 00:26:41.651 } 00:26:41.651 }, 00:26:41.651 "base_bdevs_list": [ 00:26:41.651 { 00:26:41.651 "name": "spare", 00:26:41.651 "uuid": "b32d6987-75e3-55c3-9077-5015db04f834", 00:26:41.651 "is_configured": true, 00:26:41.651 "data_offset": 0, 00:26:41.651 "data_size": 65536 00:26:41.651 }, 00:26:41.651 { 00:26:41.651 "name": "BaseBdev2", 00:26:41.651 "uuid": "eefe5e3c-46be-5efe-8db3-fa000c478c62", 00:26:41.651 "is_configured": true, 00:26:41.651 "data_offset": 0, 00:26:41.651 "data_size": 65536 00:26:41.651 }, 00:26:41.651 { 00:26:41.651 "name": "BaseBdev3", 00:26:41.651 "uuid": "5de50c2f-4e46-5bba-944b-bd515c9c3adc", 00:26:41.651 "is_configured": true, 00:26:41.651 "data_offset": 0, 00:26:41.651 "data_size": 65536 00:26:41.651 }, 00:26:41.651 { 00:26:41.651 "name": "BaseBdev4", 00:26:41.651 "uuid": "747a0705-788d-53ac-9be4-7f9163265839", 00:26:41.651 "is_configured": true, 00:26:41.651 "data_offset": 0, 00:26:41.651 "data_size": 65536 00:26:41.651 } 00:26:41.651 ] 00:26:41.651 }' 00:26:41.651 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:41.910 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:41.910 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:41.910 [2024-07-15 22:55:26.580423] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:41.910 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:41.910 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:26:41.910 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:26:41.910 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:26:41.910 22:55:26 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:26:41.910 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:26:42.168 [2024-07-15 22:55:26.856858] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:42.168 [2024-07-15 22:55:26.936121] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1815970 00:26:42.168 [2024-07-15 22:55:26.936148] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x18838f0 00:26:42.168 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:26:42.168 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:26:42.168 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:42.168 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:42.168 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:42.168 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:42.168 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:42.168 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:42.168 22:55:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:42.168 [2024-07-15 22:55:27.071128] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:26:42.427 22:55:27 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:42.427 "name": "raid_bdev1", 00:26:42.427 "uuid": "c16648a2-bb6e-40f6-baa6-7e564dd232c0", 00:26:42.427 "strip_size_kb": 0, 00:26:42.427 "state": "online", 00:26:42.427 "raid_level": "raid1", 00:26:42.427 "superblock": false, 00:26:42.427 "num_base_bdevs": 4, 00:26:42.427 "num_base_bdevs_discovered": 3, 00:26:42.427 "num_base_bdevs_operational": 3, 00:26:42.427 "process": { 00:26:42.427 "type": "rebuild", 00:26:42.427 "target": "spare", 00:26:42.427 "progress": { 00:26:42.427 "blocks": 22528, 00:26:42.427 "percent": 34 00:26:42.427 } 00:26:42.427 }, 00:26:42.427 "base_bdevs_list": [ 00:26:42.427 { 00:26:42.427 "name": "spare", 00:26:42.427 "uuid": "b32d6987-75e3-55c3-9077-5015db04f834", 00:26:42.427 "is_configured": true, 00:26:42.427 "data_offset": 0, 00:26:42.427 "data_size": 65536 00:26:42.427 }, 00:26:42.427 { 00:26:42.427 "name": null, 00:26:42.427 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:42.427 "is_configured": false, 00:26:42.427 "data_offset": 0, 00:26:42.427 "data_size": 65536 00:26:42.427 }, 00:26:42.427 { 00:26:42.427 "name": "BaseBdev3", 00:26:42.427 "uuid": "5de50c2f-4e46-5bba-944b-bd515c9c3adc", 00:26:42.427 "is_configured": true, 00:26:42.427 "data_offset": 0, 00:26:42.427 "data_size": 65536 00:26:42.427 }, 00:26:42.427 { 00:26:42.427 "name": "BaseBdev4", 00:26:42.427 "uuid": "747a0705-788d-53ac-9be4-7f9163265839", 00:26:42.427 "is_configured": true, 00:26:42.427 "data_offset": 0, 00:26:42.427 "data_size": 65536 00:26:42.427 } 00:26:42.427 ] 00:26:42.427 }' 00:26:42.427 22:55:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:42.427 22:55:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:42.427 22:55:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:42.427 22:55:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare 
== \s\p\a\r\e ]] 00:26:42.427 22:55:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=975 00:26:42.427 22:55:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:42.427 22:55:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:42.427 [2024-07-15 22:55:27.314173] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:26:42.427 22:55:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:42.427 22:55:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:42.427 22:55:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:42.427 22:55:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:42.428 22:55:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:42.428 22:55:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:42.686 [2024-07-15 22:55:27.537400] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:26:42.686 22:55:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:42.686 "name": "raid_bdev1", 00:26:42.686 "uuid": "c16648a2-bb6e-40f6-baa6-7e564dd232c0", 00:26:42.686 "strip_size_kb": 0, 00:26:42.686 "state": "online", 00:26:42.686 "raid_level": "raid1", 00:26:42.686 "superblock": false, 00:26:42.686 "num_base_bdevs": 4, 00:26:42.686 "num_base_bdevs_discovered": 3, 00:26:42.686 "num_base_bdevs_operational": 3, 00:26:42.686 "process": { 00:26:42.686 "type": "rebuild", 00:26:42.686 "target": "spare", 
00:26:42.686 "progress": { 00:26:42.686 "blocks": 28672, 00:26:42.686 "percent": 43 00:26:42.686 } 00:26:42.686 }, 00:26:42.686 "base_bdevs_list": [ 00:26:42.686 { 00:26:42.686 "name": "spare", 00:26:42.686 "uuid": "b32d6987-75e3-55c3-9077-5015db04f834", 00:26:42.686 "is_configured": true, 00:26:42.686 "data_offset": 0, 00:26:42.686 "data_size": 65536 00:26:42.686 }, 00:26:42.686 { 00:26:42.686 "name": null, 00:26:42.686 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:42.686 "is_configured": false, 00:26:42.686 "data_offset": 0, 00:26:42.687 "data_size": 65536 00:26:42.687 }, 00:26:42.687 { 00:26:42.687 "name": "BaseBdev3", 00:26:42.687 "uuid": "5de50c2f-4e46-5bba-944b-bd515c9c3adc", 00:26:42.687 "is_configured": true, 00:26:42.687 "data_offset": 0, 00:26:42.687 "data_size": 65536 00:26:42.687 }, 00:26:42.687 { 00:26:42.687 "name": "BaseBdev4", 00:26:42.687 "uuid": "747a0705-788d-53ac-9be4-7f9163265839", 00:26:42.687 "is_configured": true, 00:26:42.687 "data_offset": 0, 00:26:42.687 "data_size": 65536 00:26:42.687 } 00:26:42.687 ] 00:26:42.687 }' 00:26:42.687 22:55:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:42.946 22:55:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:42.946 22:55:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:42.946 22:55:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:42.946 22:55:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:43.204 [2024-07-15 22:55:27.895073] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:26:43.204 [2024-07-15 22:55:28.006589] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:26:43.772 [2024-07-15 22:55:28.380252] bdev_raid.c: 
839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:26:43.772 22:55:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:43.772 22:55:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:43.772 22:55:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:43.772 22:55:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:43.772 22:55:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:43.772 22:55:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:43.772 22:55:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:43.772 22:55:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:44.031 22:55:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:44.031 "name": "raid_bdev1", 00:26:44.031 "uuid": "c16648a2-bb6e-40f6-baa6-7e564dd232c0", 00:26:44.031 "strip_size_kb": 0, 00:26:44.031 "state": "online", 00:26:44.031 "raid_level": "raid1", 00:26:44.031 "superblock": false, 00:26:44.031 "num_base_bdevs": 4, 00:26:44.031 "num_base_bdevs_discovered": 3, 00:26:44.031 "num_base_bdevs_operational": 3, 00:26:44.031 "process": { 00:26:44.031 "type": "rebuild", 00:26:44.031 "target": "spare", 00:26:44.031 "progress": { 00:26:44.031 "blocks": 47104, 00:26:44.031 "percent": 71 00:26:44.031 } 00:26:44.031 }, 00:26:44.031 "base_bdevs_list": [ 00:26:44.031 { 00:26:44.031 "name": "spare", 00:26:44.031 "uuid": "b32d6987-75e3-55c3-9077-5015db04f834", 00:26:44.031 "is_configured": true, 00:26:44.031 "data_offset": 0, 00:26:44.031 "data_size": 
65536 00:26:44.031 }, 00:26:44.031 { 00:26:44.031 "name": null, 00:26:44.031 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:44.031 "is_configured": false, 00:26:44.031 "data_offset": 0, 00:26:44.031 "data_size": 65536 00:26:44.031 }, 00:26:44.031 { 00:26:44.031 "name": "BaseBdev3", 00:26:44.031 "uuid": "5de50c2f-4e46-5bba-944b-bd515c9c3adc", 00:26:44.031 "is_configured": true, 00:26:44.031 "data_offset": 0, 00:26:44.031 "data_size": 65536 00:26:44.031 }, 00:26:44.031 { 00:26:44.031 "name": "BaseBdev4", 00:26:44.031 "uuid": "747a0705-788d-53ac-9be4-7f9163265839", 00:26:44.031 "is_configured": true, 00:26:44.031 "data_offset": 0, 00:26:44.031 "data_size": 65536 00:26:44.031 } 00:26:44.031 ] 00:26:44.031 }' 00:26:44.031 22:55:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:44.031 22:55:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:44.031 22:55:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:44.290 22:55:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:44.290 22:55:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:44.290 [2024-07-15 22:55:29.045664] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:26:44.550 [2024-07-15 22:55:29.371931] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:26:45.117 [2024-07-15 22:55:29.828979] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:45.117 [2024-07-15 22:55:29.926400] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:45.117 [2024-07-15 22:55:29.928772] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:45.117 22:55:29 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:45.117 22:55:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:45.117 22:55:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:45.117 22:55:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:45.117 22:55:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:45.117 22:55:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:45.117 22:55:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:45.117 22:55:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:45.376 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:45.376 "name": "raid_bdev1", 00:26:45.376 "uuid": "c16648a2-bb6e-40f6-baa6-7e564dd232c0", 00:26:45.376 "strip_size_kb": 0, 00:26:45.376 "state": "online", 00:26:45.376 "raid_level": "raid1", 00:26:45.376 "superblock": false, 00:26:45.376 "num_base_bdevs": 4, 00:26:45.376 "num_base_bdevs_discovered": 3, 00:26:45.376 "num_base_bdevs_operational": 3, 00:26:45.376 "base_bdevs_list": [ 00:26:45.376 { 00:26:45.376 "name": "spare", 00:26:45.376 "uuid": "b32d6987-75e3-55c3-9077-5015db04f834", 00:26:45.376 "is_configured": true, 00:26:45.376 "data_offset": 0, 00:26:45.376 "data_size": 65536 00:26:45.376 }, 00:26:45.376 { 00:26:45.376 "name": null, 00:26:45.376 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:45.376 "is_configured": false, 00:26:45.376 "data_offset": 0, 00:26:45.376 "data_size": 65536 00:26:45.376 }, 00:26:45.376 { 00:26:45.376 "name": "BaseBdev3", 00:26:45.376 "uuid": 
"5de50c2f-4e46-5bba-944b-bd515c9c3adc", 00:26:45.376 "is_configured": true, 00:26:45.376 "data_offset": 0, 00:26:45.376 "data_size": 65536 00:26:45.376 }, 00:26:45.376 { 00:26:45.376 "name": "BaseBdev4", 00:26:45.376 "uuid": "747a0705-788d-53ac-9be4-7f9163265839", 00:26:45.376 "is_configured": true, 00:26:45.376 "data_offset": 0, 00:26:45.376 "data_size": 65536 00:26:45.376 } 00:26:45.376 ] 00:26:45.376 }' 00:26:45.376 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:45.376 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:45.376 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:45.635 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:45.635 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:26:45.635 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:45.635 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:45.635 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:45.635 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:45.635 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:45.635 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:45.635 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:45.893 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:45.893 "name": "raid_bdev1", 00:26:45.893 "uuid": 
"c16648a2-bb6e-40f6-baa6-7e564dd232c0", 00:26:45.893 "strip_size_kb": 0, 00:26:45.893 "state": "online", 00:26:45.893 "raid_level": "raid1", 00:26:45.893 "superblock": false, 00:26:45.893 "num_base_bdevs": 4, 00:26:45.893 "num_base_bdevs_discovered": 3, 00:26:45.893 "num_base_bdevs_operational": 3, 00:26:45.893 "base_bdevs_list": [ 00:26:45.893 { 00:26:45.893 "name": "spare", 00:26:45.893 "uuid": "b32d6987-75e3-55c3-9077-5015db04f834", 00:26:45.893 "is_configured": true, 00:26:45.893 "data_offset": 0, 00:26:45.893 "data_size": 65536 00:26:45.893 }, 00:26:45.893 { 00:26:45.893 "name": null, 00:26:45.893 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:45.893 "is_configured": false, 00:26:45.893 "data_offset": 0, 00:26:45.894 "data_size": 65536 00:26:45.894 }, 00:26:45.894 { 00:26:45.894 "name": "BaseBdev3", 00:26:45.894 "uuid": "5de50c2f-4e46-5bba-944b-bd515c9c3adc", 00:26:45.894 "is_configured": true, 00:26:45.894 "data_offset": 0, 00:26:45.894 "data_size": 65536 00:26:45.894 }, 00:26:45.894 { 00:26:45.894 "name": "BaseBdev4", 00:26:45.894 "uuid": "747a0705-788d-53ac-9be4-7f9163265839", 00:26:45.894 "is_configured": true, 00:26:45.894 "data_offset": 0, 00:26:45.894 "data_size": 65536 00:26:45.894 } 00:26:45.894 ] 00:26:45.894 }' 00:26:45.894 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:45.894 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:45.894 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:45.894 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:45.894 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:45.894 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:45.894 22:55:30 bdev_raid.raid_rebuild_test_io 
-- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:45.894 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:45.894 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:45.894 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:45.894 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:45.894 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:45.894 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:45.894 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:45.894 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:45.894 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:46.153 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:46.153 "name": "raid_bdev1", 00:26:46.153 "uuid": "c16648a2-bb6e-40f6-baa6-7e564dd232c0", 00:26:46.153 "strip_size_kb": 0, 00:26:46.153 "state": "online", 00:26:46.153 "raid_level": "raid1", 00:26:46.153 "superblock": false, 00:26:46.153 "num_base_bdevs": 4, 00:26:46.153 "num_base_bdevs_discovered": 3, 00:26:46.153 "num_base_bdevs_operational": 3, 00:26:46.153 "base_bdevs_list": [ 00:26:46.153 { 00:26:46.153 "name": "spare", 00:26:46.153 "uuid": "b32d6987-75e3-55c3-9077-5015db04f834", 00:26:46.153 "is_configured": true, 00:26:46.153 "data_offset": 0, 00:26:46.153 "data_size": 65536 00:26:46.153 }, 00:26:46.153 { 00:26:46.153 "name": null, 00:26:46.153 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:46.153 "is_configured": false, 00:26:46.153 
"data_offset": 0, 00:26:46.153 "data_size": 65536 00:26:46.153 }, 00:26:46.153 { 00:26:46.153 "name": "BaseBdev3", 00:26:46.153 "uuid": "5de50c2f-4e46-5bba-944b-bd515c9c3adc", 00:26:46.153 "is_configured": true, 00:26:46.153 "data_offset": 0, 00:26:46.153 "data_size": 65536 00:26:46.153 }, 00:26:46.153 { 00:26:46.153 "name": "BaseBdev4", 00:26:46.153 "uuid": "747a0705-788d-53ac-9be4-7f9163265839", 00:26:46.153 "is_configured": true, 00:26:46.153 "data_offset": 0, 00:26:46.153 "data_size": 65536 00:26:46.153 } 00:26:46.153 ] 00:26:46.153 }' 00:26:46.153 22:55:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:46.153 22:55:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:46.721 22:55:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:46.979 [2024-07-15 22:55:31.805811] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:46.979 [2024-07-15 22:55:31.805847] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:47.239 00:26:47.239 Latency(us) 00:26:47.239 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:47.239 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:26:47.239 raid_bdev1 : 11.22 91.07 273.20 0.00 0.00 14963.58 309.87 124005.51 00:26:47.239 =================================================================================================================== 00:26:47.239 Total : 91.07 273.20 0.00 0.00 14963.58 309.87 124005.51 00:26:47.239 [2024-07-15 22:55:31.919907] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:47.239 [2024-07-15 22:55:31.919950] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:47.239 [2024-07-15 22:55:31.920044] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:47.239 [2024-07-15 22:55:31.920056] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x180f8a0 name raid_bdev1, state offline 00:26:47.239 0 00:26:47.239 22:55:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:47.239 22:55:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:26:47.498 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:47.498 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:26:47.498 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:26:47.498 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:26:47.498 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:47.498 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:26:47.498 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:47.498 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:47.498 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:47.498 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:26:47.498 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:47.498 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:47.498 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk 
spare /dev/nbd0 00:26:47.758 /dev/nbd0 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:47.758 1+0 records in 00:26:47.758 1+0 records out 00:26:47.758 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000290682 s, 14.1 MB/s 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:47.758 22:55:32 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:47.758 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:26:48.016 /dev/nbd1 00:26:48.016 
22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:48.016 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:48.016 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:48.016 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:26:48.016 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:48.016 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:48.016 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:48.016 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:26:48.016 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:48.016 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:48.016 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:48.016 1+0 records in 00:26:48.016 1+0 records out 00:26:48.017 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000299737 s, 13.7 MB/s 00:26:48.017 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:48.017 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:26:48.017 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:48.017 22:55:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:48.017 22:55:32 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@887 -- # return 0 00:26:48.017 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:48.017 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:48.017 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:26:48.017 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:48.017 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:48.017 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:48.017 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:48.017 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:26:48.017 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:48.017 22:55:32 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:48.275 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:48.275 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:48.275 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:48.275 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:48.275 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:48.275 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:48.275 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:26:48.275 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 
00:26:48.275 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:48.275 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:26:48.275 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:26:48.275 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:48.275 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:26:48.275 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:48.275 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:48.275 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:48.275 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:26:48.275 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:48.275 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:48.275 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:26:48.533 /dev/nbd1 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:48.533 1+0 records in 00:26:48.533 1+0 records out 00:26:48.533 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000263262 s, 15.6 MB/s 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:48.533 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:48.790 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:48.790 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:48.790 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:48.790 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:48.790 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:48.790 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:48.790 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:26:48.790 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:48.790 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:48.790 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:48.790 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:48.790 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:48.790 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:26:48.790 
22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:48.790 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:49.048 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:49.048 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:49.048 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:49.048 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:49.049 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:49.049 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:49.049 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:26:49.049 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:49.049 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:26:49.049 22:55:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 2833194 00:26:49.049 22:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 2833194 ']' 00:26:49.049 22:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 2833194 00:26:49.049 22:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:26:49.049 22:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:49.049 22:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2833194 00:26:49.307 22:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:49.307 22:55:33 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:49.307 22:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2833194' 00:26:49.307 killing process with pid 2833194 00:26:49.307 22:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 2833194 00:26:49.307 Received shutdown signal, test time was about 13.262700 seconds 00:26:49.307 00:26:49.307 Latency(us) 00:26:49.307 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:49.307 =================================================================================================================== 00:26:49.307 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:49.307 [2024-07-15 22:55:33.961071] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:49.307 22:55:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 2833194 00:26:49.307 [2024-07-15 22:55:34.004139] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:49.566 22:55:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:26:49.566 00:26:49.566 real 0m18.914s 00:26:49.566 user 0m29.339s 00:26:49.566 sys 0m3.391s 00:26:49.566 22:55:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:49.566 22:55:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:49.566 ************************************ 00:26:49.566 END TEST raid_rebuild_test_io 00:26:49.566 ************************************ 00:26:49.566 22:55:34 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:49.566 22:55:34 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:26:49.567 22:55:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:26:49.567 22:55:34 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 
00:26:49.567 22:55:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:49.567 ************************************ 00:26:49.567 START TEST raid_rebuild_test_sb_io 00:26:49.567 ************************************ 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true true true 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2835859 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2835859 /var/tmp/spdk-raid.sock 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T 
raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 2835859 ']' 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:49.567 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:49.567 22:55:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:49.567 [2024-07-15 22:55:34.404978] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:26:49.567 [2024-07-15 22:55:34.405061] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2835859 ] 00:26:49.567 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:49.567 Zero copy mechanism will not be used. 
00:26:49.826 [2024-07-15 22:55:34.537045] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:49.826 [2024-07-15 22:55:34.643273] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:49.826 [2024-07-15 22:55:34.708258] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:49.826 [2024-07-15 22:55:34.708299] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:50.392 22:55:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:50.392 22:55:35 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:26:50.392 22:55:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:50.392 22:55:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:26:50.650 BaseBdev1_malloc 00:26:50.650 22:55:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:50.650 [2024-07-15 22:55:35.541078] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:50.650 [2024-07-15 22:55:35.541126] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:50.650 [2024-07-15 22:55:35.541149] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10a7d40 00:26:50.650 [2024-07-15 22:55:35.541161] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:50.650 [2024-07-15 22:55:35.542772] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:50.650 [2024-07-15 22:55:35.542802] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:50.650 
BaseBdev1 00:26:50.908 22:55:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:50.908 22:55:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:26:50.908 BaseBdev2_malloc 00:26:51.166 22:55:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:51.166 [2024-07-15 22:55:35.979189] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:51.166 [2024-07-15 22:55:35.979236] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:51.166 [2024-07-15 22:55:35.979259] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10a8860 00:26:51.166 [2024-07-15 22:55:35.979271] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:51.166 [2024-07-15 22:55:35.980721] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:51.166 [2024-07-15 22:55:35.980750] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:51.166 BaseBdev2 00:26:51.166 22:55:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:51.166 22:55:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:26:51.457 BaseBdev3_malloc 00:26:51.457 22:55:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:26:51.457 [2024-07-15 
22:55:36.328656] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:26:51.457 [2024-07-15 22:55:36.328705] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:51.457 [2024-07-15 22:55:36.328724] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12558f0 00:26:51.457 [2024-07-15 22:55:36.328736] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:51.457 [2024-07-15 22:55:36.330107] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:51.458 [2024-07-15 22:55:36.330134] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:26:51.458 BaseBdev3 00:26:51.458 22:55:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:51.458 22:55:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:26:51.716 BaseBdev4_malloc 00:26:51.716 22:55:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:26:51.974 [2024-07-15 22:55:36.686126] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:26:51.974 [2024-07-15 22:55:36.686168] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:51.974 [2024-07-15 22:55:36.686187] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1254ad0 00:26:51.974 [2024-07-15 22:55:36.686199] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:51.974 [2024-07-15 22:55:36.687550] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:51.974 [2024-07-15 22:55:36.687578] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:26:51.974 BaseBdev4 00:26:51.974 22:55:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:26:51.974 spare_malloc 00:26:52.232 22:55:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:52.232 spare_delay 00:26:52.232 22:55:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:52.490 [2024-07-15 22:55:37.220021] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:52.490 [2024-07-15 22:55:37.220064] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:52.490 [2024-07-15 22:55:37.220082] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12595b0 00:26:52.490 [2024-07-15 22:55:37.220100] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:52.490 [2024-07-15 22:55:37.221511] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:52.490 [2024-07-15 22:55:37.221541] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:52.490 spare 00:26:52.490 22:55:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:26:52.490 [2024-07-15 22:55:37.392517] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:52.490 [2024-07-15 
22:55:37.393730] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:52.490 [2024-07-15 22:55:37.393782] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:52.490 [2024-07-15 22:55:37.393827] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:52.490 [2024-07-15 22:55:37.394027] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11d88a0 00:26:52.490 [2024-07-15 22:55:37.394039] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:52.490 [2024-07-15 22:55:37.394222] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1252e10 00:26:52.490 [2024-07-15 22:55:37.394367] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11d88a0 00:26:52.490 [2024-07-15 22:55:37.394378] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11d88a0 00:26:52.490 [2024-07-15 22:55:37.394465] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:52.748 22:55:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:26:52.749 22:55:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:52.749 22:55:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:52.749 22:55:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:52.749 22:55:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:52.749 22:55:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:52.749 22:55:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:52.749 22:55:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 
-- # local num_base_bdevs 00:26:52.749 22:55:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:52.749 22:55:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:52.749 22:55:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:52.749 22:55:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:53.007 22:55:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:53.007 "name": "raid_bdev1", 00:26:53.007 "uuid": "ce50295a-9fa1-4275-881f-db851a89f47c", 00:26:53.007 "strip_size_kb": 0, 00:26:53.007 "state": "online", 00:26:53.007 "raid_level": "raid1", 00:26:53.007 "superblock": true, 00:26:53.007 "num_base_bdevs": 4, 00:26:53.007 "num_base_bdevs_discovered": 4, 00:26:53.007 "num_base_bdevs_operational": 4, 00:26:53.007 "base_bdevs_list": [ 00:26:53.007 { 00:26:53.007 "name": "BaseBdev1", 00:26:53.007 "uuid": "0d60a540-1923-5157-9600-76f858bb0482", 00:26:53.007 "is_configured": true, 00:26:53.007 "data_offset": 2048, 00:26:53.007 "data_size": 63488 00:26:53.007 }, 00:26:53.007 { 00:26:53.007 "name": "BaseBdev2", 00:26:53.008 "uuid": "4505287b-305f-58fd-814b-d6ae884da56f", 00:26:53.008 "is_configured": true, 00:26:53.008 "data_offset": 2048, 00:26:53.008 "data_size": 63488 00:26:53.008 }, 00:26:53.008 { 00:26:53.008 "name": "BaseBdev3", 00:26:53.008 "uuid": "64359613-d4e5-507a-bbb7-e94a83b01643", 00:26:53.008 "is_configured": true, 00:26:53.008 "data_offset": 2048, 00:26:53.008 "data_size": 63488 00:26:53.008 }, 00:26:53.008 { 00:26:53.008 "name": "BaseBdev4", 00:26:53.008 "uuid": "a6f1c1c6-4c02-5c91-aeac-4540da6c1408", 00:26:53.008 "is_configured": true, 00:26:53.008 "data_offset": 2048, 00:26:53.008 "data_size": 63488 00:26:53.008 } 00:26:53.008 ] 
00:26:53.008 }' 00:26:53.008 22:55:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:53.008 22:55:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:53.574 22:55:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:26:53.574 22:55:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:53.831 [2024-07-15 22:55:38.483804] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:53.831 22:55:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:26:53.831 22:55:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:53.831 22:55:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:54.088 22:55:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:26:54.089 22:55:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:26:54.089 22:55:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:54.089 22:55:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:26:54.089 [2024-07-15 22:55:38.866626] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10a7670 00:26:54.089 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:54.089 Zero copy mechanism will not be used. 
00:26:54.089 Running I/O for 60 seconds... 00:26:54.089 [2024-07-15 22:55:38.982747] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:54.347 [2024-07-15 22:55:38.999003] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x10a7670 00:26:54.347 22:55:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:54.347 22:55:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:54.347 22:55:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:54.347 22:55:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:54.347 22:55:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:54.347 22:55:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:54.347 22:55:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:54.347 22:55:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:54.347 22:55:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:54.347 22:55:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:54.347 22:55:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:54.347 22:55:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:54.605 22:55:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:54.605 "name": "raid_bdev1", 00:26:54.605 "uuid": "ce50295a-9fa1-4275-881f-db851a89f47c", 00:26:54.605 "strip_size_kb": 0, 00:26:54.605 "state": 
"online", 00:26:54.605 "raid_level": "raid1", 00:26:54.605 "superblock": true, 00:26:54.605 "num_base_bdevs": 4, 00:26:54.605 "num_base_bdevs_discovered": 3, 00:26:54.605 "num_base_bdevs_operational": 3, 00:26:54.605 "base_bdevs_list": [ 00:26:54.605 { 00:26:54.605 "name": null, 00:26:54.605 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:54.605 "is_configured": false, 00:26:54.605 "data_offset": 2048, 00:26:54.605 "data_size": 63488 00:26:54.605 }, 00:26:54.605 { 00:26:54.605 "name": "BaseBdev2", 00:26:54.605 "uuid": "4505287b-305f-58fd-814b-d6ae884da56f", 00:26:54.605 "is_configured": true, 00:26:54.605 "data_offset": 2048, 00:26:54.605 "data_size": 63488 00:26:54.605 }, 00:26:54.605 { 00:26:54.606 "name": "BaseBdev3", 00:26:54.606 "uuid": "64359613-d4e5-507a-bbb7-e94a83b01643", 00:26:54.606 "is_configured": true, 00:26:54.606 "data_offset": 2048, 00:26:54.606 "data_size": 63488 00:26:54.606 }, 00:26:54.606 { 00:26:54.606 "name": "BaseBdev4", 00:26:54.606 "uuid": "a6f1c1c6-4c02-5c91-aeac-4540da6c1408", 00:26:54.606 "is_configured": true, 00:26:54.606 "data_offset": 2048, 00:26:54.606 "data_size": 63488 00:26:54.606 } 00:26:54.606 ] 00:26:54.606 }' 00:26:54.606 22:55:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:54.606 22:55:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:55.172 22:55:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:55.430 [2024-07-15 22:55:40.154427] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:55.430 22:55:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:26:55.430 [2024-07-15 22:55:40.211038] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11dab40 00:26:55.430 [2024-07-15 22:55:40.213447] 
bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:55.687 [2024-07-15 22:55:40.458306] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:55.687 [2024-07-15 22:55:40.458498] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:55.944 [2024-07-15 22:55:40.713874] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:55.944 [2024-07-15 22:55:40.835570] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:55.944 [2024-07-15 22:55:40.835829] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:56.202 [2024-07-15 22:55:41.081590] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:56.461 22:55:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:56.461 22:55:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:56.461 22:55:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:56.461 22:55:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:56.461 22:55:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:56.461 22:55:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:56.461 22:55:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:56.720 22:55:41 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:56.720 "name": "raid_bdev1", 00:26:56.720 "uuid": "ce50295a-9fa1-4275-881f-db851a89f47c", 00:26:56.720 "strip_size_kb": 0, 00:26:56.720 "state": "online", 00:26:56.720 "raid_level": "raid1", 00:26:56.720 "superblock": true, 00:26:56.720 "num_base_bdevs": 4, 00:26:56.720 "num_base_bdevs_discovered": 4, 00:26:56.720 "num_base_bdevs_operational": 4, 00:26:56.720 "process": { 00:26:56.720 "type": "rebuild", 00:26:56.720 "target": "spare", 00:26:56.720 "progress": { 00:26:56.720 "blocks": 16384, 00:26:56.720 "percent": 25 00:26:56.720 } 00:26:56.720 }, 00:26:56.720 "base_bdevs_list": [ 00:26:56.720 { 00:26:56.720 "name": "spare", 00:26:56.720 "uuid": "2cd990db-1827-5a50-86ed-c713d80707ed", 00:26:56.720 "is_configured": true, 00:26:56.720 "data_offset": 2048, 00:26:56.720 "data_size": 63488 00:26:56.720 }, 00:26:56.720 { 00:26:56.720 "name": "BaseBdev2", 00:26:56.720 "uuid": "4505287b-305f-58fd-814b-d6ae884da56f", 00:26:56.720 "is_configured": true, 00:26:56.720 "data_offset": 2048, 00:26:56.720 "data_size": 63488 00:26:56.720 }, 00:26:56.720 { 00:26:56.720 "name": "BaseBdev3", 00:26:56.720 "uuid": "64359613-d4e5-507a-bbb7-e94a83b01643", 00:26:56.720 "is_configured": true, 00:26:56.720 "data_offset": 2048, 00:26:56.720 "data_size": 63488 00:26:56.720 }, 00:26:56.720 { 00:26:56.720 "name": "BaseBdev4", 00:26:56.720 "uuid": "a6f1c1c6-4c02-5c91-aeac-4540da6c1408", 00:26:56.720 "is_configured": true, 00:26:56.720 "data_offset": 2048, 00:26:56.720 "data_size": 63488 00:26:56.720 } 00:26:56.720 ] 00:26:56.720 }' 00:26:56.720 22:55:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:56.720 22:55:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:56.720 22:55:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:56.720 22:55:41 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:56.720 22:55:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:56.720 [2024-07-15 22:55:41.584388] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:26:56.986 [2024-07-15 22:55:41.723810] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:26:56.986 [2024-07-15 22:55:41.795332] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:57.245 [2024-07-15 22:55:41.955611] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:57.245 [2024-07-15 22:55:41.967924] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:57.245 [2024-07-15 22:55:41.967965] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:57.245 [2024-07-15 22:55:41.967977] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:57.245 [2024-07-15 22:55:41.983805] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x10a7670 00:26:57.245 22:55:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:57.245 22:55:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:57.245 22:55:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:57.245 22:55:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:57.245 22:55:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:57.245 22:55:42 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:57.245 22:55:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:57.245 22:55:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:57.245 22:55:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:57.245 22:55:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:57.245 22:55:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:57.245 22:55:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:57.505 22:55:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:57.505 "name": "raid_bdev1", 00:26:57.505 "uuid": "ce50295a-9fa1-4275-881f-db851a89f47c", 00:26:57.505 "strip_size_kb": 0, 00:26:57.505 "state": "online", 00:26:57.505 "raid_level": "raid1", 00:26:57.505 "superblock": true, 00:26:57.505 "num_base_bdevs": 4, 00:26:57.505 "num_base_bdevs_discovered": 3, 00:26:57.505 "num_base_bdevs_operational": 3, 00:26:57.505 "base_bdevs_list": [ 00:26:57.505 { 00:26:57.505 "name": null, 00:26:57.505 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:57.505 "is_configured": false, 00:26:57.505 "data_offset": 2048, 00:26:57.505 "data_size": 63488 00:26:57.505 }, 00:26:57.505 { 00:26:57.505 "name": "BaseBdev2", 00:26:57.505 "uuid": "4505287b-305f-58fd-814b-d6ae884da56f", 00:26:57.505 "is_configured": true, 00:26:57.505 "data_offset": 2048, 00:26:57.505 "data_size": 63488 00:26:57.505 }, 00:26:57.505 { 00:26:57.505 "name": "BaseBdev3", 00:26:57.505 "uuid": "64359613-d4e5-507a-bbb7-e94a83b01643", 00:26:57.505 "is_configured": true, 00:26:57.505 "data_offset": 2048, 00:26:57.505 
"data_size": 63488 00:26:57.505 }, 00:26:57.505 { 00:26:57.505 "name": "BaseBdev4", 00:26:57.505 "uuid": "a6f1c1c6-4c02-5c91-aeac-4540da6c1408", 00:26:57.505 "is_configured": true, 00:26:57.505 "data_offset": 2048, 00:26:57.505 "data_size": 63488 00:26:57.505 } 00:26:57.505 ] 00:26:57.505 }' 00:26:57.505 22:55:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:57.505 22:55:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:58.096 22:55:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:58.096 22:55:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:58.096 22:55:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:58.096 22:55:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:58.096 22:55:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:58.096 22:55:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:58.096 22:55:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:58.355 22:55:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:58.355 "name": "raid_bdev1", 00:26:58.355 "uuid": "ce50295a-9fa1-4275-881f-db851a89f47c", 00:26:58.355 "strip_size_kb": 0, 00:26:58.355 "state": "online", 00:26:58.355 "raid_level": "raid1", 00:26:58.355 "superblock": true, 00:26:58.355 "num_base_bdevs": 4, 00:26:58.355 "num_base_bdevs_discovered": 3, 00:26:58.355 "num_base_bdevs_operational": 3, 00:26:58.355 "base_bdevs_list": [ 00:26:58.355 { 00:26:58.355 "name": null, 00:26:58.355 "uuid": "00000000-0000-0000-0000-000000000000", 
00:26:58.355 "is_configured": false, 00:26:58.355 "data_offset": 2048, 00:26:58.355 "data_size": 63488 00:26:58.355 }, 00:26:58.356 { 00:26:58.356 "name": "BaseBdev2", 00:26:58.356 "uuid": "4505287b-305f-58fd-814b-d6ae884da56f", 00:26:58.356 "is_configured": true, 00:26:58.356 "data_offset": 2048, 00:26:58.356 "data_size": 63488 00:26:58.356 }, 00:26:58.356 { 00:26:58.356 "name": "BaseBdev3", 00:26:58.356 "uuid": "64359613-d4e5-507a-bbb7-e94a83b01643", 00:26:58.356 "is_configured": true, 00:26:58.356 "data_offset": 2048, 00:26:58.356 "data_size": 63488 00:26:58.356 }, 00:26:58.356 { 00:26:58.356 "name": "BaseBdev4", 00:26:58.356 "uuid": "a6f1c1c6-4c02-5c91-aeac-4540da6c1408", 00:26:58.356 "is_configured": true, 00:26:58.356 "data_offset": 2048, 00:26:58.356 "data_size": 63488 00:26:58.356 } 00:26:58.356 ] 00:26:58.356 }' 00:26:58.356 22:55:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:58.356 22:55:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:58.356 22:55:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:58.614 22:55:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:58.614 22:55:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:58.614 [2024-07-15 22:55:43.506473] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:58.875 22:55:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:58.875 [2024-07-15 22:55:43.580500] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11dc8f0 00:26:58.875 [2024-07-15 22:55:43.582036] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:58.875 
[2024-07-15 22:55:43.722713] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:59.133 [2024-07-15 22:55:43.957029] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:59.133 [2024-07-15 22:55:43.957729] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:59.700 [2024-07-15 22:55:44.305129] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:59.700 [2024-07-15 22:55:44.438646] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:59.700 22:55:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:59.700 22:55:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:59.700 22:55:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:59.700 22:55:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:59.700 22:55:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:59.700 22:55:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:59.700 22:55:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:59.958 [2024-07-15 22:55:44.674306] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:59.958 [2024-07-15 22:55:44.675525] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 
12288 offset_end: 18432 00:26:59.958 22:55:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:59.958 "name": "raid_bdev1", 00:26:59.958 "uuid": "ce50295a-9fa1-4275-881f-db851a89f47c", 00:26:59.958 "strip_size_kb": 0, 00:26:59.958 "state": "online", 00:26:59.958 "raid_level": "raid1", 00:26:59.958 "superblock": true, 00:26:59.958 "num_base_bdevs": 4, 00:26:59.958 "num_base_bdevs_discovered": 4, 00:26:59.958 "num_base_bdevs_operational": 4, 00:26:59.958 "process": { 00:26:59.958 "type": "rebuild", 00:26:59.958 "target": "spare", 00:26:59.958 "progress": { 00:26:59.958 "blocks": 14336, 00:26:59.958 "percent": 22 00:26:59.958 } 00:26:59.958 }, 00:26:59.958 "base_bdevs_list": [ 00:26:59.958 { 00:26:59.958 "name": "spare", 00:26:59.958 "uuid": "2cd990db-1827-5a50-86ed-c713d80707ed", 00:26:59.958 "is_configured": true, 00:26:59.958 "data_offset": 2048, 00:26:59.958 "data_size": 63488 00:26:59.958 }, 00:26:59.958 { 00:26:59.958 "name": "BaseBdev2", 00:26:59.958 "uuid": "4505287b-305f-58fd-814b-d6ae884da56f", 00:26:59.958 "is_configured": true, 00:26:59.958 "data_offset": 2048, 00:26:59.958 "data_size": 63488 00:26:59.958 }, 00:26:59.958 { 00:26:59.958 "name": "BaseBdev3", 00:26:59.958 "uuid": "64359613-d4e5-507a-bbb7-e94a83b01643", 00:26:59.958 "is_configured": true, 00:26:59.958 "data_offset": 2048, 00:26:59.958 "data_size": 63488 00:26:59.958 }, 00:26:59.958 { 00:26:59.958 "name": "BaseBdev4", 00:26:59.958 "uuid": "a6f1c1c6-4c02-5c91-aeac-4540da6c1408", 00:26:59.958 "is_configured": true, 00:26:59.958 "data_offset": 2048, 00:26:59.958 "data_size": 63488 00:26:59.958 } 00:26:59.958 ] 00:26:59.958 }' 00:26:59.958 22:55:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:59.958 22:55:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:59.958 22:55:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r 
'.process.target // "none"' 00:27:00.216 [2024-07-15 22:55:44.886112] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:27:00.216 [2024-07-15 22:55:44.886306] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:27:00.216 22:55:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:00.216 22:55:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:27:00.216 22:55:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:27:00.216 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:27:00.216 22:55:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:27:00.216 22:55:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:27:00.216 22:55:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:27:00.216 22:55:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:27:00.216 [2024-07-15 22:55:45.120479] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:00.475 [2024-07-15 22:55:45.210577] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:27:00.733 [2024-07-15 22:55:45.413060] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x10a7670 00:27:00.733 [2024-07-15 22:55:45.413090] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x11dc8f0 00:27:00.733 22:55:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:27:00.733 
22:55:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:27:00.733 22:55:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:00.733 22:55:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:00.733 22:55:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:00.733 22:55:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:00.733 22:55:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:00.733 22:55:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:00.733 22:55:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:00.992 22:55:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:00.992 "name": "raid_bdev1", 00:27:00.992 "uuid": "ce50295a-9fa1-4275-881f-db851a89f47c", 00:27:00.992 "strip_size_kb": 0, 00:27:00.992 "state": "online", 00:27:00.992 "raid_level": "raid1", 00:27:00.992 "superblock": true, 00:27:00.992 "num_base_bdevs": 4, 00:27:00.992 "num_base_bdevs_discovered": 3, 00:27:00.992 "num_base_bdevs_operational": 3, 00:27:00.992 "process": { 00:27:00.992 "type": "rebuild", 00:27:00.992 "target": "spare", 00:27:00.992 "progress": { 00:27:00.992 "blocks": 24576, 00:27:00.992 "percent": 38 00:27:00.992 } 00:27:00.992 }, 00:27:00.992 "base_bdevs_list": [ 00:27:00.992 { 00:27:00.992 "name": "spare", 00:27:00.992 "uuid": "2cd990db-1827-5a50-86ed-c713d80707ed", 00:27:00.992 "is_configured": true, 00:27:00.992 "data_offset": 2048, 00:27:00.992 "data_size": 63488 00:27:00.992 }, 00:27:00.992 { 00:27:00.992 "name": null, 00:27:00.992 
"uuid": "00000000-0000-0000-0000-000000000000", 00:27:00.992 "is_configured": false, 00:27:00.992 "data_offset": 2048, 00:27:00.992 "data_size": 63488 00:27:00.992 }, 00:27:00.992 { 00:27:00.992 "name": "BaseBdev3", 00:27:00.992 "uuid": "64359613-d4e5-507a-bbb7-e94a83b01643", 00:27:00.992 "is_configured": true, 00:27:00.992 "data_offset": 2048, 00:27:00.992 "data_size": 63488 00:27:00.992 }, 00:27:00.992 { 00:27:00.992 "name": "BaseBdev4", 00:27:00.992 "uuid": "a6f1c1c6-4c02-5c91-aeac-4540da6c1408", 00:27:00.992 "is_configured": true, 00:27:00.992 "data_offset": 2048, 00:27:00.992 "data_size": 63488 00:27:00.992 } 00:27:00.992 ] 00:27:00.992 }' 00:27:00.992 22:55:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:00.992 22:55:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:00.992 22:55:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:00.992 22:55:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:00.992 22:55:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=993 00:27:00.992 22:55:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:00.992 22:55:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:00.992 22:55:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:00.992 22:55:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:00.992 22:55:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:00.992 22:55:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:00.992 22:55:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:00.992 22:55:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:00.992 [2024-07-15 22:55:45.809436] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:27:01.250 [2024-07-15 22:55:46.041253] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:27:01.250 [2024-07-15 22:55:46.041553] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:27:01.250 22:55:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:01.250 "name": "raid_bdev1", 00:27:01.250 "uuid": "ce50295a-9fa1-4275-881f-db851a89f47c", 00:27:01.250 "strip_size_kb": 0, 00:27:01.250 "state": "online", 00:27:01.251 "raid_level": "raid1", 00:27:01.251 "superblock": true, 00:27:01.251 "num_base_bdevs": 4, 00:27:01.251 "num_base_bdevs_discovered": 3, 00:27:01.251 "num_base_bdevs_operational": 3, 00:27:01.251 "process": { 00:27:01.251 "type": "rebuild", 00:27:01.251 "target": "spare", 00:27:01.251 "progress": { 00:27:01.251 "blocks": 26624, 00:27:01.251 "percent": 41 00:27:01.251 } 00:27:01.251 }, 00:27:01.251 "base_bdevs_list": [ 00:27:01.251 { 00:27:01.251 "name": "spare", 00:27:01.251 "uuid": "2cd990db-1827-5a50-86ed-c713d80707ed", 00:27:01.251 "is_configured": true, 00:27:01.251 "data_offset": 2048, 00:27:01.251 "data_size": 63488 00:27:01.251 }, 00:27:01.251 { 00:27:01.251 "name": null, 00:27:01.251 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:01.251 "is_configured": false, 00:27:01.251 "data_offset": 2048, 00:27:01.251 "data_size": 63488 00:27:01.251 }, 00:27:01.251 { 00:27:01.251 "name": "BaseBdev3", 00:27:01.251 "uuid": "64359613-d4e5-507a-bbb7-e94a83b01643", 
00:27:01.251 "is_configured": true, 00:27:01.251 "data_offset": 2048, 00:27:01.251 "data_size": 63488 00:27:01.251 }, 00:27:01.251 { 00:27:01.251 "name": "BaseBdev4", 00:27:01.251 "uuid": "a6f1c1c6-4c02-5c91-aeac-4540da6c1408", 00:27:01.251 "is_configured": true, 00:27:01.251 "data_offset": 2048, 00:27:01.251 "data_size": 63488 00:27:01.251 } 00:27:01.251 ] 00:27:01.251 }' 00:27:01.251 22:55:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:01.251 22:55:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:01.251 22:55:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:01.251 22:55:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:01.251 22:55:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:01.509 [2024-07-15 22:55:46.375414] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:27:02.076 [2024-07-15 22:55:46.853270] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:27:02.334 22:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:02.334 22:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:02.334 22:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:02.334 22:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:02.334 22:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:02.334 22:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:02.334 22:55:47 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:02.334 22:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:02.593 22:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:02.593 "name": "raid_bdev1", 00:27:02.593 "uuid": "ce50295a-9fa1-4275-881f-db851a89f47c", 00:27:02.593 "strip_size_kb": 0, 00:27:02.593 "state": "online", 00:27:02.593 "raid_level": "raid1", 00:27:02.593 "superblock": true, 00:27:02.593 "num_base_bdevs": 4, 00:27:02.593 "num_base_bdevs_discovered": 3, 00:27:02.593 "num_base_bdevs_operational": 3, 00:27:02.593 "process": { 00:27:02.593 "type": "rebuild", 00:27:02.593 "target": "spare", 00:27:02.593 "progress": { 00:27:02.593 "blocks": 47104, 00:27:02.593 "percent": 74 00:27:02.593 } 00:27:02.593 }, 00:27:02.593 "base_bdevs_list": [ 00:27:02.593 { 00:27:02.593 "name": "spare", 00:27:02.593 "uuid": "2cd990db-1827-5a50-86ed-c713d80707ed", 00:27:02.593 "is_configured": true, 00:27:02.593 "data_offset": 2048, 00:27:02.593 "data_size": 63488 00:27:02.593 }, 00:27:02.593 { 00:27:02.593 "name": null, 00:27:02.593 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:02.593 "is_configured": false, 00:27:02.593 "data_offset": 2048, 00:27:02.593 "data_size": 63488 00:27:02.593 }, 00:27:02.593 { 00:27:02.593 "name": "BaseBdev3", 00:27:02.593 "uuid": "64359613-d4e5-507a-bbb7-e94a83b01643", 00:27:02.593 "is_configured": true, 00:27:02.593 "data_offset": 2048, 00:27:02.593 "data_size": 63488 00:27:02.593 }, 00:27:02.593 { 00:27:02.593 "name": "BaseBdev4", 00:27:02.593 "uuid": "a6f1c1c6-4c02-5c91-aeac-4540da6c1408", 00:27:02.593 "is_configured": true, 00:27:02.593 "data_offset": 2048, 00:27:02.593 "data_size": 63488 00:27:02.593 } 00:27:02.593 ] 00:27:02.593 }' 00:27:02.593 22:55:47 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:02.593 22:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:02.593 22:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:02.593 22:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:02.593 22:55:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:02.852 [2024-07-15 22:55:47.543788] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:27:02.852 [2024-07-15 22:55:47.655831] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:27:02.852 [2024-07-15 22:55:47.656022] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:27:03.110 [2024-07-15 22:55:47.923703] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:27:03.368 [2024-07-15 22:55:48.134300] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:27:03.368 [2024-07-15 22:55:48.134629] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:27:03.626 22:55:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:03.626 22:55:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:03.626 22:55:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:03.626 22:55:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:03.626 22:55:48 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:03.626 22:55:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:03.626 22:55:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:03.626 22:55:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:03.626 [2024-07-15 22:55:48.478988] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:27:03.884 [2024-07-15 22:55:48.587261] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:27:03.884 [2024-07-15 22:55:48.591374] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:03.884 22:55:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:03.884 "name": "raid_bdev1", 00:27:03.884 "uuid": "ce50295a-9fa1-4275-881f-db851a89f47c", 00:27:03.884 "strip_size_kb": 0, 00:27:03.884 "state": "online", 00:27:03.884 "raid_level": "raid1", 00:27:03.884 "superblock": true, 00:27:03.884 "num_base_bdevs": 4, 00:27:03.884 "num_base_bdevs_discovered": 3, 00:27:03.884 "num_base_bdevs_operational": 3, 00:27:03.884 "base_bdevs_list": [ 00:27:03.884 { 00:27:03.884 "name": "spare", 00:27:03.884 "uuid": "2cd990db-1827-5a50-86ed-c713d80707ed", 00:27:03.884 "is_configured": true, 00:27:03.884 "data_offset": 2048, 00:27:03.884 "data_size": 63488 00:27:03.884 }, 00:27:03.884 { 00:27:03.884 "name": null, 00:27:03.884 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:03.884 "is_configured": false, 00:27:03.885 "data_offset": 2048, 00:27:03.885 "data_size": 63488 00:27:03.885 }, 00:27:03.885 { 00:27:03.885 "name": "BaseBdev3", 00:27:03.885 "uuid": "64359613-d4e5-507a-bbb7-e94a83b01643", 00:27:03.885 "is_configured": true, 00:27:03.885 
"data_offset": 2048, 00:27:03.885 "data_size": 63488 00:27:03.885 }, 00:27:03.885 { 00:27:03.885 "name": "BaseBdev4", 00:27:03.885 "uuid": "a6f1c1c6-4c02-5c91-aeac-4540da6c1408", 00:27:03.885 "is_configured": true, 00:27:03.885 "data_offset": 2048, 00:27:03.885 "data_size": 63488 00:27:03.885 } 00:27:03.885 ] 00:27:03.885 }' 00:27:03.885 22:55:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:03.885 22:55:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:27:03.885 22:55:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:04.143 22:55:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:27:04.143 22:55:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:27:04.143 22:55:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:04.143 22:55:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:04.143 22:55:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:04.143 22:55:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:04.143 22:55:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:04.143 22:55:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:04.143 22:55:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:04.401 22:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:04.401 "name": "raid_bdev1", 00:27:04.401 "uuid": "ce50295a-9fa1-4275-881f-db851a89f47c", 00:27:04.401 
"strip_size_kb": 0, 00:27:04.401 "state": "online", 00:27:04.401 "raid_level": "raid1", 00:27:04.401 "superblock": true, 00:27:04.401 "num_base_bdevs": 4, 00:27:04.401 "num_base_bdevs_discovered": 3, 00:27:04.401 "num_base_bdevs_operational": 3, 00:27:04.401 "base_bdevs_list": [ 00:27:04.401 { 00:27:04.401 "name": "spare", 00:27:04.401 "uuid": "2cd990db-1827-5a50-86ed-c713d80707ed", 00:27:04.401 "is_configured": true, 00:27:04.401 "data_offset": 2048, 00:27:04.401 "data_size": 63488 00:27:04.401 }, 00:27:04.401 { 00:27:04.401 "name": null, 00:27:04.401 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:04.401 "is_configured": false, 00:27:04.401 "data_offset": 2048, 00:27:04.401 "data_size": 63488 00:27:04.401 }, 00:27:04.401 { 00:27:04.401 "name": "BaseBdev3", 00:27:04.401 "uuid": "64359613-d4e5-507a-bbb7-e94a83b01643", 00:27:04.401 "is_configured": true, 00:27:04.401 "data_offset": 2048, 00:27:04.401 "data_size": 63488 00:27:04.401 }, 00:27:04.401 { 00:27:04.401 "name": "BaseBdev4", 00:27:04.401 "uuid": "a6f1c1c6-4c02-5c91-aeac-4540da6c1408", 00:27:04.401 "is_configured": true, 00:27:04.401 "data_offset": 2048, 00:27:04.401 "data_size": 63488 00:27:04.401 } 00:27:04.401 ] 00:27:04.401 }' 00:27:04.401 22:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:04.401 22:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:04.401 22:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:04.401 22:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:04.401 22:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:04.401 22:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:04.401 22:55:49 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:04.401 22:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:04.401 22:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:04.401 22:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:04.401 22:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:04.401 22:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:04.401 22:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:04.401 22:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:04.401 22:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:04.401 22:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:04.659 22:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:04.659 "name": "raid_bdev1", 00:27:04.659 "uuid": "ce50295a-9fa1-4275-881f-db851a89f47c", 00:27:04.659 "strip_size_kb": 0, 00:27:04.659 "state": "online", 00:27:04.659 "raid_level": "raid1", 00:27:04.659 "superblock": true, 00:27:04.659 "num_base_bdevs": 4, 00:27:04.659 "num_base_bdevs_discovered": 3, 00:27:04.659 "num_base_bdevs_operational": 3, 00:27:04.659 "base_bdevs_list": [ 00:27:04.659 { 00:27:04.659 "name": "spare", 00:27:04.659 "uuid": "2cd990db-1827-5a50-86ed-c713d80707ed", 00:27:04.659 "is_configured": true, 00:27:04.659 "data_offset": 2048, 00:27:04.659 "data_size": 63488 00:27:04.659 }, 00:27:04.659 { 00:27:04.659 "name": null, 00:27:04.659 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:04.659 "is_configured": 
false, 00:27:04.659 "data_offset": 2048, 00:27:04.659 "data_size": 63488 00:27:04.659 }, 00:27:04.659 { 00:27:04.659 "name": "BaseBdev3", 00:27:04.659 "uuid": "64359613-d4e5-507a-bbb7-e94a83b01643", 00:27:04.659 "is_configured": true, 00:27:04.659 "data_offset": 2048, 00:27:04.659 "data_size": 63488 00:27:04.659 }, 00:27:04.659 { 00:27:04.659 "name": "BaseBdev4", 00:27:04.659 "uuid": "a6f1c1c6-4c02-5c91-aeac-4540da6c1408", 00:27:04.659 "is_configured": true, 00:27:04.659 "data_offset": 2048, 00:27:04.659 "data_size": 63488 00:27:04.659 } 00:27:04.659 ] 00:27:04.659 }' 00:27:04.659 22:55:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:04.659 22:55:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:05.226 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:05.544 [2024-07-15 22:55:50.178130] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:05.544 [2024-07-15 22:55:50.178166] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:05.544 00:27:05.544 Latency(us) 00:27:05.544 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:05.544 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:27:05.544 raid_bdev1 : 11.35 89.43 268.29 0.00 0.00 14864.53 300.97 122181.90 00:27:05.544 =================================================================================================================== 00:27:05.544 Total : 89.43 268.29 0.00 0.00 14864.53 300.97 122181.90 00:27:05.544 [2024-07-15 22:55:50.250251] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:05.544 [2024-07-15 22:55:50.250279] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:05.544 [2024-07-15 
22:55:50.250373] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:05.544 [2024-07-15 22:55:50.250386] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11d88a0 name raid_bdev1, state offline 00:27:05.544 0 00:27:05.544 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:05.544 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:27:05.802 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:27:05.802 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:27:05.802 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:27:05.802 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:27:05.802 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:05.802 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:27:05.802 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:05.802 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:27:05.802 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:05.802 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:27:05.802 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:05.803 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:05.803 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:27:06.062 /dev/nbd0 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:06.062 1+0 records in 00:27:06.062 1+0 records out 00:27:06.062 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000764434 s, 5.4 MB/s 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 
)) 00:27:06.062 22:55:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:27:06.321 /dev/nbd1 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:06.321 1+0 records in 00:27:06.321 1+0 records out 00:27:06.321 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000292053 s, 14.0 MB/s 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:06.321 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:06.580 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:06.580 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:06.580 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:06.580 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:06.580 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:06.580 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:06.580 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:27:06.580 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:27:06.580 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:27:06.580 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:27:06.580 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:27:06.580 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:06.580 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:27:06.580 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:06.580 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:27:06.580 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:06.580 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:27:06.580 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:06.580 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:06.580 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:27:06.839 /dev/nbd1 00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 
00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:06.839 1+0 records in 00:27:06.839 1+0 records out 00:27:06.839 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000275173 s, 14.9 MB/s 00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:06.839 22:55:51 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:06.839 22:55:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:07.098 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:07.098 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:07.358 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:07.358 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:07.358 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:07.358 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:07.358 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:27:07.358 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:27:07.358 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks 
/var/tmp/spdk-raid.sock /dev/nbd0 00:27:07.358 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:07.358 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:07.358 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:07.358 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:27:07.358 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:07.358 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:07.616 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:07.616 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:07.616 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:07.616 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:07.616 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:07.616 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:07.616 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:27:07.616 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:27:07.616 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:27:07.616 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:07.874 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:07.874 [2024-07-15 22:55:52.771242] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:07.874 [2024-07-15 22:55:52.771289] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:07.874 [2024-07-15 22:55:52.771311] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11da380 00:27:07.874 [2024-07-15 22:55:52.771324] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:07.875 [2024-07-15 22:55:52.772973] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:07.875 [2024-07-15 22:55:52.773004] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:07.875 [2024-07-15 22:55:52.773096] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:07.875 [2024-07-15 22:55:52.773125] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:07.875 [2024-07-15 22:55:52.773234] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:27:07.875 [2024-07-15 22:55:52.773308] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:27:07.875 spare 00:27:08.133 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:08.133 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:08.133 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:08.133 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:08.133 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:08.133 22:55:52 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:08.133 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:08.133 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:08.133 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:08.133 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:08.133 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:08.133 22:55:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:08.133 [2024-07-15 22:55:52.873628] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11d8cc0 00:27:08.133 [2024-07-15 22:55:52.873644] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:27:08.133 [2024-07-15 22:55:52.873850] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10a6e90 00:27:08.133 [2024-07-15 22:55:52.874016] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11d8cc0 00:27:08.133 [2024-07-15 22:55:52.874027] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11d8cc0 00:27:08.133 [2024-07-15 22:55:52.874134] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:08.392 22:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:08.392 "name": "raid_bdev1", 00:27:08.392 "uuid": "ce50295a-9fa1-4275-881f-db851a89f47c", 00:27:08.392 "strip_size_kb": 0, 00:27:08.392 "state": "online", 00:27:08.392 "raid_level": "raid1", 00:27:08.392 "superblock": true, 00:27:08.392 "num_base_bdevs": 4, 00:27:08.392 
"num_base_bdevs_discovered": 3, 00:27:08.392 "num_base_bdevs_operational": 3, 00:27:08.392 "base_bdevs_list": [ 00:27:08.392 { 00:27:08.392 "name": "spare", 00:27:08.392 "uuid": "2cd990db-1827-5a50-86ed-c713d80707ed", 00:27:08.392 "is_configured": true, 00:27:08.392 "data_offset": 2048, 00:27:08.392 "data_size": 63488 00:27:08.392 }, 00:27:08.392 { 00:27:08.392 "name": null, 00:27:08.392 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:08.392 "is_configured": false, 00:27:08.392 "data_offset": 2048, 00:27:08.392 "data_size": 63488 00:27:08.392 }, 00:27:08.392 { 00:27:08.392 "name": "BaseBdev3", 00:27:08.392 "uuid": "64359613-d4e5-507a-bbb7-e94a83b01643", 00:27:08.392 "is_configured": true, 00:27:08.392 "data_offset": 2048, 00:27:08.392 "data_size": 63488 00:27:08.392 }, 00:27:08.392 { 00:27:08.392 "name": "BaseBdev4", 00:27:08.392 "uuid": "a6f1c1c6-4c02-5c91-aeac-4540da6c1408", 00:27:08.392 "is_configured": true, 00:27:08.392 "data_offset": 2048, 00:27:08.392 "data_size": 63488 00:27:08.392 } 00:27:08.392 ] 00:27:08.392 }' 00:27:08.392 22:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:08.392 22:55:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:08.957 22:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:08.957 22:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:08.957 22:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:08.957 22:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:08.957 22:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:08.957 22:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:27:08.957 22:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:09.216 22:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:09.216 "name": "raid_bdev1", 00:27:09.216 "uuid": "ce50295a-9fa1-4275-881f-db851a89f47c", 00:27:09.216 "strip_size_kb": 0, 00:27:09.216 "state": "online", 00:27:09.216 "raid_level": "raid1", 00:27:09.216 "superblock": true, 00:27:09.216 "num_base_bdevs": 4, 00:27:09.216 "num_base_bdevs_discovered": 3, 00:27:09.216 "num_base_bdevs_operational": 3, 00:27:09.216 "base_bdevs_list": [ 00:27:09.216 { 00:27:09.216 "name": "spare", 00:27:09.216 "uuid": "2cd990db-1827-5a50-86ed-c713d80707ed", 00:27:09.216 "is_configured": true, 00:27:09.216 "data_offset": 2048, 00:27:09.216 "data_size": 63488 00:27:09.216 }, 00:27:09.216 { 00:27:09.216 "name": null, 00:27:09.216 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:09.216 "is_configured": false, 00:27:09.216 "data_offset": 2048, 00:27:09.216 "data_size": 63488 00:27:09.216 }, 00:27:09.216 { 00:27:09.216 "name": "BaseBdev3", 00:27:09.216 "uuid": "64359613-d4e5-507a-bbb7-e94a83b01643", 00:27:09.216 "is_configured": true, 00:27:09.216 "data_offset": 2048, 00:27:09.216 "data_size": 63488 00:27:09.216 }, 00:27:09.216 { 00:27:09.216 "name": "BaseBdev4", 00:27:09.216 "uuid": "a6f1c1c6-4c02-5c91-aeac-4540da6c1408", 00:27:09.216 "is_configured": true, 00:27:09.216 "data_offset": 2048, 00:27:09.216 "data_size": 63488 00:27:09.216 } 00:27:09.216 ] 00:27:09.216 }' 00:27:09.216 22:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:09.216 22:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:09.216 22:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:09.216 22:55:53 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:09.216 22:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:09.216 22:55:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:27:09.474 22:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:27:09.474 22:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:09.731 [2024-07-15 22:55:54.452045] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:09.731 22:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:09.731 22:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:09.731 22:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:09.731 22:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:09.731 22:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:09.731 22:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:09.731 22:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:09.731 22:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:09.731 22:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:09.731 22:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:09.731 22:55:54 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:09.731 22:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:09.989 22:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:09.989 "name": "raid_bdev1", 00:27:09.989 "uuid": "ce50295a-9fa1-4275-881f-db851a89f47c", 00:27:09.989 "strip_size_kb": 0, 00:27:09.989 "state": "online", 00:27:09.989 "raid_level": "raid1", 00:27:09.989 "superblock": true, 00:27:09.989 "num_base_bdevs": 4, 00:27:09.989 "num_base_bdevs_discovered": 2, 00:27:09.989 "num_base_bdevs_operational": 2, 00:27:09.989 "base_bdevs_list": [ 00:27:09.989 { 00:27:09.989 "name": null, 00:27:09.989 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:09.989 "is_configured": false, 00:27:09.989 "data_offset": 2048, 00:27:09.989 "data_size": 63488 00:27:09.989 }, 00:27:09.989 { 00:27:09.989 "name": null, 00:27:09.989 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:09.989 "is_configured": false, 00:27:09.989 "data_offset": 2048, 00:27:09.989 "data_size": 63488 00:27:09.989 }, 00:27:09.989 { 00:27:09.989 "name": "BaseBdev3", 00:27:09.989 "uuid": "64359613-d4e5-507a-bbb7-e94a83b01643", 00:27:09.989 "is_configured": true, 00:27:09.989 "data_offset": 2048, 00:27:09.989 "data_size": 63488 00:27:09.989 }, 00:27:09.989 { 00:27:09.989 "name": "BaseBdev4", 00:27:09.989 "uuid": "a6f1c1c6-4c02-5c91-aeac-4540da6c1408", 00:27:09.989 "is_configured": true, 00:27:09.989 "data_offset": 2048, 00:27:09.989 "data_size": 63488 00:27:09.989 } 00:27:09.989 ] 00:27:09.989 }' 00:27:09.989 22:55:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:09.989 22:55:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:10.554 22:55:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:10.812 [2024-07-15 22:55:55.563264] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:10.812 [2024-07-15 22:55:55.563423] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:27:10.812 [2024-07-15 22:55:55.563440] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:27:10.812 [2024-07-15 22:55:55.563476] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:10.812 [2024-07-15 22:55:55.567918] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1258c00 00:27:10.812 [2024-07-15 22:55:55.570310] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:10.812 22:55:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:27:11.745 22:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:11.746 22:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:11.746 22:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:11.746 22:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:11.746 22:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:11.746 22:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:11.746 22:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:12.003 22:55:56 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:12.003 "name": "raid_bdev1", 00:27:12.003 "uuid": "ce50295a-9fa1-4275-881f-db851a89f47c", 00:27:12.003 "strip_size_kb": 0, 00:27:12.003 "state": "online", 00:27:12.003 "raid_level": "raid1", 00:27:12.003 "superblock": true, 00:27:12.003 "num_base_bdevs": 4, 00:27:12.003 "num_base_bdevs_discovered": 3, 00:27:12.003 "num_base_bdevs_operational": 3, 00:27:12.003 "process": { 00:27:12.003 "type": "rebuild", 00:27:12.003 "target": "spare", 00:27:12.003 "progress": { 00:27:12.003 "blocks": 24576, 00:27:12.003 "percent": 38 00:27:12.003 } 00:27:12.003 }, 00:27:12.003 "base_bdevs_list": [ 00:27:12.003 { 00:27:12.003 "name": "spare", 00:27:12.003 "uuid": "2cd990db-1827-5a50-86ed-c713d80707ed", 00:27:12.003 "is_configured": true, 00:27:12.003 "data_offset": 2048, 00:27:12.003 "data_size": 63488 00:27:12.003 }, 00:27:12.003 { 00:27:12.003 "name": null, 00:27:12.003 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:12.004 "is_configured": false, 00:27:12.004 "data_offset": 2048, 00:27:12.004 "data_size": 63488 00:27:12.004 }, 00:27:12.004 { 00:27:12.004 "name": "BaseBdev3", 00:27:12.004 "uuid": "64359613-d4e5-507a-bbb7-e94a83b01643", 00:27:12.004 "is_configured": true, 00:27:12.004 "data_offset": 2048, 00:27:12.004 "data_size": 63488 00:27:12.004 }, 00:27:12.004 { 00:27:12.004 "name": "BaseBdev4", 00:27:12.004 "uuid": "a6f1c1c6-4c02-5c91-aeac-4540da6c1408", 00:27:12.004 "is_configured": true, 00:27:12.004 "data_offset": 2048, 00:27:12.004 "data_size": 63488 00:27:12.004 } 00:27:12.004 ] 00:27:12.004 }' 00:27:12.004 22:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:12.004 22:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:12.004 22:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:12.261 22:55:56 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:12.261 22:55:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:12.261 [2024-07-15 22:55:57.162286] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:12.518 [2024-07-15 22:55:57.182740] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:12.518 [2024-07-15 22:55:57.182785] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:12.518 [2024-07-15 22:55:57.182801] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:12.518 [2024-07-15 22:55:57.182810] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:12.518 22:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:12.518 22:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:12.518 22:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:12.518 22:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:12.518 22:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:12.518 22:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:12.518 22:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:12.518 22:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:12.518 22:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:12.518 22:55:57 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:12.518 22:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:12.518 22:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:12.776 22:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:12.776 "name": "raid_bdev1", 00:27:12.776 "uuid": "ce50295a-9fa1-4275-881f-db851a89f47c", 00:27:12.776 "strip_size_kb": 0, 00:27:12.776 "state": "online", 00:27:12.776 "raid_level": "raid1", 00:27:12.776 "superblock": true, 00:27:12.776 "num_base_bdevs": 4, 00:27:12.776 "num_base_bdevs_discovered": 2, 00:27:12.776 "num_base_bdevs_operational": 2, 00:27:12.776 "base_bdevs_list": [ 00:27:12.776 { 00:27:12.776 "name": null, 00:27:12.776 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:12.776 "is_configured": false, 00:27:12.776 "data_offset": 2048, 00:27:12.776 "data_size": 63488 00:27:12.776 }, 00:27:12.776 { 00:27:12.776 "name": null, 00:27:12.777 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:12.777 "is_configured": false, 00:27:12.777 "data_offset": 2048, 00:27:12.777 "data_size": 63488 00:27:12.777 }, 00:27:12.777 { 00:27:12.777 "name": "BaseBdev3", 00:27:12.777 "uuid": "64359613-d4e5-507a-bbb7-e94a83b01643", 00:27:12.777 "is_configured": true, 00:27:12.777 "data_offset": 2048, 00:27:12.777 "data_size": 63488 00:27:12.777 }, 00:27:12.777 { 00:27:12.777 "name": "BaseBdev4", 00:27:12.777 "uuid": "a6f1c1c6-4c02-5c91-aeac-4540da6c1408", 00:27:12.777 "is_configured": true, 00:27:12.777 "data_offset": 2048, 00:27:12.777 "data_size": 63488 00:27:12.777 } 00:27:12.777 ] 00:27:12.777 }' 00:27:12.777 22:55:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:12.777 22:55:57 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@10 -- # set +x 00:27:13.344 22:55:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:13.603 [2024-07-15 22:55:58.298446] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:13.603 [2024-07-15 22:55:58.298498] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:13.603 [2024-07-15 22:55:58.298520] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11d9130 00:27:13.603 [2024-07-15 22:55:58.298533] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:13.603 [2024-07-15 22:55:58.298924] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:13.603 [2024-07-15 22:55:58.298953] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:13.603 [2024-07-15 22:55:58.299040] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:13.603 [2024-07-15 22:55:58.299052] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:27:13.603 [2024-07-15 22:55:58.299063] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:27:13.603 [2024-07-15 22:55:58.299082] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:13.603 [2024-07-15 22:55:58.303556] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdaefa0 00:27:13.603 spare 00:27:13.603 [2024-07-15 22:55:58.304957] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:13.603 22:55:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:27:14.539 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:14.539 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:14.539 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:14.539 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:14.539 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:14.539 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:14.539 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:14.798 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:14.798 "name": "raid_bdev1", 00:27:14.798 "uuid": "ce50295a-9fa1-4275-881f-db851a89f47c", 00:27:14.798 "strip_size_kb": 0, 00:27:14.798 "state": "online", 00:27:14.798 "raid_level": "raid1", 00:27:14.798 "superblock": true, 00:27:14.798 "num_base_bdevs": 4, 00:27:14.798 "num_base_bdevs_discovered": 3, 00:27:14.798 "num_base_bdevs_operational": 3, 00:27:14.798 "process": { 00:27:14.798 "type": "rebuild", 00:27:14.798 "target": "spare", 00:27:14.798 "progress": { 00:27:14.798 
"blocks": 24576, 00:27:14.798 "percent": 38 00:27:14.798 } 00:27:14.798 }, 00:27:14.798 "base_bdevs_list": [ 00:27:14.798 { 00:27:14.798 "name": "spare", 00:27:14.798 "uuid": "2cd990db-1827-5a50-86ed-c713d80707ed", 00:27:14.798 "is_configured": true, 00:27:14.798 "data_offset": 2048, 00:27:14.798 "data_size": 63488 00:27:14.798 }, 00:27:14.798 { 00:27:14.798 "name": null, 00:27:14.798 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:14.798 "is_configured": false, 00:27:14.798 "data_offset": 2048, 00:27:14.798 "data_size": 63488 00:27:14.798 }, 00:27:14.798 { 00:27:14.798 "name": "BaseBdev3", 00:27:14.798 "uuid": "64359613-d4e5-507a-bbb7-e94a83b01643", 00:27:14.798 "is_configured": true, 00:27:14.798 "data_offset": 2048, 00:27:14.798 "data_size": 63488 00:27:14.798 }, 00:27:14.798 { 00:27:14.798 "name": "BaseBdev4", 00:27:14.798 "uuid": "a6f1c1c6-4c02-5c91-aeac-4540da6c1408", 00:27:14.798 "is_configured": true, 00:27:14.798 "data_offset": 2048, 00:27:14.798 "data_size": 63488 00:27:14.798 } 00:27:14.798 ] 00:27:14.798 }' 00:27:14.798 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:14.798 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:14.798 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:14.798 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:14.798 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:15.057 [2024-07-15 22:55:59.881266] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:15.057 [2024-07-15 22:55:59.917762] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:15.057 [2024-07-15 
22:55:59.917810] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:15.057 [2024-07-15 22:55:59.917826] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:15.057 [2024-07-15 22:55:59.917835] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:15.057 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:15.057 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:15.057 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:15.057 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:15.057 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:15.057 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:15.057 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:15.057 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:15.057 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:15.057 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:15.057 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:15.057 22:55:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:15.316 22:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:15.316 "name": "raid_bdev1", 00:27:15.316 "uuid": 
"ce50295a-9fa1-4275-881f-db851a89f47c", 00:27:15.316 "strip_size_kb": 0, 00:27:15.316 "state": "online", 00:27:15.316 "raid_level": "raid1", 00:27:15.316 "superblock": true, 00:27:15.316 "num_base_bdevs": 4, 00:27:15.316 "num_base_bdevs_discovered": 2, 00:27:15.316 "num_base_bdevs_operational": 2, 00:27:15.316 "base_bdevs_list": [ 00:27:15.316 { 00:27:15.316 "name": null, 00:27:15.316 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:15.316 "is_configured": false, 00:27:15.316 "data_offset": 2048, 00:27:15.316 "data_size": 63488 00:27:15.316 }, 00:27:15.316 { 00:27:15.316 "name": null, 00:27:15.316 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:15.316 "is_configured": false, 00:27:15.316 "data_offset": 2048, 00:27:15.316 "data_size": 63488 00:27:15.316 }, 00:27:15.316 { 00:27:15.316 "name": "BaseBdev3", 00:27:15.316 "uuid": "64359613-d4e5-507a-bbb7-e94a83b01643", 00:27:15.316 "is_configured": true, 00:27:15.316 "data_offset": 2048, 00:27:15.316 "data_size": 63488 00:27:15.316 }, 00:27:15.316 { 00:27:15.316 "name": "BaseBdev4", 00:27:15.316 "uuid": "a6f1c1c6-4c02-5c91-aeac-4540da6c1408", 00:27:15.316 "is_configured": true, 00:27:15.316 "data_offset": 2048, 00:27:15.316 "data_size": 63488 00:27:15.316 } 00:27:15.316 ] 00:27:15.316 }' 00:27:15.316 22:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:15.316 22:56:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:16.254 22:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:16.254 22:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:16.254 22:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:16.254 22:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:16.254 22:56:00 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:16.254 22:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:16.254 22:56:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:16.254 22:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:16.254 "name": "raid_bdev1", 00:27:16.254 "uuid": "ce50295a-9fa1-4275-881f-db851a89f47c", 00:27:16.254 "strip_size_kb": 0, 00:27:16.254 "state": "online", 00:27:16.254 "raid_level": "raid1", 00:27:16.254 "superblock": true, 00:27:16.254 "num_base_bdevs": 4, 00:27:16.254 "num_base_bdevs_discovered": 2, 00:27:16.254 "num_base_bdevs_operational": 2, 00:27:16.254 "base_bdevs_list": [ 00:27:16.254 { 00:27:16.254 "name": null, 00:27:16.254 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:16.254 "is_configured": false, 00:27:16.254 "data_offset": 2048, 00:27:16.254 "data_size": 63488 00:27:16.254 }, 00:27:16.254 { 00:27:16.254 "name": null, 00:27:16.254 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:16.254 "is_configured": false, 00:27:16.254 "data_offset": 2048, 00:27:16.254 "data_size": 63488 00:27:16.254 }, 00:27:16.254 { 00:27:16.254 "name": "BaseBdev3", 00:27:16.254 "uuid": "64359613-d4e5-507a-bbb7-e94a83b01643", 00:27:16.254 "is_configured": true, 00:27:16.254 "data_offset": 2048, 00:27:16.254 "data_size": 63488 00:27:16.254 }, 00:27:16.254 { 00:27:16.254 "name": "BaseBdev4", 00:27:16.254 "uuid": "a6f1c1c6-4c02-5c91-aeac-4540da6c1408", 00:27:16.254 "is_configured": true, 00:27:16.254 "data_offset": 2048, 00:27:16.254 "data_size": 63488 00:27:16.254 } 00:27:16.254 ] 00:27:16.254 }' 00:27:16.254 22:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:16.254 22:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 
-- # [[ none == \n\o\n\e ]] 00:27:16.254 22:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:16.513 22:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:16.513 22:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:27:16.513 22:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:16.772 [2024-07-15 22:56:01.638684] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:16.772 [2024-07-15 22:56:01.638735] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:16.772 [2024-07-15 22:56:01.638756] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1257c70 00:27:16.772 [2024-07-15 22:56:01.638770] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:16.772 [2024-07-15 22:56:01.639135] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:16.772 [2024-07-15 22:56:01.639156] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:16.772 [2024-07-15 22:56:01.639226] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:27:16.772 [2024-07-15 22:56:01.639238] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:27:16.772 [2024-07-15 22:56:01.639250] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:16.772 BaseBdev1 00:27:16.772 22:56:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # 
sleep 1 00:27:18.150 22:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:18.150 22:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:18.150 22:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:18.150 22:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:18.150 22:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:18.150 22:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:18.150 22:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:18.150 22:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:18.150 22:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:18.150 22:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:18.150 22:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:18.150 22:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:18.150 22:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:18.150 "name": "raid_bdev1", 00:27:18.150 "uuid": "ce50295a-9fa1-4275-881f-db851a89f47c", 00:27:18.150 "strip_size_kb": 0, 00:27:18.150 "state": "online", 00:27:18.150 "raid_level": "raid1", 00:27:18.150 "superblock": true, 00:27:18.150 "num_base_bdevs": 4, 00:27:18.150 "num_base_bdevs_discovered": 2, 00:27:18.150 "num_base_bdevs_operational": 2, 00:27:18.150 "base_bdevs_list": [ 00:27:18.150 { 00:27:18.150 "name": 
null, 00:27:18.150 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:18.150 "is_configured": false, 00:27:18.150 "data_offset": 2048, 00:27:18.150 "data_size": 63488 00:27:18.150 }, 00:27:18.150 { 00:27:18.150 "name": null, 00:27:18.150 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:18.150 "is_configured": false, 00:27:18.150 "data_offset": 2048, 00:27:18.150 "data_size": 63488 00:27:18.150 }, 00:27:18.150 { 00:27:18.150 "name": "BaseBdev3", 00:27:18.150 "uuid": "64359613-d4e5-507a-bbb7-e94a83b01643", 00:27:18.150 "is_configured": true, 00:27:18.150 "data_offset": 2048, 00:27:18.150 "data_size": 63488 00:27:18.150 }, 00:27:18.150 { 00:27:18.150 "name": "BaseBdev4", 00:27:18.150 "uuid": "a6f1c1c6-4c02-5c91-aeac-4540da6c1408", 00:27:18.150 "is_configured": true, 00:27:18.150 "data_offset": 2048, 00:27:18.150 "data_size": 63488 00:27:18.150 } 00:27:18.150 ] 00:27:18.150 }' 00:27:18.150 22:56:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:18.150 22:56:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:18.717 22:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:18.717 22:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:18.717 22:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:18.717 22:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:18.717 22:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:18.717 22:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:18.717 22:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:27:18.975 22:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:18.975 "name": "raid_bdev1", 00:27:18.975 "uuid": "ce50295a-9fa1-4275-881f-db851a89f47c", 00:27:18.975 "strip_size_kb": 0, 00:27:18.975 "state": "online", 00:27:18.975 "raid_level": "raid1", 00:27:18.975 "superblock": true, 00:27:18.975 "num_base_bdevs": 4, 00:27:18.975 "num_base_bdevs_discovered": 2, 00:27:18.975 "num_base_bdevs_operational": 2, 00:27:18.975 "base_bdevs_list": [ 00:27:18.975 { 00:27:18.975 "name": null, 00:27:18.975 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:18.975 "is_configured": false, 00:27:18.975 "data_offset": 2048, 00:27:18.975 "data_size": 63488 00:27:18.975 }, 00:27:18.975 { 00:27:18.975 "name": null, 00:27:18.975 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:18.975 "is_configured": false, 00:27:18.975 "data_offset": 2048, 00:27:18.975 "data_size": 63488 00:27:18.975 }, 00:27:18.975 { 00:27:18.975 "name": "BaseBdev3", 00:27:18.975 "uuid": "64359613-d4e5-507a-bbb7-e94a83b01643", 00:27:18.975 "is_configured": true, 00:27:18.975 "data_offset": 2048, 00:27:18.975 "data_size": 63488 00:27:18.975 }, 00:27:18.975 { 00:27:18.975 "name": "BaseBdev4", 00:27:18.975 "uuid": "a6f1c1c6-4c02-5c91-aeac-4540da6c1408", 00:27:18.975 "is_configured": true, 00:27:18.975 "data_offset": 2048, 00:27:18.975 "data_size": 63488 00:27:18.975 } 00:27:18.975 ] 00:27:18.975 }' 00:27:18.975 22:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:18.975 22:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:18.975 22:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:19.234 22:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:19.234 22:56:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:19.234 22:56:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:27:19.234 22:56:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:19.235 22:56:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:19.235 22:56:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:19.235 22:56:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:19.235 22:56:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:19.235 22:56:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:19.235 22:56:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:19.235 22:56:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:19.235 22:56:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:19.235 22:56:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:19.494 [2024-07-15 22:56:04.149736] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev1 is claimed 00:27:19.494 [2024-07-15 22:56:04.149874] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:27:19.494 [2024-07-15 22:56:04.149896] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:19.494 request: 00:27:19.494 { 00:27:19.494 "base_bdev": "BaseBdev1", 00:27:19.494 "raid_bdev": "raid_bdev1", 00:27:19.494 "method": "bdev_raid_add_base_bdev", 00:27:19.494 "req_id": 1 00:27:19.494 } 00:27:19.494 Got JSON-RPC error response 00:27:19.494 response: 00:27:19.494 { 00:27:19.494 "code": -22, 00:27:19.494 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:27:19.494 } 00:27:19.494 22:56:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:27:19.494 22:56:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:19.494 22:56:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:19.494 22:56:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:19.494 22:56:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:27:20.504 22:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:20.504 22:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:20.504 22:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:20.504 22:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:20.504 22:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:20.504 22:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 
00:27:20.504 22:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:20.504 22:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:20.504 22:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:20.504 22:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:20.504 22:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:20.504 22:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:20.763 22:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:20.763 "name": "raid_bdev1", 00:27:20.763 "uuid": "ce50295a-9fa1-4275-881f-db851a89f47c", 00:27:20.763 "strip_size_kb": 0, 00:27:20.763 "state": "online", 00:27:20.763 "raid_level": "raid1", 00:27:20.763 "superblock": true, 00:27:20.763 "num_base_bdevs": 4, 00:27:20.763 "num_base_bdevs_discovered": 2, 00:27:20.763 "num_base_bdevs_operational": 2, 00:27:20.763 "base_bdevs_list": [ 00:27:20.763 { 00:27:20.763 "name": null, 00:27:20.763 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:20.763 "is_configured": false, 00:27:20.763 "data_offset": 2048, 00:27:20.763 "data_size": 63488 00:27:20.763 }, 00:27:20.763 { 00:27:20.763 "name": null, 00:27:20.763 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:20.763 "is_configured": false, 00:27:20.763 "data_offset": 2048, 00:27:20.763 "data_size": 63488 00:27:20.763 }, 00:27:20.763 { 00:27:20.763 "name": "BaseBdev3", 00:27:20.763 "uuid": "64359613-d4e5-507a-bbb7-e94a83b01643", 00:27:20.763 "is_configured": true, 00:27:20.763 "data_offset": 2048, 00:27:20.763 "data_size": 63488 00:27:20.763 }, 00:27:20.763 { 00:27:20.763 "name": "BaseBdev4", 00:27:20.763 "uuid": 
"a6f1c1c6-4c02-5c91-aeac-4540da6c1408", 00:27:20.763 "is_configured": true, 00:27:20.763 "data_offset": 2048, 00:27:20.763 "data_size": 63488 00:27:20.763 } 00:27:20.763 ] 00:27:20.763 }' 00:27:20.763 22:56:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:20.763 22:56:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:21.332 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:21.332 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:21.332 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:21.332 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:21.332 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:21.332 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:21.332 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:21.332 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:21.332 "name": "raid_bdev1", 00:27:21.332 "uuid": "ce50295a-9fa1-4275-881f-db851a89f47c", 00:27:21.332 "strip_size_kb": 0, 00:27:21.332 "state": "online", 00:27:21.332 "raid_level": "raid1", 00:27:21.332 "superblock": true, 00:27:21.332 "num_base_bdevs": 4, 00:27:21.332 "num_base_bdevs_discovered": 2, 00:27:21.332 "num_base_bdevs_operational": 2, 00:27:21.332 "base_bdevs_list": [ 00:27:21.332 { 00:27:21.332 "name": null, 00:27:21.332 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:21.332 "is_configured": false, 00:27:21.332 "data_offset": 2048, 00:27:21.332 "data_size": 63488 
00:27:21.332 }, 00:27:21.332 { 00:27:21.332 "name": null, 00:27:21.332 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:21.332 "is_configured": false, 00:27:21.332 "data_offset": 2048, 00:27:21.332 "data_size": 63488 00:27:21.332 }, 00:27:21.332 { 00:27:21.332 "name": "BaseBdev3", 00:27:21.332 "uuid": "64359613-d4e5-507a-bbb7-e94a83b01643", 00:27:21.332 "is_configured": true, 00:27:21.332 "data_offset": 2048, 00:27:21.332 "data_size": 63488 00:27:21.332 }, 00:27:21.332 { 00:27:21.332 "name": "BaseBdev4", 00:27:21.332 "uuid": "a6f1c1c6-4c02-5c91-aeac-4540da6c1408", 00:27:21.332 "is_configured": true, 00:27:21.332 "data_offset": 2048, 00:27:21.332 "data_size": 63488 00:27:21.332 } 00:27:21.332 ] 00:27:21.332 }' 00:27:21.332 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:21.591 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:21.591 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:21.591 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:21.591 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 2835859 00:27:21.591 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 2835859 ']' 00:27:21.591 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 2835859 00:27:21.591 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:27:21.591 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:21.591 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2835859 00:27:21.591 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:21.591 22:56:06 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:21.591 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2835859' 00:27:21.591 killing process with pid 2835859 00:27:21.591 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 2835859 00:27:21.591 Received shutdown signal, test time was about 27.406100 seconds 00:27:21.591 00:27:21.591 Latency(us) 00:27:21.591 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:21.591 =================================================================================================================== 00:27:21.591 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:21.591 [2024-07-15 22:56:06.342039] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:21.591 [2024-07-15 22:56:06.342149] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:21.591 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 2835859 00:27:21.591 [2024-07-15 22:56:06.342210] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:21.591 [2024-07-15 22:56:06.342223] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11d8cc0 name raid_bdev1, state offline 00:27:21.591 [2024-07-15 22:56:06.382361] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:21.849 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:27:21.849 00:27:21.849 real 0m32.262s 00:27:21.849 user 0m50.656s 00:27:21.849 sys 0m5.077s 00:27:21.849 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:21.849 22:56:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:21.849 ************************************ 00:27:21.849 END TEST raid_rebuild_test_sb_io 
00:27:21.849 ************************************ 00:27:21.849 22:56:06 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:21.849 22:56:06 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:27:21.849 22:56:06 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:27:21.849 22:56:06 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:27:21.849 22:56:06 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:27:21.849 22:56:06 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:21.849 22:56:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:21.849 ************************************ 00:27:21.849 START TEST raid_state_function_test_sb_4k 00:27:21.849 ************************************ 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:21.849 
22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=2840402 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2840402' 00:27:21.849 Process raid pid: 2840402 00:27:21.849 
22:56:06 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 2840402 /var/tmp/spdk-raid.sock 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 2840402 ']' 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:21.849 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:21.849 22:56:06 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:21.849 [2024-07-15 22:56:06.751154] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:27:21.849 [2024-07-15 22:56:06.751218] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:22.105 [2024-07-15 22:56:06.879849] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:22.105 [2024-07-15 22:56:06.981660] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:22.363 [2024-07-15 22:56:07.046938] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:22.363 [2024-07-15 22:56:07.046974] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:22.928 22:56:07 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:22.928 22:56:07 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:27:22.928 22:56:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:23.187 [2024-07-15 22:56:07.918888] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:23.187 [2024-07-15 22:56:07.918940] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:23.187 [2024-07-15 22:56:07.918952] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:23.187 [2024-07-15 22:56:07.918965] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:23.187 22:56:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:23.187 22:56:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:27:23.187 22:56:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:23.187 22:56:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:23.187 22:56:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:23.187 22:56:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:23.187 22:56:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:23.187 22:56:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:23.187 22:56:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:23.187 22:56:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:23.187 22:56:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:23.187 22:56:07 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:23.444 22:56:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:23.444 "name": "Existed_Raid", 00:27:23.444 "uuid": "fc107153-4deb-4ae4-83bf-af0247134d1a", 00:27:23.444 "strip_size_kb": 0, 00:27:23.444 "state": "configuring", 00:27:23.445 "raid_level": "raid1", 00:27:23.445 "superblock": true, 00:27:23.445 "num_base_bdevs": 2, 00:27:23.445 "num_base_bdevs_discovered": 0, 00:27:23.445 "num_base_bdevs_operational": 2, 00:27:23.445 "base_bdevs_list": [ 00:27:23.445 { 00:27:23.445 "name": "BaseBdev1", 00:27:23.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:23.445 "is_configured": false, 00:27:23.445 "data_offset": 0, 00:27:23.445 "data_size": 0 
00:27:23.445 }, 00:27:23.445 { 00:27:23.445 "name": "BaseBdev2", 00:27:23.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:23.445 "is_configured": false, 00:27:23.445 "data_offset": 0, 00:27:23.445 "data_size": 0 00:27:23.445 } 00:27:23.445 ] 00:27:23.445 }' 00:27:23.445 22:56:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:23.445 22:56:08 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:24.010 22:56:08 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:24.268 [2024-07-15 22:56:09.013635] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:24.268 [2024-07-15 22:56:09.013668] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bb6a80 name Existed_Raid, state configuring 00:27:24.268 22:56:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:24.526 [2024-07-15 22:56:09.262305] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:24.526 [2024-07-15 22:56:09.262331] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:24.526 [2024-07-15 22:56:09.262341] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:24.526 [2024-07-15 22:56:09.262352] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:24.526 22:56:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:27:24.784 [2024-07-15 
22:56:09.520834] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:24.784 BaseBdev1 00:27:24.784 22:56:09 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:27:24.784 22:56:09 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:27:24.784 22:56:09 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:24.784 22:56:09 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:27:24.784 22:56:09 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:24.785 22:56:09 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:24.785 22:56:09 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:25.043 22:56:09 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:27:25.301 [ 00:27:25.301 { 00:27:25.301 "name": "BaseBdev1", 00:27:25.301 "aliases": [ 00:27:25.301 "540f0f49-391a-4ea8-8ed9-0d6627d7912c" 00:27:25.301 ], 00:27:25.301 "product_name": "Malloc disk", 00:27:25.301 "block_size": 4096, 00:27:25.301 "num_blocks": 8192, 00:27:25.301 "uuid": "540f0f49-391a-4ea8-8ed9-0d6627d7912c", 00:27:25.301 "assigned_rate_limits": { 00:27:25.301 "rw_ios_per_sec": 0, 00:27:25.301 "rw_mbytes_per_sec": 0, 00:27:25.301 "r_mbytes_per_sec": 0, 00:27:25.301 "w_mbytes_per_sec": 0 00:27:25.301 }, 00:27:25.301 "claimed": true, 00:27:25.301 "claim_type": "exclusive_write", 00:27:25.301 "zoned": false, 00:27:25.301 "supported_io_types": { 00:27:25.301 "read": true, 00:27:25.301 "write": true, 
00:27:25.301 "unmap": true, 00:27:25.301 "flush": true, 00:27:25.301 "reset": true, 00:27:25.301 "nvme_admin": false, 00:27:25.301 "nvme_io": false, 00:27:25.301 "nvme_io_md": false, 00:27:25.301 "write_zeroes": true, 00:27:25.301 "zcopy": true, 00:27:25.301 "get_zone_info": false, 00:27:25.301 "zone_management": false, 00:27:25.301 "zone_append": false, 00:27:25.301 "compare": false, 00:27:25.301 "compare_and_write": false, 00:27:25.301 "abort": true, 00:27:25.301 "seek_hole": false, 00:27:25.301 "seek_data": false, 00:27:25.301 "copy": true, 00:27:25.301 "nvme_iov_md": false 00:27:25.301 }, 00:27:25.301 "memory_domains": [ 00:27:25.301 { 00:27:25.301 "dma_device_id": "system", 00:27:25.301 "dma_device_type": 1 00:27:25.301 }, 00:27:25.301 { 00:27:25.301 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:25.301 "dma_device_type": 2 00:27:25.301 } 00:27:25.301 ], 00:27:25.301 "driver_specific": {} 00:27:25.301 } 00:27:25.301 ] 00:27:25.301 22:56:10 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:27:25.301 22:56:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:25.301 22:56:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:25.301 22:56:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:25.301 22:56:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:25.301 22:56:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:25.301 22:56:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:25.301 22:56:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:25.301 22:56:10 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:25.301 22:56:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:25.302 22:56:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:25.302 22:56:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:25.302 22:56:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:25.559 22:56:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:25.559 "name": "Existed_Raid", 00:27:25.559 "uuid": "a4dce6b3-1339-463b-bfbf-1d0aae7d80b2", 00:27:25.559 "strip_size_kb": 0, 00:27:25.559 "state": "configuring", 00:27:25.559 "raid_level": "raid1", 00:27:25.559 "superblock": true, 00:27:25.559 "num_base_bdevs": 2, 00:27:25.559 "num_base_bdevs_discovered": 1, 00:27:25.559 "num_base_bdevs_operational": 2, 00:27:25.559 "base_bdevs_list": [ 00:27:25.559 { 00:27:25.559 "name": "BaseBdev1", 00:27:25.559 "uuid": "540f0f49-391a-4ea8-8ed9-0d6627d7912c", 00:27:25.559 "is_configured": true, 00:27:25.559 "data_offset": 256, 00:27:25.559 "data_size": 7936 00:27:25.559 }, 00:27:25.559 { 00:27:25.559 "name": "BaseBdev2", 00:27:25.559 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:25.559 "is_configured": false, 00:27:25.559 "data_offset": 0, 00:27:25.559 "data_size": 0 00:27:25.559 } 00:27:25.559 ] 00:27:25.559 }' 00:27:25.559 22:56:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:25.559 22:56:10 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:26.123 22:56:10 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:26.381 [2024-07-15 22:56:11.049070] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:26.381 [2024-07-15 22:56:11.049116] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bb6350 name Existed_Raid, state configuring 00:27:26.381 22:56:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:26.638 [2024-07-15 22:56:11.309787] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:26.638 [2024-07-15 22:56:11.311384] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:26.638 [2024-07-15 22:56:11.311416] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:26.638 22:56:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:27:26.638 22:56:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:26.638 22:56:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:26.638 22:56:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:26.638 22:56:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:26.638 22:56:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:26.638 22:56:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:26.638 22:56:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:26.638 22:56:11 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:26.638 22:56:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:26.638 22:56:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:26.638 22:56:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:26.638 22:56:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:26.638 22:56:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:26.895 22:56:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:26.895 "name": "Existed_Raid", 00:27:26.895 "uuid": "14383702-2db4-448b-874a-fdef3a7f8d7b", 00:27:26.895 "strip_size_kb": 0, 00:27:26.895 "state": "configuring", 00:27:26.895 "raid_level": "raid1", 00:27:26.895 "superblock": true, 00:27:26.895 "num_base_bdevs": 2, 00:27:26.895 "num_base_bdevs_discovered": 1, 00:27:26.895 "num_base_bdevs_operational": 2, 00:27:26.895 "base_bdevs_list": [ 00:27:26.895 { 00:27:26.895 "name": "BaseBdev1", 00:27:26.895 "uuid": "540f0f49-391a-4ea8-8ed9-0d6627d7912c", 00:27:26.895 "is_configured": true, 00:27:26.895 "data_offset": 256, 00:27:26.895 "data_size": 7936 00:27:26.895 }, 00:27:26.895 { 00:27:26.895 "name": "BaseBdev2", 00:27:26.895 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:26.896 "is_configured": false, 00:27:26.896 "data_offset": 0, 00:27:26.896 "data_size": 0 00:27:26.896 } 00:27:26.896 ] 00:27:26.896 }' 00:27:26.896 22:56:11 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:26.896 22:56:11 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:27.458 
22:56:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:27:27.715 [2024-07-15 22:56:12.396126] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:27.715 [2024-07-15 22:56:12.396284] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bb7000 00:27:27.715 [2024-07-15 22:56:12.396298] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:27.716 [2024-07-15 22:56:12.396474] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ad10c0 00:27:27.716 [2024-07-15 22:56:12.396600] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bb7000 00:27:27.716 [2024-07-15 22:56:12.396610] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1bb7000 00:27:27.716 [2024-07-15 22:56:12.396705] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:27.716 BaseBdev2 00:27:27.716 22:56:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:27:27.716 22:56:12 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:27:27.716 22:56:12 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:27.716 22:56:12 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:27:27.716 22:56:12 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:27.716 22:56:12 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:27.716 22:56:12 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:27.973 22:56:12 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:27:28.231 [ 00:27:28.231 { 00:27:28.231 "name": "BaseBdev2", 00:27:28.231 "aliases": [ 00:27:28.231 "667a57b1-1fc7-4495-ae09-706e47abd55d" 00:27:28.231 ], 00:27:28.231 "product_name": "Malloc disk", 00:27:28.231 "block_size": 4096, 00:27:28.231 "num_blocks": 8192, 00:27:28.231 "uuid": "667a57b1-1fc7-4495-ae09-706e47abd55d", 00:27:28.231 "assigned_rate_limits": { 00:27:28.231 "rw_ios_per_sec": 0, 00:27:28.231 "rw_mbytes_per_sec": 0, 00:27:28.231 "r_mbytes_per_sec": 0, 00:27:28.231 "w_mbytes_per_sec": 0 00:27:28.231 }, 00:27:28.231 "claimed": true, 00:27:28.231 "claim_type": "exclusive_write", 00:27:28.231 "zoned": false, 00:27:28.231 "supported_io_types": { 00:27:28.231 "read": true, 00:27:28.231 "write": true, 00:27:28.231 "unmap": true, 00:27:28.231 "flush": true, 00:27:28.231 "reset": true, 00:27:28.231 "nvme_admin": false, 00:27:28.231 "nvme_io": false, 00:27:28.231 "nvme_io_md": false, 00:27:28.231 "write_zeroes": true, 00:27:28.231 "zcopy": true, 00:27:28.231 "get_zone_info": false, 00:27:28.231 "zone_management": false, 00:27:28.231 "zone_append": false, 00:27:28.231 "compare": false, 00:27:28.231 "compare_and_write": false, 00:27:28.231 "abort": true, 00:27:28.231 "seek_hole": false, 00:27:28.231 "seek_data": false, 00:27:28.231 "copy": true, 00:27:28.231 "nvme_iov_md": false 00:27:28.231 }, 00:27:28.231 "memory_domains": [ 00:27:28.231 { 00:27:28.231 "dma_device_id": "system", 00:27:28.231 "dma_device_type": 1 00:27:28.231 }, 00:27:28.231 { 00:27:28.231 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:28.231 "dma_device_type": 2 00:27:28.231 } 00:27:28.231 ], 00:27:28.231 "driver_specific": {} 00:27:28.231 } 00:27:28.231 ] 00:27:28.231 22:56:12 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@905 -- # return 0 00:27:28.231 22:56:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:27:28.231 22:56:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:28.231 22:56:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:27:28.232 22:56:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:28.232 22:56:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:28.232 22:56:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:28.232 22:56:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:28.232 22:56:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:28.232 22:56:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:28.232 22:56:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:28.232 22:56:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:28.232 22:56:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:28.232 22:56:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:28.232 22:56:12 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:28.490 22:56:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:28.490 "name": "Existed_Raid", 00:27:28.490 "uuid": 
"14383702-2db4-448b-874a-fdef3a7f8d7b", 00:27:28.490 "strip_size_kb": 0, 00:27:28.490 "state": "online", 00:27:28.490 "raid_level": "raid1", 00:27:28.490 "superblock": true, 00:27:28.490 "num_base_bdevs": 2, 00:27:28.490 "num_base_bdevs_discovered": 2, 00:27:28.490 "num_base_bdevs_operational": 2, 00:27:28.490 "base_bdevs_list": [ 00:27:28.490 { 00:27:28.490 "name": "BaseBdev1", 00:27:28.490 "uuid": "540f0f49-391a-4ea8-8ed9-0d6627d7912c", 00:27:28.490 "is_configured": true, 00:27:28.490 "data_offset": 256, 00:27:28.490 "data_size": 7936 00:27:28.490 }, 00:27:28.490 { 00:27:28.490 "name": "BaseBdev2", 00:27:28.490 "uuid": "667a57b1-1fc7-4495-ae09-706e47abd55d", 00:27:28.490 "is_configured": true, 00:27:28.490 "data_offset": 256, 00:27:28.490 "data_size": 7936 00:27:28.490 } 00:27:28.490 ] 00:27:28.490 }' 00:27:28.490 22:56:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:28.490 22:56:13 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:29.056 22:56:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:27:29.057 22:56:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:27:29.057 22:56:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:29.057 22:56:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:29.057 22:56:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:29.057 22:56:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 00:27:29.057 22:56:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:29.057 22:56:13 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:27:29.316 [2024-07-15 22:56:13.980595] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:29.316 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:29.316 "name": "Existed_Raid", 00:27:29.316 "aliases": [ 00:27:29.316 "14383702-2db4-448b-874a-fdef3a7f8d7b" 00:27:29.316 ], 00:27:29.316 "product_name": "Raid Volume", 00:27:29.316 "block_size": 4096, 00:27:29.316 "num_blocks": 7936, 00:27:29.316 "uuid": "14383702-2db4-448b-874a-fdef3a7f8d7b", 00:27:29.316 "assigned_rate_limits": { 00:27:29.316 "rw_ios_per_sec": 0, 00:27:29.316 "rw_mbytes_per_sec": 0, 00:27:29.316 "r_mbytes_per_sec": 0, 00:27:29.316 "w_mbytes_per_sec": 0 00:27:29.316 }, 00:27:29.316 "claimed": false, 00:27:29.316 "zoned": false, 00:27:29.316 "supported_io_types": { 00:27:29.316 "read": true, 00:27:29.316 "write": true, 00:27:29.316 "unmap": false, 00:27:29.316 "flush": false, 00:27:29.316 "reset": true, 00:27:29.316 "nvme_admin": false, 00:27:29.316 "nvme_io": false, 00:27:29.316 "nvme_io_md": false, 00:27:29.316 "write_zeroes": true, 00:27:29.316 "zcopy": false, 00:27:29.316 "get_zone_info": false, 00:27:29.316 "zone_management": false, 00:27:29.316 "zone_append": false, 00:27:29.316 "compare": false, 00:27:29.316 "compare_and_write": false, 00:27:29.316 "abort": false, 00:27:29.316 "seek_hole": false, 00:27:29.316 "seek_data": false, 00:27:29.316 "copy": false, 00:27:29.316 "nvme_iov_md": false 00:27:29.316 }, 00:27:29.316 "memory_domains": [ 00:27:29.316 { 00:27:29.316 "dma_device_id": "system", 00:27:29.316 "dma_device_type": 1 00:27:29.316 }, 00:27:29.316 { 00:27:29.316 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:29.316 "dma_device_type": 2 00:27:29.316 }, 00:27:29.316 { 00:27:29.316 "dma_device_id": "system", 00:27:29.316 "dma_device_type": 1 00:27:29.316 }, 00:27:29.316 { 00:27:29.316 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:29.316 
"dma_device_type": 2 00:27:29.316 } 00:27:29.316 ], 00:27:29.316 "driver_specific": { 00:27:29.316 "raid": { 00:27:29.316 "uuid": "14383702-2db4-448b-874a-fdef3a7f8d7b", 00:27:29.316 "strip_size_kb": 0, 00:27:29.316 "state": "online", 00:27:29.316 "raid_level": "raid1", 00:27:29.316 "superblock": true, 00:27:29.316 "num_base_bdevs": 2, 00:27:29.316 "num_base_bdevs_discovered": 2, 00:27:29.316 "num_base_bdevs_operational": 2, 00:27:29.316 "base_bdevs_list": [ 00:27:29.316 { 00:27:29.316 "name": "BaseBdev1", 00:27:29.316 "uuid": "540f0f49-391a-4ea8-8ed9-0d6627d7912c", 00:27:29.316 "is_configured": true, 00:27:29.316 "data_offset": 256, 00:27:29.316 "data_size": 7936 00:27:29.316 }, 00:27:29.316 { 00:27:29.316 "name": "BaseBdev2", 00:27:29.316 "uuid": "667a57b1-1fc7-4495-ae09-706e47abd55d", 00:27:29.316 "is_configured": true, 00:27:29.316 "data_offset": 256, 00:27:29.316 "data_size": 7936 00:27:29.316 } 00:27:29.316 ] 00:27:29.316 } 00:27:29.316 } 00:27:29.316 }' 00:27:29.316 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:29.316 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:27:29.316 BaseBdev2' 00:27:29.316 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:29.316 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:27:29.316 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:29.575 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:29.575 "name": "BaseBdev1", 00:27:29.575 "aliases": [ 00:27:29.575 "540f0f49-391a-4ea8-8ed9-0d6627d7912c" 00:27:29.575 ], 00:27:29.575 
"product_name": "Malloc disk", 00:27:29.575 "block_size": 4096, 00:27:29.575 "num_blocks": 8192, 00:27:29.575 "uuid": "540f0f49-391a-4ea8-8ed9-0d6627d7912c", 00:27:29.575 "assigned_rate_limits": { 00:27:29.575 "rw_ios_per_sec": 0, 00:27:29.575 "rw_mbytes_per_sec": 0, 00:27:29.575 "r_mbytes_per_sec": 0, 00:27:29.575 "w_mbytes_per_sec": 0 00:27:29.575 }, 00:27:29.575 "claimed": true, 00:27:29.575 "claim_type": "exclusive_write", 00:27:29.575 "zoned": false, 00:27:29.575 "supported_io_types": { 00:27:29.575 "read": true, 00:27:29.575 "write": true, 00:27:29.575 "unmap": true, 00:27:29.575 "flush": true, 00:27:29.575 "reset": true, 00:27:29.575 "nvme_admin": false, 00:27:29.575 "nvme_io": false, 00:27:29.575 "nvme_io_md": false, 00:27:29.575 "write_zeroes": true, 00:27:29.575 "zcopy": true, 00:27:29.575 "get_zone_info": false, 00:27:29.575 "zone_management": false, 00:27:29.575 "zone_append": false, 00:27:29.575 "compare": false, 00:27:29.575 "compare_and_write": false, 00:27:29.575 "abort": true, 00:27:29.575 "seek_hole": false, 00:27:29.575 "seek_data": false, 00:27:29.575 "copy": true, 00:27:29.575 "nvme_iov_md": false 00:27:29.575 }, 00:27:29.575 "memory_domains": [ 00:27:29.575 { 00:27:29.575 "dma_device_id": "system", 00:27:29.575 "dma_device_type": 1 00:27:29.575 }, 00:27:29.575 { 00:27:29.575 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:29.575 "dma_device_type": 2 00:27:29.575 } 00:27:29.575 ], 00:27:29.575 "driver_specific": {} 00:27:29.575 }' 00:27:29.575 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:29.575 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:29.575 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:29.575 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:29.575 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- 
# jq .md_size 00:27:29.575 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:29.575 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:29.834 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:29.834 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:29.834 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:29.834 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:29.834 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:29.834 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:29.834 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:29.834 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:27:30.092 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:30.092 "name": "BaseBdev2", 00:27:30.092 "aliases": [ 00:27:30.092 "667a57b1-1fc7-4495-ae09-706e47abd55d" 00:27:30.092 ], 00:27:30.092 "product_name": "Malloc disk", 00:27:30.092 "block_size": 4096, 00:27:30.092 "num_blocks": 8192, 00:27:30.092 "uuid": "667a57b1-1fc7-4495-ae09-706e47abd55d", 00:27:30.092 "assigned_rate_limits": { 00:27:30.092 "rw_ios_per_sec": 0, 00:27:30.092 "rw_mbytes_per_sec": 0, 00:27:30.092 "r_mbytes_per_sec": 0, 00:27:30.092 "w_mbytes_per_sec": 0 00:27:30.092 }, 00:27:30.092 "claimed": true, 00:27:30.092 "claim_type": "exclusive_write", 00:27:30.092 "zoned": false, 00:27:30.092 "supported_io_types": { 00:27:30.092 "read": true, 
00:27:30.092 "write": true, 00:27:30.092 "unmap": true, 00:27:30.092 "flush": true, 00:27:30.092 "reset": true, 00:27:30.092 "nvme_admin": false, 00:27:30.092 "nvme_io": false, 00:27:30.092 "nvme_io_md": false, 00:27:30.092 "write_zeroes": true, 00:27:30.092 "zcopy": true, 00:27:30.092 "get_zone_info": false, 00:27:30.092 "zone_management": false, 00:27:30.092 "zone_append": false, 00:27:30.092 "compare": false, 00:27:30.092 "compare_and_write": false, 00:27:30.092 "abort": true, 00:27:30.092 "seek_hole": false, 00:27:30.092 "seek_data": false, 00:27:30.092 "copy": true, 00:27:30.092 "nvme_iov_md": false 00:27:30.092 }, 00:27:30.092 "memory_domains": [ 00:27:30.092 { 00:27:30.092 "dma_device_id": "system", 00:27:30.092 "dma_device_type": 1 00:27:30.092 }, 00:27:30.092 { 00:27:30.092 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:30.092 "dma_device_type": 2 00:27:30.092 } 00:27:30.092 ], 00:27:30.092 "driver_specific": {} 00:27:30.092 }' 00:27:30.092 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:30.092 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:30.092 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:30.092 22:56:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:30.351 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:30.351 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:30.351 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:30.351 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:30.351 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:30.351 22:56:15 bdev_raid.raid_state_function_test_sb_4k 
-- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:30.351 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:30.351 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:30.351 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:27:30.610 [2024-07-15 22:56:15.476348] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:30.610 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:27:30.610 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:27:30.610 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:30.610 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:27:30.610 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:27:30.610 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:27:30.610 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:30.610 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:30.610 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:30.610 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:30.610 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:30.610 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:27:30.610 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:30.610 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:30.610 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:30.611 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:30.611 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:30.870 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:30.870 "name": "Existed_Raid", 00:27:30.870 "uuid": "14383702-2db4-448b-874a-fdef3a7f8d7b", 00:27:30.870 "strip_size_kb": 0, 00:27:30.870 "state": "online", 00:27:30.870 "raid_level": "raid1", 00:27:30.870 "superblock": true, 00:27:30.870 "num_base_bdevs": 2, 00:27:30.870 "num_base_bdevs_discovered": 1, 00:27:30.870 "num_base_bdevs_operational": 1, 00:27:30.870 "base_bdevs_list": [ 00:27:30.870 { 00:27:30.870 "name": null, 00:27:30.870 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:30.870 "is_configured": false, 00:27:30.870 "data_offset": 256, 00:27:30.870 "data_size": 7936 00:27:30.870 }, 00:27:30.870 { 00:27:30.870 "name": "BaseBdev2", 00:27:30.870 "uuid": "667a57b1-1fc7-4495-ae09-706e47abd55d", 00:27:30.870 "is_configured": true, 00:27:30.870 "data_offset": 256, 00:27:30.870 "data_size": 7936 00:27:30.870 } 00:27:30.870 ] 00:27:30.870 }' 00:27:30.870 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:30.870 22:56:15 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:31.437 22:56:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:27:31.437 
22:56:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:31.437 22:56:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:31.437 22:56:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:27:31.696 22:56:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:27:31.696 22:56:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:27:31.696 22:56:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:27:31.955 [2024-07-15 22:56:16.817054] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:31.955 [2024-07-15 22:56:16.817155] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:31.955 [2024-07-15 22:56:16.829907] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:31.955 [2024-07-15 22:56:16.829952] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:31.955 [2024-07-15 22:56:16.829965] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bb7000 name Existed_Raid, state offline 00:27:31.955 22:56:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:27:31.955 22:56:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:31.955 22:56:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:27:31.955 22:56:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:32.213 22:56:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:27:32.213 22:56:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:27:32.213 22:56:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:27:32.214 22:56:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 2840402 00:27:32.214 22:56:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 2840402 ']' 00:27:32.214 22:56:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 2840402 00:27:32.214 22:56:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:27:32.214 22:56:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:32.473 22:56:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2840402 00:27:32.473 22:56:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:32.473 22:56:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:32.473 22:56:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2840402' 00:27:32.473 killing process with pid 2840402 00:27:32.473 22:56:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@967 -- # kill 2840402 00:27:32.473 [2024-07-15 22:56:17.162202] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:32.473 22:56:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@972 -- # wait 2840402 00:27:32.473 [2024-07-15 22:56:17.163093] bdev_raid.c:1375:raid_bdev_exit: 
*DEBUG*: raid_bdev_exit 00:27:32.732 22:56:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:27:32.732 00:27:32.732 real 0m10.703s 00:27:32.732 user 0m19.061s 00:27:32.732 sys 0m1.994s 00:27:32.732 22:56:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:32.732 22:56:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:32.732 ************************************ 00:27:32.732 END TEST raid_state_function_test_sb_4k 00:27:32.732 ************************************ 00:27:32.732 22:56:17 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:32.732 22:56:17 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:27:32.732 22:56:17 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:27:32.732 22:56:17 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:32.732 22:56:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:32.732 ************************************ 00:27:32.732 START TEST raid_superblock_test_4k 00:27:32.732 ************************************ 00:27:32.732 22:56:17 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:27:32.732 22:56:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:27:32.732 22:56:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:27:32.732 22:56:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:27:32.732 22:56:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:27:32.732 22:56:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:27:32.732 22:56:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:27:32.732 22:56:17 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:27:32.732 22:56:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:27:32.732 22:56:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:27:32.732 22:56:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:27:32.732 22:56:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:27:32.732 22:56:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:27:32.732 22:56:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:27:32.732 22:56:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:27:32.732 22:56:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:27:32.732 22:56:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:27:32.732 22:56:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # raid_pid=2842027 00:27:32.732 22:56:17 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 2842027 /var/tmp/spdk-raid.sock 00:27:32.732 22:56:17 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@829 -- # '[' -z 2842027 ']' 00:27:32.732 22:56:17 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:32.732 22:56:17 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:32.732 22:56:17 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:27:32.732 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:32.732 22:56:17 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:32.732 22:56:17 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:32.733 [2024-07-15 22:56:17.521085] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:27:32.733 [2024-07-15 22:56:17.521134] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2842027 ] 00:27:32.733 [2024-07-15 22:56:17.625085] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:32.991 [2024-07-15 22:56:17.737725] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:32.991 [2024-07-15 22:56:17.798631] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:32.991 [2024-07-15 22:56:17.798659] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:33.926 22:56:18 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:33.926 22:56:18 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@862 -- # return 0 00:27:33.926 22:56:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:27:33.926 22:56:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:33.926 22:56:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:27:33.926 22:56:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:27:33.927 22:56:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:27:33.927 22:56:18 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:33.927 22:56:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:33.927 22:56:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:33.927 22:56:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:27:33.927 malloc1 00:27:33.927 22:56:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:33.927 [2024-07-15 22:56:18.807579] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:33.927 [2024-07-15 22:56:18.807632] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:33.927 [2024-07-15 22:56:18.807651] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19fb570 00:27:33.927 [2024-07-15 22:56:18.807663] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:33.927 [2024-07-15 22:56:18.809318] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:33.927 [2024-07-15 22:56:18.809347] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:33.927 pt1 00:27:33.927 22:56:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:33.927 22:56:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:33.927 22:56:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:27:33.927 22:56:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:27:33.927 
22:56:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:27:33.927 22:56:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:33.927 22:56:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:33.927 22:56:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:33.927 22:56:18 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:27:34.186 malloc2 00:27:34.186 22:56:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:34.444 [2024-07-15 22:56:19.169450] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:34.444 [2024-07-15 22:56:19.169498] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:34.444 [2024-07-15 22:56:19.169516] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19fc970 00:27:34.444 [2024-07-15 22:56:19.169528] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:34.444 [2024-07-15 22:56:19.171031] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:34.444 [2024-07-15 22:56:19.171060] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:34.444 pt2 00:27:34.444 22:56:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:34.444 22:56:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:34.444 22:56:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:27:34.735 [2024-07-15 22:56:19.353965] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:34.735 [2024-07-15 22:56:19.355328] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:34.735 [2024-07-15 22:56:19.355476] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b9f270 00:27:34.735 [2024-07-15 22:56:19.355489] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:34.735 [2024-07-15 22:56:19.355690] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19f30e0 00:27:34.735 [2024-07-15 22:56:19.355834] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b9f270 00:27:34.735 [2024-07-15 22:56:19.355844] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b9f270 00:27:34.735 [2024-07-15 22:56:19.355949] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:34.735 22:56:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:34.735 22:56:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:34.735 22:56:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:34.735 22:56:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:34.735 22:56:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:34.735 22:56:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:34.735 22:56:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:34.735 22:56:19 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:34.735 22:56:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:34.735 22:56:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:34.735 22:56:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:34.735 22:56:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:34.735 22:56:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:34.735 "name": "raid_bdev1", 00:27:34.735 "uuid": "5a785e90-d252-4024-8f54-fafc7a0f9407", 00:27:34.735 "strip_size_kb": 0, 00:27:34.735 "state": "online", 00:27:34.735 "raid_level": "raid1", 00:27:34.735 "superblock": true, 00:27:34.735 "num_base_bdevs": 2, 00:27:34.735 "num_base_bdevs_discovered": 2, 00:27:34.735 "num_base_bdevs_operational": 2, 00:27:34.735 "base_bdevs_list": [ 00:27:34.735 { 00:27:34.735 "name": "pt1", 00:27:34.735 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:34.735 "is_configured": true, 00:27:34.735 "data_offset": 256, 00:27:34.735 "data_size": 7936 00:27:34.735 }, 00:27:34.735 { 00:27:34.735 "name": "pt2", 00:27:34.735 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:34.735 "is_configured": true, 00:27:34.735 "data_offset": 256, 00:27:34.735 "data_size": 7936 00:27:34.735 } 00:27:34.735 ] 00:27:34.735 }' 00:27:34.735 22:56:19 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:34.735 22:56:19 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:35.670 22:56:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:27:35.670 22:56:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 
-- # local raid_bdev_name=raid_bdev1 00:27:35.670 22:56:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:35.670 22:56:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:35.670 22:56:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:35.670 22:56:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:27:35.670 22:56:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:35.670 22:56:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:35.670 [2024-07-15 22:56:20.477172] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:35.670 22:56:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:35.670 "name": "raid_bdev1", 00:27:35.670 "aliases": [ 00:27:35.670 "5a785e90-d252-4024-8f54-fafc7a0f9407" 00:27:35.670 ], 00:27:35.670 "product_name": "Raid Volume", 00:27:35.670 "block_size": 4096, 00:27:35.670 "num_blocks": 7936, 00:27:35.670 "uuid": "5a785e90-d252-4024-8f54-fafc7a0f9407", 00:27:35.670 "assigned_rate_limits": { 00:27:35.670 "rw_ios_per_sec": 0, 00:27:35.670 "rw_mbytes_per_sec": 0, 00:27:35.670 "r_mbytes_per_sec": 0, 00:27:35.670 "w_mbytes_per_sec": 0 00:27:35.670 }, 00:27:35.670 "claimed": false, 00:27:35.670 "zoned": false, 00:27:35.670 "supported_io_types": { 00:27:35.670 "read": true, 00:27:35.670 "write": true, 00:27:35.670 "unmap": false, 00:27:35.670 "flush": false, 00:27:35.670 "reset": true, 00:27:35.670 "nvme_admin": false, 00:27:35.670 "nvme_io": false, 00:27:35.670 "nvme_io_md": false, 00:27:35.670 "write_zeroes": true, 00:27:35.670 "zcopy": false, 00:27:35.670 "get_zone_info": false, 00:27:35.670 "zone_management": false, 00:27:35.670 "zone_append": false, 
00:27:35.670 "compare": false, 00:27:35.670 "compare_and_write": false, 00:27:35.670 "abort": false, 00:27:35.670 "seek_hole": false, 00:27:35.670 "seek_data": false, 00:27:35.670 "copy": false, 00:27:35.670 "nvme_iov_md": false 00:27:35.670 }, 00:27:35.670 "memory_domains": [ 00:27:35.670 { 00:27:35.670 "dma_device_id": "system", 00:27:35.670 "dma_device_type": 1 00:27:35.670 }, 00:27:35.670 { 00:27:35.670 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:35.670 "dma_device_type": 2 00:27:35.670 }, 00:27:35.670 { 00:27:35.670 "dma_device_id": "system", 00:27:35.670 "dma_device_type": 1 00:27:35.670 }, 00:27:35.670 { 00:27:35.670 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:35.670 "dma_device_type": 2 00:27:35.670 } 00:27:35.670 ], 00:27:35.670 "driver_specific": { 00:27:35.670 "raid": { 00:27:35.670 "uuid": "5a785e90-d252-4024-8f54-fafc7a0f9407", 00:27:35.670 "strip_size_kb": 0, 00:27:35.670 "state": "online", 00:27:35.670 "raid_level": "raid1", 00:27:35.670 "superblock": true, 00:27:35.671 "num_base_bdevs": 2, 00:27:35.671 "num_base_bdevs_discovered": 2, 00:27:35.671 "num_base_bdevs_operational": 2, 00:27:35.671 "base_bdevs_list": [ 00:27:35.671 { 00:27:35.671 "name": "pt1", 00:27:35.671 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:35.671 "is_configured": true, 00:27:35.671 "data_offset": 256, 00:27:35.671 "data_size": 7936 00:27:35.671 }, 00:27:35.671 { 00:27:35.671 "name": "pt2", 00:27:35.671 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:35.671 "is_configured": true, 00:27:35.671 "data_offset": 256, 00:27:35.671 "data_size": 7936 00:27:35.671 } 00:27:35.671 ] 00:27:35.671 } 00:27:35.671 } 00:27:35.671 }' 00:27:35.671 22:56:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:35.671 22:56:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:35.671 pt2' 00:27:35.671 22:56:20 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:35.671 22:56:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:35.671 22:56:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:35.929 22:56:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:35.929 "name": "pt1", 00:27:35.929 "aliases": [ 00:27:35.929 "00000000-0000-0000-0000-000000000001" 00:27:35.929 ], 00:27:35.929 "product_name": "passthru", 00:27:35.929 "block_size": 4096, 00:27:35.929 "num_blocks": 8192, 00:27:35.929 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:35.929 "assigned_rate_limits": { 00:27:35.929 "rw_ios_per_sec": 0, 00:27:35.929 "rw_mbytes_per_sec": 0, 00:27:35.929 "r_mbytes_per_sec": 0, 00:27:35.929 "w_mbytes_per_sec": 0 00:27:35.929 }, 00:27:35.929 "claimed": true, 00:27:35.929 "claim_type": "exclusive_write", 00:27:35.929 "zoned": false, 00:27:35.929 "supported_io_types": { 00:27:35.929 "read": true, 00:27:35.929 "write": true, 00:27:35.929 "unmap": true, 00:27:35.929 "flush": true, 00:27:35.929 "reset": true, 00:27:35.929 "nvme_admin": false, 00:27:35.929 "nvme_io": false, 00:27:35.929 "nvme_io_md": false, 00:27:35.929 "write_zeroes": true, 00:27:35.929 "zcopy": true, 00:27:35.929 "get_zone_info": false, 00:27:35.929 "zone_management": false, 00:27:35.929 "zone_append": false, 00:27:35.929 "compare": false, 00:27:35.929 "compare_and_write": false, 00:27:35.929 "abort": true, 00:27:35.929 "seek_hole": false, 00:27:35.929 "seek_data": false, 00:27:35.929 "copy": true, 00:27:35.929 "nvme_iov_md": false 00:27:35.929 }, 00:27:35.929 "memory_domains": [ 00:27:35.929 { 00:27:35.929 "dma_device_id": "system", 00:27:35.929 "dma_device_type": 1 00:27:35.929 }, 00:27:35.929 { 00:27:35.929 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:35.929 
"dma_device_type": 2 00:27:35.929 } 00:27:35.929 ], 00:27:35.929 "driver_specific": { 00:27:35.929 "passthru": { 00:27:35.929 "name": "pt1", 00:27:35.929 "base_bdev_name": "malloc1" 00:27:35.929 } 00:27:35.929 } 00:27:35.929 }' 00:27:35.929 22:56:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:36.187 22:56:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:36.187 22:56:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:36.187 22:56:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:36.187 22:56:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:36.187 22:56:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:36.187 22:56:20 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:36.187 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:36.187 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:36.187 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:36.444 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:36.444 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:36.444 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:36.444 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:36.444 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:36.702 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:36.702 "name": "pt2", 
00:27:36.702 "aliases": [ 00:27:36.702 "00000000-0000-0000-0000-000000000002" 00:27:36.702 ], 00:27:36.702 "product_name": "passthru", 00:27:36.702 "block_size": 4096, 00:27:36.702 "num_blocks": 8192, 00:27:36.702 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:36.702 "assigned_rate_limits": { 00:27:36.702 "rw_ios_per_sec": 0, 00:27:36.702 "rw_mbytes_per_sec": 0, 00:27:36.702 "r_mbytes_per_sec": 0, 00:27:36.702 "w_mbytes_per_sec": 0 00:27:36.702 }, 00:27:36.702 "claimed": true, 00:27:36.702 "claim_type": "exclusive_write", 00:27:36.702 "zoned": false, 00:27:36.702 "supported_io_types": { 00:27:36.702 "read": true, 00:27:36.702 "write": true, 00:27:36.702 "unmap": true, 00:27:36.702 "flush": true, 00:27:36.702 "reset": true, 00:27:36.702 "nvme_admin": false, 00:27:36.702 "nvme_io": false, 00:27:36.702 "nvme_io_md": false, 00:27:36.702 "write_zeroes": true, 00:27:36.702 "zcopy": true, 00:27:36.702 "get_zone_info": false, 00:27:36.702 "zone_management": false, 00:27:36.702 "zone_append": false, 00:27:36.702 "compare": false, 00:27:36.702 "compare_and_write": false, 00:27:36.702 "abort": true, 00:27:36.702 "seek_hole": false, 00:27:36.702 "seek_data": false, 00:27:36.702 "copy": true, 00:27:36.702 "nvme_iov_md": false 00:27:36.702 }, 00:27:36.702 "memory_domains": [ 00:27:36.702 { 00:27:36.702 "dma_device_id": "system", 00:27:36.702 "dma_device_type": 1 00:27:36.702 }, 00:27:36.702 { 00:27:36.702 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:36.702 "dma_device_type": 2 00:27:36.702 } 00:27:36.702 ], 00:27:36.702 "driver_specific": { 00:27:36.702 "passthru": { 00:27:36.702 "name": "pt2", 00:27:36.702 "base_bdev_name": "malloc2" 00:27:36.702 } 00:27:36.702 } 00:27:36.702 }' 00:27:36.702 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:36.702 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:36.702 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 
4096 == 4096 ]] 00:27:36.702 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:36.702 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:36.702 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:36.702 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:36.960 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:36.960 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:36.960 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:36.960 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:36.960 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:36.960 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:36.960 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:27:37.217 [2024-07-15 22:56:21.969143] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:37.217 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=5a785e90-d252-4024-8f54-fafc7a0f9407 00:27:37.217 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z 5a785e90-d252-4024-8f54-fafc7a0f9407 ']' 00:27:37.217 22:56:21 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:37.474 [2024-07-15 22:56:22.201490] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:37.474 [2024-07-15 22:56:22.201511] 
bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:37.474 [2024-07-15 22:56:22.201570] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:37.474 [2024-07-15 22:56:22.201627] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:37.474 [2024-07-15 22:56:22.201640] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b9f270 name raid_bdev1, state offline 00:27:37.474 22:56:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:37.474 22:56:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:27:37.732 22:56:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:27:37.732 22:56:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:27:37.732 22:56:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:37.732 22:56:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:37.989 22:56:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:37.989 22:56:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:38.247 22:56:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:27:38.247 22:56:22 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:27:38.505 22:56:23 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:27:38.505 22:56:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:38.505 22:56:23 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@648 -- # local es=0 00:27:38.505 22:56:23 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:38.505 22:56:23 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:38.505 22:56:23 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:38.505 22:56:23 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:38.505 22:56:23 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:38.505 22:56:23 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:38.505 22:56:23 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:38.505 22:56:23 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:38.505 22:56:23 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:38.505 22:56:23 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:38.762 [2024-07-15 22:56:23.420664] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:27:38.762 [2024-07-15 22:56:23.422032] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:27:38.762 [2024-07-15 22:56:23.422091] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:27:38.762 [2024-07-15 22:56:23.422133] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:27:38.762 [2024-07-15 22:56:23.422152] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:38.762 [2024-07-15 22:56:23.422162] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b9eff0 name raid_bdev1, state configuring 00:27:38.762 request: 00:27:38.762 { 00:27:38.762 "name": "raid_bdev1", 00:27:38.762 "raid_level": "raid1", 00:27:38.762 "base_bdevs": [ 00:27:38.762 "malloc1", 00:27:38.762 "malloc2" 00:27:38.762 ], 00:27:38.762 "superblock": false, 00:27:38.762 "method": "bdev_raid_create", 00:27:38.762 "req_id": 1 00:27:38.762 } 00:27:38.762 Got JSON-RPC error response 00:27:38.762 response: 00:27:38.762 { 00:27:38.762 "code": -17, 00:27:38.762 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:27:38.762 } 00:27:38.762 22:56:23 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # es=1 00:27:38.762 22:56:23 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:38.762 22:56:23 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:38.762 22:56:23 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:38.762 22:56:23 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:38.762 22:56:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:27:39.020 22:56:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:27:39.020 22:56:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:27:39.020 22:56:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:39.020 [2024-07-15 22:56:23.905898] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:39.020 [2024-07-15 22:56:23.905955] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:39.020 [2024-07-15 22:56:23.905978] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19fb7a0 00:27:39.020 [2024-07-15 22:56:23.905991] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:39.020 [2024-07-15 22:56:23.907595] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:39.020 [2024-07-15 22:56:23.907628] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:39.020 [2024-07-15 22:56:23.907699] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:39.020 [2024-07-15 22:56:23.907726] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:39.020 pt1 00:27:39.020 22:56:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:27:39.020 22:56:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:39.020 22:56:23 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:39.020 22:56:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:39.020 22:56:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:39.020 22:56:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:39.020 22:56:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:39.020 22:56:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:39.020 22:56:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:39.020 22:56:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:39.279 22:56:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:39.279 22:56:23 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:39.279 22:56:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:39.279 "name": "raid_bdev1", 00:27:39.279 "uuid": "5a785e90-d252-4024-8f54-fafc7a0f9407", 00:27:39.279 "strip_size_kb": 0, 00:27:39.279 "state": "configuring", 00:27:39.279 "raid_level": "raid1", 00:27:39.279 "superblock": true, 00:27:39.279 "num_base_bdevs": 2, 00:27:39.279 "num_base_bdevs_discovered": 1, 00:27:39.279 "num_base_bdevs_operational": 2, 00:27:39.279 "base_bdevs_list": [ 00:27:39.279 { 00:27:39.279 "name": "pt1", 00:27:39.279 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:39.279 "is_configured": true, 00:27:39.279 "data_offset": 256, 00:27:39.279 "data_size": 7936 00:27:39.279 }, 00:27:39.279 { 00:27:39.279 "name": null, 00:27:39.279 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:39.279 
"is_configured": false, 00:27:39.279 "data_offset": 256, 00:27:39.279 "data_size": 7936 00:27:39.279 } 00:27:39.279 ] 00:27:39.279 }' 00:27:39.279 22:56:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:39.279 22:56:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:40.215 22:56:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:27:40.215 22:56:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:27:40.215 22:56:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:27:40.215 22:56:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:40.215 [2024-07-15 22:56:24.992804] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:40.215 [2024-07-15 22:56:24.992855] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:40.216 [2024-07-15 22:56:24.992875] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b936f0 00:27:40.216 [2024-07-15 22:56:24.992889] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:40.216 [2024-07-15 22:56:24.993258] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:40.216 [2024-07-15 22:56:24.993279] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:40.216 [2024-07-15 22:56:24.993344] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:40.216 [2024-07-15 22:56:24.993363] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:40.216 [2024-07-15 22:56:24.993466] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b94590 
00:27:40.216 [2024-07-15 22:56:24.993477] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:40.216 [2024-07-15 22:56:24.993646] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19f5540 00:27:40.216 [2024-07-15 22:56:24.993772] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b94590 00:27:40.216 [2024-07-15 22:56:24.993783] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b94590 00:27:40.216 [2024-07-15 22:56:24.993879] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:40.216 pt2 00:27:40.216 22:56:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:27:40.216 22:56:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:27:40.216 22:56:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:40.216 22:56:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:40.216 22:56:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:40.216 22:56:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:40.216 22:56:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:40.216 22:56:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:40.216 22:56:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:40.216 22:56:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:40.216 22:56:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:40.216 22:56:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 
00:27:40.216 22:56:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:40.216 22:56:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:40.475 22:56:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:40.475 "name": "raid_bdev1", 00:27:40.475 "uuid": "5a785e90-d252-4024-8f54-fafc7a0f9407", 00:27:40.475 "strip_size_kb": 0, 00:27:40.475 "state": "online", 00:27:40.475 "raid_level": "raid1", 00:27:40.475 "superblock": true, 00:27:40.475 "num_base_bdevs": 2, 00:27:40.475 "num_base_bdevs_discovered": 2, 00:27:40.475 "num_base_bdevs_operational": 2, 00:27:40.475 "base_bdevs_list": [ 00:27:40.475 { 00:27:40.475 "name": "pt1", 00:27:40.475 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:40.475 "is_configured": true, 00:27:40.475 "data_offset": 256, 00:27:40.475 "data_size": 7936 00:27:40.475 }, 00:27:40.475 { 00:27:40.475 "name": "pt2", 00:27:40.475 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:40.475 "is_configured": true, 00:27:40.475 "data_offset": 256, 00:27:40.475 "data_size": 7936 00:27:40.475 } 00:27:40.475 ] 00:27:40.475 }' 00:27:40.475 22:56:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:40.475 22:56:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:41.042 22:56:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:27:41.042 22:56:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:41.042 22:56:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:41.042 22:56:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:41.042 22:56:25 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:41.042 22:56:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:27:41.042 22:56:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:41.042 22:56:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:41.301 [2024-07-15 22:56:26.095986] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:41.301 22:56:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:41.301 "name": "raid_bdev1", 00:27:41.301 "aliases": [ 00:27:41.301 "5a785e90-d252-4024-8f54-fafc7a0f9407" 00:27:41.301 ], 00:27:41.301 "product_name": "Raid Volume", 00:27:41.301 "block_size": 4096, 00:27:41.301 "num_blocks": 7936, 00:27:41.301 "uuid": "5a785e90-d252-4024-8f54-fafc7a0f9407", 00:27:41.301 "assigned_rate_limits": { 00:27:41.301 "rw_ios_per_sec": 0, 00:27:41.301 "rw_mbytes_per_sec": 0, 00:27:41.301 "r_mbytes_per_sec": 0, 00:27:41.301 "w_mbytes_per_sec": 0 00:27:41.301 }, 00:27:41.301 "claimed": false, 00:27:41.301 "zoned": false, 00:27:41.301 "supported_io_types": { 00:27:41.301 "read": true, 00:27:41.301 "write": true, 00:27:41.301 "unmap": false, 00:27:41.301 "flush": false, 00:27:41.301 "reset": true, 00:27:41.301 "nvme_admin": false, 00:27:41.301 "nvme_io": false, 00:27:41.301 "nvme_io_md": false, 00:27:41.301 "write_zeroes": true, 00:27:41.301 "zcopy": false, 00:27:41.301 "get_zone_info": false, 00:27:41.301 "zone_management": false, 00:27:41.301 "zone_append": false, 00:27:41.301 "compare": false, 00:27:41.301 "compare_and_write": false, 00:27:41.301 "abort": false, 00:27:41.301 "seek_hole": false, 00:27:41.301 "seek_data": false, 00:27:41.301 "copy": false, 00:27:41.301 "nvme_iov_md": false 00:27:41.301 }, 00:27:41.301 "memory_domains": [ 
00:27:41.301 { 00:27:41.301 "dma_device_id": "system", 00:27:41.301 "dma_device_type": 1 00:27:41.301 }, 00:27:41.301 { 00:27:41.301 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:41.301 "dma_device_type": 2 00:27:41.301 }, 00:27:41.301 { 00:27:41.301 "dma_device_id": "system", 00:27:41.301 "dma_device_type": 1 00:27:41.301 }, 00:27:41.301 { 00:27:41.301 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:41.301 "dma_device_type": 2 00:27:41.301 } 00:27:41.301 ], 00:27:41.301 "driver_specific": { 00:27:41.301 "raid": { 00:27:41.301 "uuid": "5a785e90-d252-4024-8f54-fafc7a0f9407", 00:27:41.301 "strip_size_kb": 0, 00:27:41.301 "state": "online", 00:27:41.301 "raid_level": "raid1", 00:27:41.301 "superblock": true, 00:27:41.301 "num_base_bdevs": 2, 00:27:41.301 "num_base_bdevs_discovered": 2, 00:27:41.301 "num_base_bdevs_operational": 2, 00:27:41.301 "base_bdevs_list": [ 00:27:41.301 { 00:27:41.301 "name": "pt1", 00:27:41.301 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:41.301 "is_configured": true, 00:27:41.301 "data_offset": 256, 00:27:41.301 "data_size": 7936 00:27:41.301 }, 00:27:41.301 { 00:27:41.301 "name": "pt2", 00:27:41.301 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:41.301 "is_configured": true, 00:27:41.301 "data_offset": 256, 00:27:41.301 "data_size": 7936 00:27:41.301 } 00:27:41.301 ] 00:27:41.301 } 00:27:41.301 } 00:27:41.301 }' 00:27:41.301 22:56:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:41.301 22:56:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:41.301 pt2' 00:27:41.301 22:56:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:41.301 22:56:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:41.301 
22:56:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:41.561 22:56:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:41.561 "name": "pt1", 00:27:41.561 "aliases": [ 00:27:41.561 "00000000-0000-0000-0000-000000000001" 00:27:41.561 ], 00:27:41.561 "product_name": "passthru", 00:27:41.561 "block_size": 4096, 00:27:41.561 "num_blocks": 8192, 00:27:41.561 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:41.561 "assigned_rate_limits": { 00:27:41.561 "rw_ios_per_sec": 0, 00:27:41.561 "rw_mbytes_per_sec": 0, 00:27:41.561 "r_mbytes_per_sec": 0, 00:27:41.561 "w_mbytes_per_sec": 0 00:27:41.561 }, 00:27:41.561 "claimed": true, 00:27:41.561 "claim_type": "exclusive_write", 00:27:41.561 "zoned": false, 00:27:41.561 "supported_io_types": { 00:27:41.561 "read": true, 00:27:41.561 "write": true, 00:27:41.561 "unmap": true, 00:27:41.561 "flush": true, 00:27:41.561 "reset": true, 00:27:41.561 "nvme_admin": false, 00:27:41.561 "nvme_io": false, 00:27:41.561 "nvme_io_md": false, 00:27:41.561 "write_zeroes": true, 00:27:41.561 "zcopy": true, 00:27:41.561 "get_zone_info": false, 00:27:41.561 "zone_management": false, 00:27:41.561 "zone_append": false, 00:27:41.561 "compare": false, 00:27:41.561 "compare_and_write": false, 00:27:41.561 "abort": true, 00:27:41.561 "seek_hole": false, 00:27:41.561 "seek_data": false, 00:27:41.561 "copy": true, 00:27:41.561 "nvme_iov_md": false 00:27:41.561 }, 00:27:41.561 "memory_domains": [ 00:27:41.561 { 00:27:41.561 "dma_device_id": "system", 00:27:41.561 "dma_device_type": 1 00:27:41.561 }, 00:27:41.561 { 00:27:41.561 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:41.561 "dma_device_type": 2 00:27:41.561 } 00:27:41.561 ], 00:27:41.561 "driver_specific": { 00:27:41.561 "passthru": { 00:27:41.561 "name": "pt1", 00:27:41.561 "base_bdev_name": "malloc1" 00:27:41.561 } 00:27:41.561 } 00:27:41.561 }' 00:27:41.561 22:56:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 
-- # jq .block_size 00:27:41.561 22:56:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:41.820 22:56:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:41.820 22:56:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:41.820 22:56:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:41.820 22:56:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:41.820 22:56:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:41.820 22:56:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:41.820 22:56:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:41.820 22:56:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:41.820 22:56:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:42.078 22:56:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:42.078 22:56:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:42.078 22:56:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:42.078 22:56:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:42.337 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:42.337 "name": "pt2", 00:27:42.337 "aliases": [ 00:27:42.337 "00000000-0000-0000-0000-000000000002" 00:27:42.337 ], 00:27:42.337 "product_name": "passthru", 00:27:42.337 "block_size": 4096, 00:27:42.337 "num_blocks": 8192, 00:27:42.337 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:42.337 "assigned_rate_limits": { 00:27:42.337 
"rw_ios_per_sec": 0, 00:27:42.337 "rw_mbytes_per_sec": 0, 00:27:42.337 "r_mbytes_per_sec": 0, 00:27:42.337 "w_mbytes_per_sec": 0 00:27:42.337 }, 00:27:42.337 "claimed": true, 00:27:42.337 "claim_type": "exclusive_write", 00:27:42.337 "zoned": false, 00:27:42.337 "supported_io_types": { 00:27:42.337 "read": true, 00:27:42.337 "write": true, 00:27:42.337 "unmap": true, 00:27:42.337 "flush": true, 00:27:42.337 "reset": true, 00:27:42.337 "nvme_admin": false, 00:27:42.337 "nvme_io": false, 00:27:42.337 "nvme_io_md": false, 00:27:42.337 "write_zeroes": true, 00:27:42.337 "zcopy": true, 00:27:42.337 "get_zone_info": false, 00:27:42.337 "zone_management": false, 00:27:42.337 "zone_append": false, 00:27:42.337 "compare": false, 00:27:42.337 "compare_and_write": false, 00:27:42.337 "abort": true, 00:27:42.337 "seek_hole": false, 00:27:42.337 "seek_data": false, 00:27:42.337 "copy": true, 00:27:42.337 "nvme_iov_md": false 00:27:42.337 }, 00:27:42.337 "memory_domains": [ 00:27:42.337 { 00:27:42.337 "dma_device_id": "system", 00:27:42.337 "dma_device_type": 1 00:27:42.337 }, 00:27:42.337 { 00:27:42.337 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:42.337 "dma_device_type": 2 00:27:42.337 } 00:27:42.337 ], 00:27:42.337 "driver_specific": { 00:27:42.337 "passthru": { 00:27:42.337 "name": "pt2", 00:27:42.337 "base_bdev_name": "malloc2" 00:27:42.337 } 00:27:42.337 } 00:27:42.337 }' 00:27:42.337 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:42.337 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:42.337 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:42.337 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:42.337 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:42.337 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:27:42.337 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:42.597 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:42.597 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:42.597 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:42.597 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:42.597 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:42.597 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:42.597 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:27:42.857 [2024-07-15 22:56:27.615996] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:42.857 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' 5a785e90-d252-4024-8f54-fafc7a0f9407 '!=' 5a785e90-d252-4024-8f54-fafc7a0f9407 ']' 00:27:42.857 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:27:42.857 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:42.857 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:27:42.857 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:43.116 [2024-07-15 22:56:27.864425] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:27:43.116 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:43.116 22:56:27 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:43.116 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:43.116 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:43.116 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:43.116 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:43.116 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:43.116 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:43.116 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:43.116 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:43.116 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:43.116 22:56:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:43.374 22:56:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:43.374 "name": "raid_bdev1", 00:27:43.374 "uuid": "5a785e90-d252-4024-8f54-fafc7a0f9407", 00:27:43.374 "strip_size_kb": 0, 00:27:43.374 "state": "online", 00:27:43.374 "raid_level": "raid1", 00:27:43.374 "superblock": true, 00:27:43.374 "num_base_bdevs": 2, 00:27:43.374 "num_base_bdevs_discovered": 1, 00:27:43.374 "num_base_bdevs_operational": 1, 00:27:43.374 "base_bdevs_list": [ 00:27:43.374 { 00:27:43.374 "name": null, 00:27:43.374 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:43.374 "is_configured": false, 00:27:43.374 "data_offset": 256, 00:27:43.374 "data_size": 7936 
00:27:43.374 }, 00:27:43.374 { 00:27:43.374 "name": "pt2", 00:27:43.374 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:43.374 "is_configured": true, 00:27:43.374 "data_offset": 256, 00:27:43.374 "data_size": 7936 00:27:43.374 } 00:27:43.374 ] 00:27:43.374 }' 00:27:43.374 22:56:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:43.374 22:56:28 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:43.940 22:56:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:44.198 [2024-07-15 22:56:28.971347] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:44.198 [2024-07-15 22:56:28.971375] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:44.198 [2024-07-15 22:56:28.971432] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:44.198 [2024-07-15 22:56:28.971473] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:44.198 [2024-07-15 22:56:28.971484] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b94590 name raid_bdev1, state offline 00:27:44.198 22:56:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:27:44.198 22:56:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:44.457 22:56:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:27:44.457 22:56:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:27:44.457 22:56:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:27:44.457 22:56:29 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:44.457 22:56:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:44.715 22:56:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:27:44.715 22:56:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:44.715 22:56:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:27:44.715 22:56:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:27:44.715 22:56:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:27:44.715 22:56:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:44.974 [2024-07-15 22:56:29.725313] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:44.974 [2024-07-15 22:56:29.725360] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:44.974 [2024-07-15 22:56:29.725380] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19fc160 00:27:44.974 [2024-07-15 22:56:29.725398] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:44.974 [2024-07-15 22:56:29.727013] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:44.974 [2024-07-15 22:56:29.727042] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:44.974 [2024-07-15 22:56:29.727110] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:44.974 [2024-07-15 22:56:29.727138] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:44.974 
[2024-07-15 22:56:29.727225] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19f2380 00:27:44.974 [2024-07-15 22:56:29.727236] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:44.974 [2024-07-15 22:56:29.727405] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19f3a80 00:27:44.974 [2024-07-15 22:56:29.727528] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19f2380 00:27:44.974 [2024-07-15 22:56:29.727537] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19f2380 00:27:44.974 [2024-07-15 22:56:29.727632] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:44.974 pt2 00:27:44.974 22:56:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:44.974 22:56:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:44.974 22:56:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:44.974 22:56:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:44.974 22:56:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:44.974 22:56:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:44.974 22:56:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:44.974 22:56:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:44.974 22:56:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:44.974 22:56:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:44.974 22:56:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:44.974 22:56:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:45.233 22:56:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:45.233 "name": "raid_bdev1", 00:27:45.233 "uuid": "5a785e90-d252-4024-8f54-fafc7a0f9407", 00:27:45.233 "strip_size_kb": 0, 00:27:45.233 "state": "online", 00:27:45.233 "raid_level": "raid1", 00:27:45.233 "superblock": true, 00:27:45.233 "num_base_bdevs": 2, 00:27:45.233 "num_base_bdevs_discovered": 1, 00:27:45.233 "num_base_bdevs_operational": 1, 00:27:45.233 "base_bdevs_list": [ 00:27:45.233 { 00:27:45.233 "name": null, 00:27:45.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:45.233 "is_configured": false, 00:27:45.233 "data_offset": 256, 00:27:45.233 "data_size": 7936 00:27:45.233 }, 00:27:45.233 { 00:27:45.233 "name": "pt2", 00:27:45.233 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:45.233 "is_configured": true, 00:27:45.233 "data_offset": 256, 00:27:45.233 "data_size": 7936 00:27:45.233 } 00:27:45.233 ] 00:27:45.233 }' 00:27:45.233 22:56:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:45.233 22:56:29 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:45.799 22:56:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:46.058 [2024-07-15 22:56:30.824214] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:46.058 [2024-07-15 22:56:30.824241] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:46.058 [2024-07-15 22:56:30.824302] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:46.058 [2024-07-15 
22:56:30.824346] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:46.058 [2024-07-15 22:56:30.824365] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19f2380 name raid_bdev1, state offline 00:27:46.058 22:56:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:46.058 22:56:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:27:46.318 22:56:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:27:46.318 22:56:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:27:46.318 22:56:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:27:46.318 22:56:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:46.577 [2024-07-15 22:56:31.321502] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:46.577 [2024-07-15 22:56:31.321546] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:46.577 [2024-07-15 22:56:31.321564] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b9e520 00:27:46.577 [2024-07-15 22:56:31.321576] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:46.577 [2024-07-15 22:56:31.323170] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:46.577 [2024-07-15 22:56:31.323198] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:46.577 [2024-07-15 22:56:31.323263] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:46.577 
[2024-07-15 22:56:31.323288] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:46.577 [2024-07-15 22:56:31.323383] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:27:46.577 [2024-07-15 22:56:31.323397] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:46.577 [2024-07-15 22:56:31.323410] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19f33f0 name raid_bdev1, state configuring 00:27:46.577 [2024-07-15 22:56:31.323434] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:46.577 [2024-07-15 22:56:31.323491] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19f52b0 00:27:46.577 [2024-07-15 22:56:31.323501] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:46.577 [2024-07-15 22:56:31.323659] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x19f2350 00:27:46.577 [2024-07-15 22:56:31.323777] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19f52b0 00:27:46.577 [2024-07-15 22:56:31.323787] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x19f52b0 00:27:46.577 [2024-07-15 22:56:31.323883] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:46.577 pt1 00:27:46.577 22:56:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:27:46.577 22:56:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:46.577 22:56:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:46.577 22:56:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:46.577 22:56:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- 
# local raid_level=raid1 00:27:46.577 22:56:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:46.577 22:56:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:46.577 22:56:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:46.577 22:56:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:46.577 22:56:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:46.577 22:56:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:46.577 22:56:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:46.577 22:56:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:46.835 22:56:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:46.835 "name": "raid_bdev1", 00:27:46.835 "uuid": "5a785e90-d252-4024-8f54-fafc7a0f9407", 00:27:46.835 "strip_size_kb": 0, 00:27:46.835 "state": "online", 00:27:46.835 "raid_level": "raid1", 00:27:46.835 "superblock": true, 00:27:46.835 "num_base_bdevs": 2, 00:27:46.835 "num_base_bdevs_discovered": 1, 00:27:46.835 "num_base_bdevs_operational": 1, 00:27:46.835 "base_bdevs_list": [ 00:27:46.835 { 00:27:46.835 "name": null, 00:27:46.835 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:46.835 "is_configured": false, 00:27:46.835 "data_offset": 256, 00:27:46.835 "data_size": 7936 00:27:46.835 }, 00:27:46.835 { 00:27:46.835 "name": "pt2", 00:27:46.835 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:46.835 "is_configured": true, 00:27:46.835 "data_offset": 256, 00:27:46.835 "data_size": 7936 00:27:46.835 } 00:27:46.835 ] 00:27:46.835 }' 00:27:46.835 22:56:31 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:46.836 22:56:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:47.402 22:56:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:27:47.402 22:56:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:27:47.661 22:56:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:27:47.661 22:56:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:47.661 22:56:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:27:47.920 [2024-07-15 22:56:32.653273] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:47.920 22:56:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' 5a785e90-d252-4024-8f54-fafc7a0f9407 '!=' 5a785e90-d252-4024-8f54-fafc7a0f9407 ']' 00:27:47.920 22:56:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 2842027 00:27:47.920 22:56:32 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@948 -- # '[' -z 2842027 ']' 00:27:47.920 22:56:32 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # kill -0 2842027 00:27:47.920 22:56:32 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # uname 00:27:47.920 22:56:32 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:47.920 22:56:32 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2842027 00:27:47.920 22:56:32 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:27:47.920 22:56:32 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:47.920 22:56:32 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2842027' 00:27:47.920 killing process with pid 2842027 00:27:47.920 22:56:32 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # kill 2842027 00:27:47.920 [2024-07-15 22:56:32.720890] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:47.920 [2024-07-15 22:56:32.720951] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:47.920 [2024-07-15 22:56:32.720994] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:47.920 [2024-07-15 22:56:32.721006] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19f52b0 name raid_bdev1, state offline 00:27:47.920 22:56:32 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@972 -- # wait 2842027 00:27:47.920 [2024-07-15 22:56:32.737406] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:48.180 22:56:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:27:48.180 00:27:48.180 real 0m15.469s 00:27:48.180 user 0m28.104s 00:27:48.180 sys 0m2.867s 00:27:48.180 22:56:32 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:48.180 22:56:32 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:48.180 ************************************ 00:27:48.180 END TEST raid_superblock_test_4k 00:27:48.180 ************************************ 00:27:48.180 22:56:32 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:48.180 22:56:32 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:27:48.180 22:56:32 bdev_raid -- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true 
false true 00:27:48.180 22:56:32 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:27:48.180 22:56:32 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:48.180 22:56:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:48.180 ************************************ 00:27:48.180 START TEST raid_rebuild_test_sb_4k 00:27:48.180 ************************************ 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- 
# base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=2844290 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 2844290 /var/tmp/spdk-raid.sock 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 2844290 ']' 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:48.180 22:56:33 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:48.180 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:48.180 22:56:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:48.439 [2024-07-15 22:56:33.103752] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:27:48.439 [2024-07-15 22:56:33.103821] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2844290 ] 00:27:48.439 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:48.439 Zero copy mechanism will not be used. 
00:27:48.439 [2024-07-15 22:56:33.233343] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:48.439 [2024-07-15 22:56:33.340276] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:48.732 [2024-07-15 22:56:33.406351] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:48.732 [2024-07-15 22:56:33.406389] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:49.304 22:56:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:49.304 22:56:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:27:49.304 22:56:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:49.304 22:56:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:27:49.562 BaseBdev1_malloc 00:27:49.562 22:56:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:49.820 [2024-07-15 22:56:34.488475] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:49.820 [2024-07-15 22:56:34.488522] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:49.820 [2024-07-15 22:56:34.488546] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb63d40 00:27:49.820 [2024-07-15 22:56:34.488559] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:49.820 [2024-07-15 22:56:34.490264] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:49.820 [2024-07-15 22:56:34.490292] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:49.820 
BaseBdev1 00:27:49.820 22:56:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:49.820 22:56:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:27:50.079 BaseBdev2_malloc 00:27:50.079 22:56:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:27:50.337 [2024-07-15 22:56:34.990655] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:27:50.337 [2024-07-15 22:56:34.990699] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:50.337 [2024-07-15 22:56:34.990722] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb64860 00:27:50.337 [2024-07-15 22:56:34.990735] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:50.337 [2024-07-15 22:56:34.992322] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:50.337 [2024-07-15 22:56:34.992350] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:50.337 BaseBdev2 00:27:50.337 22:56:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:27:50.337 spare_malloc 00:27:50.594 22:56:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:27:50.594 spare_delay 00:27:50.594 22:56:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:50.852 [2024-07-15 22:56:35.714394] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:50.852 [2024-07-15 22:56:35.714439] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:50.852 [2024-07-15 22:56:35.714460] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd12ec0 00:27:50.852 [2024-07-15 22:56:35.714472] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:50.852 [2024-07-15 22:56:35.716094] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:50.852 [2024-07-15 22:56:35.716124] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:50.852 spare 00:27:50.852 22:56:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:27:51.110 [2024-07-15 22:56:35.951054] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:51.110 [2024-07-15 22:56:35.952422] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:51.110 [2024-07-15 22:56:35.952600] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd14070 00:27:51.110 [2024-07-15 22:56:35.952613] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:51.110 [2024-07-15 22:56:35.952815] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd0d490 00:27:51.110 [2024-07-15 22:56:35.952968] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd14070 00:27:51.110 [2024-07-15 22:56:35.952979] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0xd14070 00:27:51.110 [2024-07-15 22:56:35.953084] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:51.110 22:56:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:51.110 22:56:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:51.110 22:56:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:51.110 22:56:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:51.110 22:56:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:51.110 22:56:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:51.110 22:56:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:51.110 22:56:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:51.110 22:56:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:51.110 22:56:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:51.110 22:56:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:51.110 22:56:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:51.368 22:56:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:51.368 "name": "raid_bdev1", 00:27:51.368 "uuid": "d0bf8156-7f86-41fa-82a7-a4375346fdbc", 00:27:51.368 "strip_size_kb": 0, 00:27:51.368 "state": "online", 00:27:51.368 "raid_level": "raid1", 00:27:51.368 "superblock": true, 00:27:51.368 "num_base_bdevs": 2, 00:27:51.368 
"num_base_bdevs_discovered": 2, 00:27:51.368 "num_base_bdevs_operational": 2, 00:27:51.368 "base_bdevs_list": [ 00:27:51.368 { 00:27:51.368 "name": "BaseBdev1", 00:27:51.368 "uuid": "18fac9da-d8ff-5c71-b561-fce5b43e9a1a", 00:27:51.368 "is_configured": true, 00:27:51.368 "data_offset": 256, 00:27:51.368 "data_size": 7936 00:27:51.368 }, 00:27:51.368 { 00:27:51.368 "name": "BaseBdev2", 00:27:51.368 "uuid": "e098bd5a-7fcc-5817-9241-aebe9967d12b", 00:27:51.368 "is_configured": true, 00:27:51.368 "data_offset": 256, 00:27:51.368 "data_size": 7936 00:27:51.368 } 00:27:51.368 ] 00:27:51.368 }' 00:27:51.368 22:56:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:51.368 22:56:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:51.933 22:56:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:51.933 22:56:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:27:52.190 [2024-07-15 22:56:37.042150] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:52.190 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:27:52.190 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:52.190 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:27:52.448 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:27:52.448 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:27:52.448 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:27:52.448 
22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:27:52.448 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:27:52.448 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:52.448 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:27:52.448 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:52.448 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:27:52.448 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:52.448 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:27:52.448 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:52.448 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:52.448 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:27:52.707 [2024-07-15 22:56:37.531238] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd0d490 00:27:52.707 /dev/nbd0 00:27:52.707 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:52.707 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:52.707 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:52.707 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:27:52.707 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:52.707 22:56:37 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:52.707 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:52.707 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:27:52.707 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:52.707 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:52.707 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:52.707 1+0 records in 00:27:52.707 1+0 records out 00:27:52.707 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00027911 s, 14.7 MB/s 00:27:52.707 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:52.707 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:27:52.707 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:52.707 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:52.707 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:27:52.707 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:52.707 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:52.707 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:27:52.707 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:27:52.707 22:56:37 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:27:53.638 7936+0 records in 00:27:53.638 7936+0 records out 00:27:53.638 32505856 bytes (33 MB, 31 MiB) copied, 0.759732 s, 42.8 MB/s 00:27:53.638 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:27:53.638 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:53.638 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:53.638 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:53.638 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:27:53.638 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:53.638 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:53.896 [2024-07-15 22:56:38.627267] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:53.896 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:53.896 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:53.896 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:53.896 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:53.896 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:53.896 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:53.896 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:27:53.896 22:56:38 bdev_raid.raid_rebuild_test_sb_4k 
-- bdev/nbd_common.sh@45 -- # return 0 00:27:53.896 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:27:54.154 [2024-07-15 22:56:38.855920] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:54.154 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:54.154 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:54.154 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:54.154 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:54.154 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:54.154 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:54.154 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:54.154 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:54.154 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:54.154 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:54.154 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:54.154 22:56:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:54.412 22:56:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:54.412 "name": "raid_bdev1", 00:27:54.412 "uuid": 
"d0bf8156-7f86-41fa-82a7-a4375346fdbc", 00:27:54.412 "strip_size_kb": 0, 00:27:54.412 "state": "online", 00:27:54.412 "raid_level": "raid1", 00:27:54.412 "superblock": true, 00:27:54.412 "num_base_bdevs": 2, 00:27:54.412 "num_base_bdevs_discovered": 1, 00:27:54.412 "num_base_bdevs_operational": 1, 00:27:54.412 "base_bdevs_list": [ 00:27:54.412 { 00:27:54.412 "name": null, 00:27:54.412 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:54.412 "is_configured": false, 00:27:54.412 "data_offset": 256, 00:27:54.412 "data_size": 7936 00:27:54.412 }, 00:27:54.412 { 00:27:54.412 "name": "BaseBdev2", 00:27:54.412 "uuid": "e098bd5a-7fcc-5817-9241-aebe9967d12b", 00:27:54.412 "is_configured": true, 00:27:54.412 "data_offset": 256, 00:27:54.412 "data_size": 7936 00:27:54.412 } 00:27:54.412 ] 00:27:54.412 }' 00:27:54.412 22:56:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:54.412 22:56:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:54.979 22:56:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:55.237 [2024-07-15 22:56:39.942801] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:55.237 [2024-07-15 22:56:39.947716] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd13ce0 00:27:55.237 [2024-07-15 22:56:39.949915] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:55.237 22:56:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:27:56.172 22:56:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:56.172 22:56:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:56.172 22:56:40 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:56.172 22:56:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:56.172 22:56:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:56.172 22:56:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:56.172 22:56:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:56.429 22:56:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:56.429 "name": "raid_bdev1", 00:27:56.429 "uuid": "d0bf8156-7f86-41fa-82a7-a4375346fdbc", 00:27:56.429 "strip_size_kb": 0, 00:27:56.429 "state": "online", 00:27:56.429 "raid_level": "raid1", 00:27:56.429 "superblock": true, 00:27:56.429 "num_base_bdevs": 2, 00:27:56.429 "num_base_bdevs_discovered": 2, 00:27:56.429 "num_base_bdevs_operational": 2, 00:27:56.429 "process": { 00:27:56.429 "type": "rebuild", 00:27:56.429 "target": "spare", 00:27:56.429 "progress": { 00:27:56.429 "blocks": 3072, 00:27:56.429 "percent": 38 00:27:56.429 } 00:27:56.429 }, 00:27:56.429 "base_bdevs_list": [ 00:27:56.429 { 00:27:56.429 "name": "spare", 00:27:56.429 "uuid": "468feb2d-eff8-59d5-a33a-1a240d61b55d", 00:27:56.429 "is_configured": true, 00:27:56.429 "data_offset": 256, 00:27:56.429 "data_size": 7936 00:27:56.429 }, 00:27:56.429 { 00:27:56.429 "name": "BaseBdev2", 00:27:56.429 "uuid": "e098bd5a-7fcc-5817-9241-aebe9967d12b", 00:27:56.429 "is_configured": true, 00:27:56.429 "data_offset": 256, 00:27:56.429 "data_size": 7936 00:27:56.429 } 00:27:56.429 ] 00:27:56.429 }' 00:27:56.429 22:56:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:56.429 22:56:41 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:56.429 22:56:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:56.429 22:56:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:56.429 22:56:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:56.688 [2024-07-15 22:56:41.544759] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:56.688 [2024-07-15 22:56:41.562165] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:56.688 [2024-07-15 22:56:41.562211] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:56.688 [2024-07-15 22:56:41.562227] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:56.688 [2024-07-15 22:56:41.562235] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:56.688 22:56:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:56.688 22:56:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:56.946 22:56:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:56.946 22:56:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:56.946 22:56:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:56.946 22:56:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:56.946 22:56:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:56.946 22:56:41 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:56.946 22:56:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:56.946 22:56:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:56.946 22:56:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:56.946 22:56:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:56.946 22:56:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:56.946 "name": "raid_bdev1", 00:27:56.946 "uuid": "d0bf8156-7f86-41fa-82a7-a4375346fdbc", 00:27:56.946 "strip_size_kb": 0, 00:27:56.946 "state": "online", 00:27:56.946 "raid_level": "raid1", 00:27:56.946 "superblock": true, 00:27:56.946 "num_base_bdevs": 2, 00:27:56.946 "num_base_bdevs_discovered": 1, 00:27:56.946 "num_base_bdevs_operational": 1, 00:27:56.946 "base_bdevs_list": [ 00:27:56.946 { 00:27:56.946 "name": null, 00:27:56.946 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:56.946 "is_configured": false, 00:27:56.946 "data_offset": 256, 00:27:56.946 "data_size": 7936 00:27:56.946 }, 00:27:56.946 { 00:27:56.946 "name": "BaseBdev2", 00:27:56.946 "uuid": "e098bd5a-7fcc-5817-9241-aebe9967d12b", 00:27:56.946 "is_configured": true, 00:27:56.946 "data_offset": 256, 00:27:56.946 "data_size": 7936 00:27:56.946 } 00:27:56.946 ] 00:27:56.946 }' 00:27:56.946 22:56:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:56.946 22:56:41 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:57.880 22:56:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:57.880 22:56:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:27:57.880 22:56:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:57.880 22:56:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:57.880 22:56:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:57.880 22:56:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:57.880 22:56:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:57.880 22:56:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:57.880 "name": "raid_bdev1", 00:27:57.880 "uuid": "d0bf8156-7f86-41fa-82a7-a4375346fdbc", 00:27:57.880 "strip_size_kb": 0, 00:27:57.880 "state": "online", 00:27:57.880 "raid_level": "raid1", 00:27:57.880 "superblock": true, 00:27:57.880 "num_base_bdevs": 2, 00:27:57.880 "num_base_bdevs_discovered": 1, 00:27:57.880 "num_base_bdevs_operational": 1, 00:27:57.880 "base_bdevs_list": [ 00:27:57.880 { 00:27:57.880 "name": null, 00:27:57.880 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:57.880 "is_configured": false, 00:27:57.880 "data_offset": 256, 00:27:57.880 "data_size": 7936 00:27:57.880 }, 00:27:57.880 { 00:27:57.880 "name": "BaseBdev2", 00:27:57.880 "uuid": "e098bd5a-7fcc-5817-9241-aebe9967d12b", 00:27:57.880 "is_configured": true, 00:27:57.880 "data_offset": 256, 00:27:57.880 "data_size": 7936 00:27:57.880 } 00:27:57.880 ] 00:27:57.880 }' 00:27:57.880 22:56:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:57.880 22:56:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:57.880 22:56:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 
00:27:58.137 22:56:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:58.137 22:56:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:58.137 [2024-07-15 22:56:42.994382] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:58.137 [2024-07-15 22:56:42.999333] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd13ce0 00:27:58.137 [2024-07-15 22:56:43.000788] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:58.137 22:56:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:27:59.511 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:59.512 "name": "raid_bdev1", 00:27:59.512 "uuid": "d0bf8156-7f86-41fa-82a7-a4375346fdbc", 00:27:59.512 "strip_size_kb": 0, 00:27:59.512 "state": "online", 00:27:59.512 
"raid_level": "raid1", 00:27:59.512 "superblock": true, 00:27:59.512 "num_base_bdevs": 2, 00:27:59.512 "num_base_bdevs_discovered": 2, 00:27:59.512 "num_base_bdevs_operational": 2, 00:27:59.512 "process": { 00:27:59.512 "type": "rebuild", 00:27:59.512 "target": "spare", 00:27:59.512 "progress": { 00:27:59.512 "blocks": 3072, 00:27:59.512 "percent": 38 00:27:59.512 } 00:27:59.512 }, 00:27:59.512 "base_bdevs_list": [ 00:27:59.512 { 00:27:59.512 "name": "spare", 00:27:59.512 "uuid": "468feb2d-eff8-59d5-a33a-1a240d61b55d", 00:27:59.512 "is_configured": true, 00:27:59.512 "data_offset": 256, 00:27:59.512 "data_size": 7936 00:27:59.512 }, 00:27:59.512 { 00:27:59.512 "name": "BaseBdev2", 00:27:59.512 "uuid": "e098bd5a-7fcc-5817-9241-aebe9967d12b", 00:27:59.512 "is_configured": true, 00:27:59.512 "data_offset": 256, 00:27:59.512 "data_size": 7936 00:27:59.512 } 00:27:59.512 ] 00:27:59.512 }' 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:27:59.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=1052 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:59.512 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:59.771 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:59.771 "name": "raid_bdev1", 00:27:59.771 "uuid": "d0bf8156-7f86-41fa-82a7-a4375346fdbc", 00:27:59.771 "strip_size_kb": 0, 00:27:59.771 "state": "online", 00:27:59.771 "raid_level": "raid1", 00:27:59.771 "superblock": true, 00:27:59.771 "num_base_bdevs": 2, 00:27:59.771 "num_base_bdevs_discovered": 2, 00:27:59.771 "num_base_bdevs_operational": 2, 00:27:59.771 "process": { 00:27:59.771 "type": "rebuild", 00:27:59.771 "target": "spare", 00:27:59.771 "progress": { 00:27:59.771 "blocks": 3840, 00:27:59.771 "percent": 48 00:27:59.771 } 00:27:59.771 }, 00:27:59.771 "base_bdevs_list": [ 00:27:59.771 { 00:27:59.771 "name": "spare", 00:27:59.771 "uuid": "468feb2d-eff8-59d5-a33a-1a240d61b55d", 00:27:59.771 "is_configured": 
true, 00:27:59.771 "data_offset": 256, 00:27:59.771 "data_size": 7936 00:27:59.771 }, 00:27:59.771 { 00:27:59.771 "name": "BaseBdev2", 00:27:59.771 "uuid": "e098bd5a-7fcc-5817-9241-aebe9967d12b", 00:27:59.771 "is_configured": true, 00:27:59.771 "data_offset": 256, 00:27:59.771 "data_size": 7936 00:27:59.771 } 00:27:59.771 ] 00:27:59.771 }' 00:27:59.771 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:59.771 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:59.771 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:00.029 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:00.029 22:56:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:28:00.965 22:56:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:00.965 22:56:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:00.965 22:56:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:00.965 22:56:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:00.965 22:56:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:00.965 22:56:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:00.965 22:56:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:00.965 22:56:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:01.224 22:56:45 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:01.224 "name": "raid_bdev1", 00:28:01.224 "uuid": "d0bf8156-7f86-41fa-82a7-a4375346fdbc", 00:28:01.224 "strip_size_kb": 0, 00:28:01.224 "state": "online", 00:28:01.224 "raid_level": "raid1", 00:28:01.224 "superblock": true, 00:28:01.224 "num_base_bdevs": 2, 00:28:01.224 "num_base_bdevs_discovered": 2, 00:28:01.224 "num_base_bdevs_operational": 2, 00:28:01.224 "process": { 00:28:01.224 "type": "rebuild", 00:28:01.224 "target": "spare", 00:28:01.224 "progress": { 00:28:01.224 "blocks": 7424, 00:28:01.224 "percent": 93 00:28:01.224 } 00:28:01.224 }, 00:28:01.224 "base_bdevs_list": [ 00:28:01.224 { 00:28:01.224 "name": "spare", 00:28:01.224 "uuid": "468feb2d-eff8-59d5-a33a-1a240d61b55d", 00:28:01.224 "is_configured": true, 00:28:01.224 "data_offset": 256, 00:28:01.224 "data_size": 7936 00:28:01.224 }, 00:28:01.224 { 00:28:01.224 "name": "BaseBdev2", 00:28:01.224 "uuid": "e098bd5a-7fcc-5817-9241-aebe9967d12b", 00:28:01.224 "is_configured": true, 00:28:01.224 "data_offset": 256, 00:28:01.224 "data_size": 7936 00:28:01.224 } 00:28:01.224 ] 00:28:01.224 }' 00:28:01.224 22:56:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:01.224 22:56:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:01.224 22:56:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:01.224 22:56:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:01.224 22:56:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:28:01.224 [2024-07-15 22:56:46.125070] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:28:01.224 [2024-07-15 22:56:46.125127] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:28:01.224 [2024-07-15 22:56:46.125210] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:02.778 22:56:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:02.778 22:56:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:02.778 22:56:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:02.778 22:56:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:02.778 22:56:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:02.778 22:56:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:02.778 22:56:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:02.778 22:56:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:02.778 22:56:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:02.778 "name": "raid_bdev1", 00:28:02.778 "uuid": "d0bf8156-7f86-41fa-82a7-a4375346fdbc", 00:28:02.778 "strip_size_kb": 0, 00:28:02.778 "state": "online", 00:28:02.778 "raid_level": "raid1", 00:28:02.778 "superblock": true, 00:28:02.778 "num_base_bdevs": 2, 00:28:02.778 "num_base_bdevs_discovered": 2, 00:28:02.778 "num_base_bdevs_operational": 2, 00:28:02.778 "base_bdevs_list": [ 00:28:02.778 { 00:28:02.778 "name": "spare", 00:28:02.778 "uuid": "468feb2d-eff8-59d5-a33a-1a240d61b55d", 00:28:02.778 "is_configured": true, 00:28:02.778 "data_offset": 256, 00:28:02.778 "data_size": 7936 00:28:02.778 }, 00:28:02.778 { 00:28:02.778 "name": "BaseBdev2", 00:28:02.778 "uuid": "e098bd5a-7fcc-5817-9241-aebe9967d12b", 00:28:02.778 "is_configured": true, 00:28:02.778 "data_offset": 256, 00:28:02.778 
"data_size": 7936 00:28:02.778 } 00:28:02.778 ] 00:28:02.778 }' 00:28:02.778 22:56:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:03.060 22:56:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:28:03.060 22:56:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:03.060 22:56:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:28:03.060 22:56:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:28:03.060 22:56:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:03.060 22:56:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:03.060 22:56:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:03.060 22:56:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:03.060 22:56:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:03.060 22:56:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:03.060 22:56:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:03.318 22:56:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:03.318 "name": "raid_bdev1", 00:28:03.318 "uuid": "d0bf8156-7f86-41fa-82a7-a4375346fdbc", 00:28:03.318 "strip_size_kb": 0, 00:28:03.318 "state": "online", 00:28:03.318 "raid_level": "raid1", 00:28:03.318 "superblock": true, 00:28:03.318 "num_base_bdevs": 2, 00:28:03.318 "num_base_bdevs_discovered": 2, 00:28:03.318 "num_base_bdevs_operational": 2, 00:28:03.318 
"base_bdevs_list": [ 00:28:03.318 { 00:28:03.318 "name": "spare", 00:28:03.318 "uuid": "468feb2d-eff8-59d5-a33a-1a240d61b55d", 00:28:03.318 "is_configured": true, 00:28:03.318 "data_offset": 256, 00:28:03.318 "data_size": 7936 00:28:03.318 }, 00:28:03.318 { 00:28:03.318 "name": "BaseBdev2", 00:28:03.318 "uuid": "e098bd5a-7fcc-5817-9241-aebe9967d12b", 00:28:03.318 "is_configured": true, 00:28:03.318 "data_offset": 256, 00:28:03.318 "data_size": 7936 00:28:03.318 } 00:28:03.318 ] 00:28:03.318 }' 00:28:03.318 22:56:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:03.318 22:56:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:03.318 22:56:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:03.318 22:56:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:03.318 22:56:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:03.318 22:56:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:03.318 22:56:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:03.318 22:56:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:03.318 22:56:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:03.318 22:56:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:03.318 22:56:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:03.319 22:56:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:03.319 22:56:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:28:03.319 22:56:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:03.319 22:56:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:03.319 22:56:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:03.886 22:56:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:03.886 "name": "raid_bdev1", 00:28:03.886 "uuid": "d0bf8156-7f86-41fa-82a7-a4375346fdbc", 00:28:03.886 "strip_size_kb": 0, 00:28:03.886 "state": "online", 00:28:03.886 "raid_level": "raid1", 00:28:03.886 "superblock": true, 00:28:03.886 "num_base_bdevs": 2, 00:28:03.886 "num_base_bdevs_discovered": 2, 00:28:03.886 "num_base_bdevs_operational": 2, 00:28:03.886 "base_bdevs_list": [ 00:28:03.886 { 00:28:03.886 "name": "spare", 00:28:03.886 "uuid": "468feb2d-eff8-59d5-a33a-1a240d61b55d", 00:28:03.886 "is_configured": true, 00:28:03.886 "data_offset": 256, 00:28:03.886 "data_size": 7936 00:28:03.886 }, 00:28:03.886 { 00:28:03.886 "name": "BaseBdev2", 00:28:03.886 "uuid": "e098bd5a-7fcc-5817-9241-aebe9967d12b", 00:28:03.886 "is_configured": true, 00:28:03.886 "data_offset": 256, 00:28:03.886 "data_size": 7936 00:28:03.886 } 00:28:03.886 ] 00:28:03.886 }' 00:28:03.886 22:56:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:03.886 22:56:48 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:04.452 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:04.710 [2024-07-15 22:56:49.370550] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:04.710 [2024-07-15 22:56:49.370577] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:28:04.710 [2024-07-15 22:56:49.370641] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:04.710 [2024-07-15 22:56:49.370698] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:04.710 [2024-07-15 22:56:49.370710] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd14070 name raid_bdev1, state offline 00:28:04.710 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:04.710 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:28:04.969 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:28:04.969 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:28:04.969 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:28:04.969 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:28:04.969 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:04.969 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:28:04.969 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:04.969 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:04.969 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:04.969 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:28:04.969 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 
0 )) 00:28:04.969 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:04.969 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:28:05.228 /dev/nbd0 00:28:05.228 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:05.228 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:05.228 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:28:05.228 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:28:05.228 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:05.228 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:05.228 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:28:05.228 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:28:05.228 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:05.228 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:05.228 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:05.228 1+0 records in 00:28:05.228 1+0 records out 00:28:05.228 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000244064 s, 16.8 MB/s 00:28:05.228 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:05.228 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@884 -- # size=4096 00:28:05.228 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:05.228 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:05.228 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:28:05.228 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:05.228 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:05.228 22:56:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:28:05.486 /dev/nbd1 00:28:05.486 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:28:05.486 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:28:05.486 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:28:05.486 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:28:05.486 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:05.486 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:05.486 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:28:05.486 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:28:05.486 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:05.486 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:05.486 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:05.486 1+0 records in 00:28:05.486 1+0 records out 00:28:05.486 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000325902 s, 12.6 MB/s 00:28:05.487 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:05.487 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:28:05.487 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:05.487 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:05.487 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:28:05.487 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:05.487 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:28:05.487 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:28:05.487 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:28:05.487 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:05.487 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:28:05.487 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:05.487 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:28:05.487 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:05.487 22:56:50 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:05.744 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:05.744 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:05.744 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:05.744 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:05.744 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:05.744 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:05.744 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:28:05.744 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:28:05.744 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:05.744 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:28:06.001 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:06.002 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:06.002 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:06.002 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:06.002 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:06.002 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:06.002 22:56:50 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:28:06.002 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:28:06.002 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:28:06.002 22:56:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:06.260 22:56:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:06.519 [2024-07-15 22:56:51.324482] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:06.519 [2024-07-15 22:56:51.324528] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:06.519 [2024-07-15 22:56:51.324548] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd13500 00:28:06.519 [2024-07-15 22:56:51.324561] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:06.519 [2024-07-15 22:56:51.326208] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:06.519 [2024-07-15 22:56:51.326238] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:06.519 [2024-07-15 22:56:51.326320] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:06.519 [2024-07-15 22:56:51.326347] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:06.519 [2024-07-15 22:56:51.326448] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:06.519 spare 00:28:06.519 22:56:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:06.519 22:56:51 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:06.519 22:56:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:06.519 22:56:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:06.519 22:56:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:06.519 22:56:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:06.519 22:56:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:06.519 22:56:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:06.519 22:56:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:06.519 22:56:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:06.519 22:56:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:06.519 22:56:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:06.519 [2024-07-15 22:56:51.426765] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd12260 00:28:06.519 [2024-07-15 22:56:51.426785] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:06.519 [2024-07-15 22:56:51.426988] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd0d490 00:28:06.519 [2024-07-15 22:56:51.427142] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd12260 00:28:06.519 [2024-07-15 22:56:51.427152] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd12260 00:28:06.519 [2024-07-15 22:56:51.427256] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:06.776 22:56:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:06.776 "name": "raid_bdev1", 00:28:06.776 "uuid": "d0bf8156-7f86-41fa-82a7-a4375346fdbc", 00:28:06.776 "strip_size_kb": 0, 00:28:06.776 "state": "online", 00:28:06.776 "raid_level": "raid1", 00:28:06.776 "superblock": true, 00:28:06.776 "num_base_bdevs": 2, 00:28:06.776 "num_base_bdevs_discovered": 2, 00:28:06.776 "num_base_bdevs_operational": 2, 00:28:06.776 "base_bdevs_list": [ 00:28:06.776 { 00:28:06.776 "name": "spare", 00:28:06.776 "uuid": "468feb2d-eff8-59d5-a33a-1a240d61b55d", 00:28:06.776 "is_configured": true, 00:28:06.776 "data_offset": 256, 00:28:06.776 "data_size": 7936 00:28:06.776 }, 00:28:06.776 { 00:28:06.776 "name": "BaseBdev2", 00:28:06.776 "uuid": "e098bd5a-7fcc-5817-9241-aebe9967d12b", 00:28:06.776 "is_configured": true, 00:28:06.776 "data_offset": 256, 00:28:06.776 "data_size": 7936 00:28:06.776 } 00:28:06.776 ] 00:28:06.776 }' 00:28:06.776 22:56:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:06.776 22:56:51 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:07.709 22:56:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:07.709 22:56:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:07.709 22:56:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:07.709 22:56:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:07.709 22:56:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:07.709 22:56:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:28:07.709 22:56:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:07.967 22:56:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:07.967 "name": "raid_bdev1", 00:28:07.967 "uuid": "d0bf8156-7f86-41fa-82a7-a4375346fdbc", 00:28:07.967 "strip_size_kb": 0, 00:28:07.967 "state": "online", 00:28:07.967 "raid_level": "raid1", 00:28:07.967 "superblock": true, 00:28:07.967 "num_base_bdevs": 2, 00:28:07.967 "num_base_bdevs_discovered": 2, 00:28:07.967 "num_base_bdevs_operational": 2, 00:28:07.967 "base_bdevs_list": [ 00:28:07.967 { 00:28:07.967 "name": "spare", 00:28:07.967 "uuid": "468feb2d-eff8-59d5-a33a-1a240d61b55d", 00:28:07.967 "is_configured": true, 00:28:07.967 "data_offset": 256, 00:28:07.967 "data_size": 7936 00:28:07.967 }, 00:28:07.967 { 00:28:07.967 "name": "BaseBdev2", 00:28:07.967 "uuid": "e098bd5a-7fcc-5817-9241-aebe9967d12b", 00:28:07.967 "is_configured": true, 00:28:07.967 "data_offset": 256, 00:28:07.967 "data_size": 7936 00:28:07.967 } 00:28:07.967 ] 00:28:07.967 }' 00:28:07.967 22:56:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:07.967 22:56:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:07.967 22:56:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:07.967 22:56:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:07.967 22:56:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:07.967 22:56:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:28:08.534 22:56:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 
00:28:08.534 22:56:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:08.797 [2024-07-15 22:56:53.618716] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:08.797 22:56:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:08.797 22:56:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:08.797 22:56:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:08.797 22:56:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:08.797 22:56:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:08.797 22:56:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:08.797 22:56:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:08.797 22:56:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:08.797 22:56:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:08.797 22:56:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:08.797 22:56:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:08.797 22:56:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:09.364 22:56:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:09.364 "name": "raid_bdev1", 00:28:09.364 "uuid": "d0bf8156-7f86-41fa-82a7-a4375346fdbc", 00:28:09.364 
"strip_size_kb": 0, 00:28:09.364 "state": "online", 00:28:09.364 "raid_level": "raid1", 00:28:09.364 "superblock": true, 00:28:09.364 "num_base_bdevs": 2, 00:28:09.364 "num_base_bdevs_discovered": 1, 00:28:09.364 "num_base_bdevs_operational": 1, 00:28:09.364 "base_bdevs_list": [ 00:28:09.364 { 00:28:09.364 "name": null, 00:28:09.364 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:09.364 "is_configured": false, 00:28:09.364 "data_offset": 256, 00:28:09.364 "data_size": 7936 00:28:09.364 }, 00:28:09.364 { 00:28:09.364 "name": "BaseBdev2", 00:28:09.364 "uuid": "e098bd5a-7fcc-5817-9241-aebe9967d12b", 00:28:09.364 "is_configured": true, 00:28:09.364 "data_offset": 256, 00:28:09.364 "data_size": 7936 00:28:09.364 } 00:28:09.364 ] 00:28:09.364 }' 00:28:09.364 22:56:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:09.364 22:56:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:09.930 22:56:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:10.498 [2024-07-15 22:56:55.259074] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:10.499 [2024-07-15 22:56:55.259238] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:10.499 [2024-07-15 22:56:55.259256] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:28:10.499 [2024-07-15 22:56:55.259283] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:10.499 [2024-07-15 22:56:55.264812] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd0d490 00:28:10.499 [2024-07-15 22:56:55.267210] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:10.499 22:56:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:28:11.436 22:56:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:11.436 22:56:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:11.436 22:56:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:11.436 22:56:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:11.436 22:56:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:11.436 22:56:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:11.436 22:56:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:12.004 22:56:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:12.004 "name": "raid_bdev1", 00:28:12.004 "uuid": "d0bf8156-7f86-41fa-82a7-a4375346fdbc", 00:28:12.004 "strip_size_kb": 0, 00:28:12.004 "state": "online", 00:28:12.004 "raid_level": "raid1", 00:28:12.004 "superblock": true, 00:28:12.004 "num_base_bdevs": 2, 00:28:12.004 "num_base_bdevs_discovered": 2, 00:28:12.004 "num_base_bdevs_operational": 2, 00:28:12.004 "process": { 00:28:12.004 "type": "rebuild", 00:28:12.004 "target": "spare", 00:28:12.004 "progress": { 00:28:12.004 "blocks": 3840, 
00:28:12.004 "percent": 48 00:28:12.004 } 00:28:12.004 }, 00:28:12.004 "base_bdevs_list": [ 00:28:12.004 { 00:28:12.004 "name": "spare", 00:28:12.004 "uuid": "468feb2d-eff8-59d5-a33a-1a240d61b55d", 00:28:12.004 "is_configured": true, 00:28:12.004 "data_offset": 256, 00:28:12.004 "data_size": 7936 00:28:12.004 }, 00:28:12.004 { 00:28:12.004 "name": "BaseBdev2", 00:28:12.004 "uuid": "e098bd5a-7fcc-5817-9241-aebe9967d12b", 00:28:12.004 "is_configured": true, 00:28:12.004 "data_offset": 256, 00:28:12.004 "data_size": 7936 00:28:12.004 } 00:28:12.004 ] 00:28:12.004 }' 00:28:12.004 22:56:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:12.004 22:56:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:12.004 22:56:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:12.004 22:56:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:12.004 22:56:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:12.263 [2024-07-15 22:56:57.122725] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:12.521 [2024-07-15 22:56:57.182108] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:12.521 [2024-07-15 22:56:57.182156] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:12.521 [2024-07-15 22:56:57.182171] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:12.521 [2024-07-15 22:56:57.182180] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:12.521 22:56:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online 
raid1 0 1 00:28:12.521 22:56:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:12.521 22:56:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:12.521 22:56:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:12.521 22:56:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:12.521 22:56:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:12.521 22:56:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:12.521 22:56:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:12.521 22:56:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:12.521 22:56:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:12.521 22:56:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:12.521 22:56:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:13.089 22:56:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:13.089 "name": "raid_bdev1", 00:28:13.089 "uuid": "d0bf8156-7f86-41fa-82a7-a4375346fdbc", 00:28:13.089 "strip_size_kb": 0, 00:28:13.089 "state": "online", 00:28:13.089 "raid_level": "raid1", 00:28:13.089 "superblock": true, 00:28:13.089 "num_base_bdevs": 2, 00:28:13.089 "num_base_bdevs_discovered": 1, 00:28:13.089 "num_base_bdevs_operational": 1, 00:28:13.089 "base_bdevs_list": [ 00:28:13.089 { 00:28:13.089 "name": null, 00:28:13.089 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:13.089 "is_configured": false, 00:28:13.089 "data_offset": 
256, 00:28:13.089 "data_size": 7936 00:28:13.089 }, 00:28:13.089 { 00:28:13.089 "name": "BaseBdev2", 00:28:13.089 "uuid": "e098bd5a-7fcc-5817-9241-aebe9967d12b", 00:28:13.089 "is_configured": true, 00:28:13.089 "data_offset": 256, 00:28:13.089 "data_size": 7936 00:28:13.089 } 00:28:13.089 ] 00:28:13.089 }' 00:28:13.089 22:56:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:13.089 22:56:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:13.657 22:56:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:13.916 [2024-07-15 22:56:58.791448] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:13.916 [2024-07-15 22:56:58.791501] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:13.916 [2024-07-15 22:56:58.791524] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd13730 00:28:13.916 [2024-07-15 22:56:58.791537] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:13.916 [2024-07-15 22:56:58.791921] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:13.916 [2024-07-15 22:56:58.791950] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:13.916 [2024-07-15 22:56:58.792036] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:13.916 [2024-07-15 22:56:58.792050] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:13.916 [2024-07-15 22:56:58.792069] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:28:13.916 [2024-07-15 22:56:58.792087] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:13.916 [2024-07-15 22:56:58.797500] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd163d0 00:28:13.916 spare 00:28:13.916 [2024-07-15 22:56:58.798994] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:14.174 22:56:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:28:15.109 22:56:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:15.109 22:56:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:15.109 22:56:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:15.109 22:56:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:15.109 22:56:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:15.109 22:56:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:15.109 22:56:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:15.674 22:57:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:15.674 "name": "raid_bdev1", 00:28:15.674 "uuid": "d0bf8156-7f86-41fa-82a7-a4375346fdbc", 00:28:15.674 "strip_size_kb": 0, 00:28:15.674 "state": "online", 00:28:15.674 "raid_level": "raid1", 00:28:15.674 "superblock": true, 00:28:15.674 "num_base_bdevs": 2, 00:28:15.674 "num_base_bdevs_discovered": 2, 00:28:15.674 "num_base_bdevs_operational": 2, 00:28:15.674 "process": { 00:28:15.674 "type": "rebuild", 00:28:15.674 "target": "spare", 00:28:15.674 "progress": { 00:28:15.674 
"blocks": 3840, 00:28:15.674 "percent": 48 00:28:15.674 } 00:28:15.674 }, 00:28:15.674 "base_bdevs_list": [ 00:28:15.674 { 00:28:15.674 "name": "spare", 00:28:15.674 "uuid": "468feb2d-eff8-59d5-a33a-1a240d61b55d", 00:28:15.674 "is_configured": true, 00:28:15.674 "data_offset": 256, 00:28:15.674 "data_size": 7936 00:28:15.674 }, 00:28:15.674 { 00:28:15.674 "name": "BaseBdev2", 00:28:15.674 "uuid": "e098bd5a-7fcc-5817-9241-aebe9967d12b", 00:28:15.674 "is_configured": true, 00:28:15.674 "data_offset": 256, 00:28:15.674 "data_size": 7936 00:28:15.674 } 00:28:15.674 ] 00:28:15.674 }' 00:28:15.674 22:57:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:15.674 22:57:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:15.674 22:57:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:15.674 22:57:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:15.674 22:57:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:16.239 [2024-07-15 22:57:00.956182] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:16.239 [2024-07-15 22:57:01.016295] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:16.239 [2024-07-15 22:57:01.016344] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:16.239 [2024-07-15 22:57:01.016359] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:16.239 [2024-07-15 22:57:01.016368] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:16.240 22:57:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:28:16.240 22:57:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:16.240 22:57:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:16.240 22:57:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:16.240 22:57:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:16.240 22:57:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:16.240 22:57:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:16.240 22:57:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:16.240 22:57:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:16.240 22:57:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:16.240 22:57:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:16.240 22:57:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:16.497 22:57:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:16.497 "name": "raid_bdev1", 00:28:16.497 "uuid": "d0bf8156-7f86-41fa-82a7-a4375346fdbc", 00:28:16.497 "strip_size_kb": 0, 00:28:16.497 "state": "online", 00:28:16.497 "raid_level": "raid1", 00:28:16.497 "superblock": true, 00:28:16.497 "num_base_bdevs": 2, 00:28:16.497 "num_base_bdevs_discovered": 1, 00:28:16.497 "num_base_bdevs_operational": 1, 00:28:16.497 "base_bdevs_list": [ 00:28:16.497 { 00:28:16.497 "name": null, 00:28:16.497 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:16.498 "is_configured": false, 00:28:16.498 
"data_offset": 256, 00:28:16.498 "data_size": 7936 00:28:16.498 }, 00:28:16.498 { 00:28:16.498 "name": "BaseBdev2", 00:28:16.498 "uuid": "e098bd5a-7fcc-5817-9241-aebe9967d12b", 00:28:16.498 "is_configured": true, 00:28:16.498 "data_offset": 256, 00:28:16.498 "data_size": 7936 00:28:16.498 } 00:28:16.498 ] 00:28:16.498 }' 00:28:16.498 22:57:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:16.498 22:57:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:17.072 22:57:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:17.072 22:57:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:17.072 22:57:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:17.072 22:57:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:17.072 22:57:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:17.072 22:57:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:17.072 22:57:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:17.332 22:57:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:17.332 "name": "raid_bdev1", 00:28:17.332 "uuid": "d0bf8156-7f86-41fa-82a7-a4375346fdbc", 00:28:17.332 "strip_size_kb": 0, 00:28:17.332 "state": "online", 00:28:17.332 "raid_level": "raid1", 00:28:17.332 "superblock": true, 00:28:17.332 "num_base_bdevs": 2, 00:28:17.332 "num_base_bdevs_discovered": 1, 00:28:17.332 "num_base_bdevs_operational": 1, 00:28:17.332 "base_bdevs_list": [ 00:28:17.332 { 00:28:17.332 "name": null, 00:28:17.332 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:28:17.332 "is_configured": false, 00:28:17.332 "data_offset": 256, 00:28:17.332 "data_size": 7936 00:28:17.332 }, 00:28:17.332 { 00:28:17.332 "name": "BaseBdev2", 00:28:17.332 "uuid": "e098bd5a-7fcc-5817-9241-aebe9967d12b", 00:28:17.332 "is_configured": true, 00:28:17.332 "data_offset": 256, 00:28:17.332 "data_size": 7936 00:28:17.332 } 00:28:17.332 ] 00:28:17.332 }' 00:28:17.332 22:57:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:17.332 22:57:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:17.332 22:57:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:17.332 22:57:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:17.332 22:57:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:28:17.591 22:57:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:18.159 [2024-07-15 22:57:02.958429] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:18.159 [2024-07-15 22:57:02.958483] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:18.159 [2024-07-15 22:57:02.958505] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd0dda0 00:28:18.159 [2024-07-15 22:57:02.958517] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:18.159 [2024-07-15 22:57:02.958876] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:18.159 [2024-07-15 22:57:02.958893] vbdev_passthru.c: 709:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: BaseBdev1 00:28:18.159 [2024-07-15 22:57:02.958973] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:28:18.159 [2024-07-15 22:57:02.958987] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:18.159 [2024-07-15 22:57:02.958997] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:18.159 BaseBdev1 00:28:18.159 22:57:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:28:19.095 22:57:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:19.095 22:57:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:19.095 22:57:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:19.095 22:57:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:19.095 22:57:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:19.095 22:57:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:19.095 22:57:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:19.095 22:57:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:19.095 22:57:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:19.095 22:57:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:19.095 22:57:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:19.095 22:57:03 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:19.662 22:57:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:19.662 "name": "raid_bdev1", 00:28:19.662 "uuid": "d0bf8156-7f86-41fa-82a7-a4375346fdbc", 00:28:19.662 "strip_size_kb": 0, 00:28:19.662 "state": "online", 00:28:19.662 "raid_level": "raid1", 00:28:19.662 "superblock": true, 00:28:19.662 "num_base_bdevs": 2, 00:28:19.662 "num_base_bdevs_discovered": 1, 00:28:19.662 "num_base_bdevs_operational": 1, 00:28:19.662 "base_bdevs_list": [ 00:28:19.662 { 00:28:19.662 "name": null, 00:28:19.662 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:19.662 "is_configured": false, 00:28:19.662 "data_offset": 256, 00:28:19.662 "data_size": 7936 00:28:19.662 }, 00:28:19.662 { 00:28:19.662 "name": "BaseBdev2", 00:28:19.662 "uuid": "e098bd5a-7fcc-5817-9241-aebe9967d12b", 00:28:19.662 "is_configured": true, 00:28:19.662 "data_offset": 256, 00:28:19.662 "data_size": 7936 00:28:19.662 } 00:28:19.662 ] 00:28:19.662 }' 00:28:19.662 22:57:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:19.662 22:57:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:20.231 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:20.231 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:20.231 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:20.231 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:20.231 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:20.231 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:20.231 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:20.490 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:20.490 "name": "raid_bdev1", 00:28:20.490 "uuid": "d0bf8156-7f86-41fa-82a7-a4375346fdbc", 00:28:20.490 "strip_size_kb": 0, 00:28:20.490 "state": "online", 00:28:20.490 "raid_level": "raid1", 00:28:20.490 "superblock": true, 00:28:20.490 "num_base_bdevs": 2, 00:28:20.490 "num_base_bdevs_discovered": 1, 00:28:20.490 "num_base_bdevs_operational": 1, 00:28:20.490 "base_bdevs_list": [ 00:28:20.490 { 00:28:20.490 "name": null, 00:28:20.490 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:20.490 "is_configured": false, 00:28:20.490 "data_offset": 256, 00:28:20.490 "data_size": 7936 00:28:20.490 }, 00:28:20.490 { 00:28:20.490 "name": "BaseBdev2", 00:28:20.490 "uuid": "e098bd5a-7fcc-5817-9241-aebe9967d12b", 00:28:20.490 "is_configured": true, 00:28:20.490 "data_offset": 256, 00:28:20.490 "data_size": 7936 00:28:20.490 } 00:28:20.490 ] 00:28:20.490 }' 00:28:20.490 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:20.749 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:20.749 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:20.749 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:20.749 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:20.749 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@648 -- # local es=0 00:28:20.749 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:20.749 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:20.749 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:20.749 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:20.749 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:20.749 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:20.749 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:20.749 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:20.749 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:20.749 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:21.008 [2024-07-15 22:57:05.677701] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:21.008 [2024-07-15 22:57:05.677827] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:21.008 [2024-07-15 22:57:05.677854] bdev_raid.c:3581:raid_bdev_examine_sb: 
*DEBUG*: raid superblock does not contain this bdev's uuid 00:28:21.008 request: 00:28:21.008 { 00:28:21.008 "base_bdev": "BaseBdev1", 00:28:21.008 "raid_bdev": "raid_bdev1", 00:28:21.008 "method": "bdev_raid_add_base_bdev", 00:28:21.008 "req_id": 1 00:28:21.008 } 00:28:21.008 Got JSON-RPC error response 00:28:21.008 response: 00:28:21.008 { 00:28:21.008 "code": -22, 00:28:21.008 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:28:21.008 } 00:28:21.008 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # es=1 00:28:21.008 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:21.008 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:21.008 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:21.008 22:57:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:28:21.945 22:57:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:21.945 22:57:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:21.945 22:57:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:21.945 22:57:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:21.945 22:57:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:21.945 22:57:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:21.945 22:57:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:21.945 22:57:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:21.945 22:57:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:28:21.945 22:57:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:21.945 22:57:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:21.945 22:57:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:22.205 22:57:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:22.205 "name": "raid_bdev1", 00:28:22.205 "uuid": "d0bf8156-7f86-41fa-82a7-a4375346fdbc", 00:28:22.205 "strip_size_kb": 0, 00:28:22.205 "state": "online", 00:28:22.205 "raid_level": "raid1", 00:28:22.205 "superblock": true, 00:28:22.205 "num_base_bdevs": 2, 00:28:22.205 "num_base_bdevs_discovered": 1, 00:28:22.205 "num_base_bdevs_operational": 1, 00:28:22.205 "base_bdevs_list": [ 00:28:22.205 { 00:28:22.205 "name": null, 00:28:22.205 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:22.205 "is_configured": false, 00:28:22.205 "data_offset": 256, 00:28:22.205 "data_size": 7936 00:28:22.205 }, 00:28:22.205 { 00:28:22.205 "name": "BaseBdev2", 00:28:22.205 "uuid": "e098bd5a-7fcc-5817-9241-aebe9967d12b", 00:28:22.205 "is_configured": true, 00:28:22.205 "data_offset": 256, 00:28:22.205 "data_size": 7936 00:28:22.205 } 00:28:22.205 ] 00:28:22.205 }' 00:28:22.205 22:57:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:22.205 22:57:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:22.773 22:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:22.773 22:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:22.773 22:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:22.773 
22:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:22.773 22:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:22.773 22:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:22.773 22:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:23.032 22:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:23.032 "name": "raid_bdev1", 00:28:23.032 "uuid": "d0bf8156-7f86-41fa-82a7-a4375346fdbc", 00:28:23.032 "strip_size_kb": 0, 00:28:23.032 "state": "online", 00:28:23.032 "raid_level": "raid1", 00:28:23.032 "superblock": true, 00:28:23.032 "num_base_bdevs": 2, 00:28:23.032 "num_base_bdevs_discovered": 1, 00:28:23.032 "num_base_bdevs_operational": 1, 00:28:23.032 "base_bdevs_list": [ 00:28:23.032 { 00:28:23.032 "name": null, 00:28:23.032 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:23.032 "is_configured": false, 00:28:23.032 "data_offset": 256, 00:28:23.032 "data_size": 7936 00:28:23.032 }, 00:28:23.032 { 00:28:23.032 "name": "BaseBdev2", 00:28:23.032 "uuid": "e098bd5a-7fcc-5817-9241-aebe9967d12b", 00:28:23.032 "is_configured": true, 00:28:23.032 "data_offset": 256, 00:28:23.032 "data_size": 7936 00:28:23.032 } 00:28:23.032 ] 00:28:23.032 }' 00:28:23.032 22:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:23.032 22:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:23.032 22:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:23.032 22:57:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:23.032 22:57:07 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 2844290 00:28:23.032 22:57:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 2844290 ']' 00:28:23.032 22:57:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 2844290 00:28:23.032 22:57:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:28:23.032 22:57:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:23.032 22:57:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2844290 00:28:23.032 22:57:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:23.033 22:57:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:23.033 22:57:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2844290' 00:28:23.033 killing process with pid 2844290 00:28:23.033 22:57:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@967 -- # kill 2844290 00:28:23.033 Received shutdown signal, test time was about 60.000000 seconds 00:28:23.033 00:28:23.033 Latency(us) 00:28:23.033 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:23.033 =================================================================================================================== 00:28:23.033 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:28:23.033 [2024-07-15 22:57:07.928900] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:23.033 [2024-07-15 22:57:07.929005] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:23.033 [2024-07-15 22:57:07.929049] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:23.033 [2024-07-15 22:57:07.929063] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd12260 name raid_bdev1, state offline 00:28:23.033 22:57:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@972 -- # wait 2844290 00:28:23.292 [2024-07-15 22:57:07.956637] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:23.292 22:57:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:28:23.292 00:28:23.292 real 0m35.146s 00:28:23.292 user 0m56.047s 00:28:23.292 sys 0m5.485s 00:28:23.292 22:57:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:23.292 22:57:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:23.292 ************************************ 00:28:23.292 END TEST raid_rebuild_test_sb_4k 00:28:23.292 ************************************ 00:28:23.552 22:57:08 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:23.552 22:57:08 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:28:23.552 22:57:08 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:28:23.552 22:57:08 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:28:23.552 22:57:08 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:23.552 22:57:08 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:23.552 ************************************ 00:28:23.552 START TEST raid_state_function_test_sb_md_separate 00:28:23.552 ************************************ 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:28:23.552 
22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:28:23.552 22:57:08 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=2849809 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2849809' 00:28:23.552 Process raid pid: 2849809 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 2849809 /var/tmp/spdk-raid.sock 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 2849809 ']' 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:23.552 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:23.552 22:57:08 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:23.552 [2024-07-15 22:57:08.335918] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:28:23.552 [2024-07-15 22:57:08.335988] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:23.552 [2024-07-15 22:57:08.458175] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:23.811 [2024-07-15 22:57:08.559185] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:23.811 [2024-07-15 22:57:08.619474] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:23.811 [2024-07-15 22:57:08.619510] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:24.379 22:57:09 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:24.379 22:57:09 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:28:24.379 22:57:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:24.638 [2024-07-15 22:57:09.512287] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:24.638 [2024-07-15 22:57:09.512327] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:24.638 [2024-07-15 22:57:09.512340] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:24.638 
[2024-07-15 22:57:09.512352] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:24.638 22:57:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:24.638 22:57:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:24.638 22:57:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:24.638 22:57:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:24.638 22:57:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:24.638 22:57:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:24.638 22:57:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:24.638 22:57:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:24.638 22:57:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:24.638 22:57:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:24.638 22:57:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:24.638 22:57:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:24.897 22:57:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:24.897 "name": "Existed_Raid", 00:28:24.897 "uuid": 
"f3447d1d-a87e-49e8-91f3-ac44b501ed4e", 00:28:24.897 "strip_size_kb": 0, 00:28:24.897 "state": "configuring", 00:28:24.897 "raid_level": "raid1", 00:28:24.897 "superblock": true, 00:28:24.897 "num_base_bdevs": 2, 00:28:24.897 "num_base_bdevs_discovered": 0, 00:28:24.897 "num_base_bdevs_operational": 2, 00:28:24.897 "base_bdevs_list": [ 00:28:24.897 { 00:28:24.897 "name": "BaseBdev1", 00:28:24.897 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:24.897 "is_configured": false, 00:28:24.897 "data_offset": 0, 00:28:24.897 "data_size": 0 00:28:24.897 }, 00:28:24.897 { 00:28:24.897 "name": "BaseBdev2", 00:28:24.897 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:24.897 "is_configured": false, 00:28:24.897 "data_offset": 0, 00:28:24.897 "data_size": 0 00:28:24.897 } 00:28:24.897 ] 00:28:24.897 }' 00:28:24.897 22:57:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:24.897 22:57:09 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:25.531 22:57:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:25.796 [2024-07-15 22:57:10.486753] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:25.796 [2024-07-15 22:57:10.486786] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a47a80 name Existed_Raid, state configuring 00:28:25.796 22:57:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:25.796 [2024-07-15 22:57:10.659223] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:25.796 [2024-07-15 22:57:10.659252] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:25.796 [2024-07-15 22:57:10.659263] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:25.796 [2024-07-15 22:57:10.659274] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:25.796 22:57:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:28:26.055 [2024-07-15 22:57:10.918376] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:26.055 BaseBdev1 00:28:26.055 22:57:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:28:26.055 22:57:10 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:28:26.055 22:57:10 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:26.055 22:57:10 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:28:26.055 22:57:10 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:26.055 22:57:10 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:26.055 22:57:10 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:26.313 22:57:11 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:28:26.572 [ 00:28:26.572 { 00:28:26.572 "name": 
"BaseBdev1", 00:28:26.572 "aliases": [ 00:28:26.572 "31566d59-d360-479f-b0fd-b882e4f13da6" 00:28:26.572 ], 00:28:26.572 "product_name": "Malloc disk", 00:28:26.572 "block_size": 4096, 00:28:26.572 "num_blocks": 8192, 00:28:26.572 "uuid": "31566d59-d360-479f-b0fd-b882e4f13da6", 00:28:26.572 "md_size": 32, 00:28:26.572 "md_interleave": false, 00:28:26.572 "dif_type": 0, 00:28:26.572 "assigned_rate_limits": { 00:28:26.572 "rw_ios_per_sec": 0, 00:28:26.572 "rw_mbytes_per_sec": 0, 00:28:26.572 "r_mbytes_per_sec": 0, 00:28:26.572 "w_mbytes_per_sec": 0 00:28:26.572 }, 00:28:26.572 "claimed": true, 00:28:26.572 "claim_type": "exclusive_write", 00:28:26.572 "zoned": false, 00:28:26.572 "supported_io_types": { 00:28:26.572 "read": true, 00:28:26.572 "write": true, 00:28:26.572 "unmap": true, 00:28:26.572 "flush": true, 00:28:26.572 "reset": true, 00:28:26.572 "nvme_admin": false, 00:28:26.572 "nvme_io": false, 00:28:26.572 "nvme_io_md": false, 00:28:26.572 "write_zeroes": true, 00:28:26.572 "zcopy": true, 00:28:26.572 "get_zone_info": false, 00:28:26.572 "zone_management": false, 00:28:26.572 "zone_append": false, 00:28:26.572 "compare": false, 00:28:26.572 "compare_and_write": false, 00:28:26.572 "abort": true, 00:28:26.572 "seek_hole": false, 00:28:26.572 "seek_data": false, 00:28:26.572 "copy": true, 00:28:26.572 "nvme_iov_md": false 00:28:26.572 }, 00:28:26.572 "memory_domains": [ 00:28:26.572 { 00:28:26.572 "dma_device_id": "system", 00:28:26.572 "dma_device_type": 1 00:28:26.572 }, 00:28:26.572 { 00:28:26.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:26.572 "dma_device_type": 2 00:28:26.572 } 00:28:26.572 ], 00:28:26.572 "driver_specific": {} 00:28:26.572 } 00:28:26.572 ] 00:28:26.572 22:57:11 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:28:26.572 22:57:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:26.572 
22:57:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:26.572 22:57:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:26.572 22:57:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:26.572 22:57:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:26.572 22:57:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:26.572 22:57:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:26.572 22:57:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:26.572 22:57:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:26.572 22:57:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:26.572 22:57:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:26.572 22:57:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:26.829 22:57:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:26.829 "name": "Existed_Raid", 00:28:26.829 "uuid": "fe63ab2b-e836-4ba2-b401-54b1a5558744", 00:28:26.829 "strip_size_kb": 0, 00:28:26.829 "state": "configuring", 00:28:26.829 "raid_level": "raid1", 00:28:26.829 "superblock": true, 00:28:26.829 "num_base_bdevs": 2, 00:28:26.829 "num_base_bdevs_discovered": 1, 00:28:26.829 "num_base_bdevs_operational": 2, 00:28:26.829 
"base_bdevs_list": [ 00:28:26.829 { 00:28:26.829 "name": "BaseBdev1", 00:28:26.829 "uuid": "31566d59-d360-479f-b0fd-b882e4f13da6", 00:28:26.829 "is_configured": true, 00:28:26.829 "data_offset": 256, 00:28:26.829 "data_size": 7936 00:28:26.829 }, 00:28:26.829 { 00:28:26.829 "name": "BaseBdev2", 00:28:26.829 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:26.829 "is_configured": false, 00:28:26.829 "data_offset": 0, 00:28:26.829 "data_size": 0 00:28:26.829 } 00:28:26.829 ] 00:28:26.829 }' 00:28:26.830 22:57:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:26.830 22:57:11 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:27.395 22:57:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:27.652 [2024-07-15 22:57:12.406338] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:27.652 [2024-07-15 22:57:12.406384] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a47350 name Existed_Raid, state configuring 00:28:27.652 22:57:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:27.910 [2024-07-15 22:57:12.651025] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:27.910 [2024-07-15 22:57:12.652451] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:27.910 [2024-07-15 22:57:12.652486] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:27.910 22:57:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:28:27.910 
22:57:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:27.910 22:57:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:27.910 22:57:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:27.910 22:57:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:27.910 22:57:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:27.910 22:57:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:27.910 22:57:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:27.910 22:57:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:27.910 22:57:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:27.910 22:57:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:27.910 22:57:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:27.910 22:57:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:27.910 22:57:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:28.168 22:57:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:28.168 "name": "Existed_Raid", 00:28:28.168 "uuid": 
"358d5b60-c61a-415b-b7fa-ba87d2df5d00", 00:28:28.168 "strip_size_kb": 0, 00:28:28.168 "state": "configuring", 00:28:28.168 "raid_level": "raid1", 00:28:28.168 "superblock": true, 00:28:28.168 "num_base_bdevs": 2, 00:28:28.168 "num_base_bdevs_discovered": 1, 00:28:28.168 "num_base_bdevs_operational": 2, 00:28:28.168 "base_bdevs_list": [ 00:28:28.168 { 00:28:28.168 "name": "BaseBdev1", 00:28:28.168 "uuid": "31566d59-d360-479f-b0fd-b882e4f13da6", 00:28:28.168 "is_configured": true, 00:28:28.168 "data_offset": 256, 00:28:28.168 "data_size": 7936 00:28:28.168 }, 00:28:28.168 { 00:28:28.168 "name": "BaseBdev2", 00:28:28.168 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:28.168 "is_configured": false, 00:28:28.168 "data_offset": 0, 00:28:28.168 "data_size": 0 00:28:28.168 } 00:28:28.168 ] 00:28:28.168 }' 00:28:28.168 22:57:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:28.168 22:57:12 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:28.734 22:57:13 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:28:28.993 [2024-07-15 22:57:13.827403] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:28.993 [2024-07-15 22:57:13.827572] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a49210 00:28:28.993 [2024-07-15 22:57:13.827586] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:28.993 [2024-07-15 22:57:13.827649] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a48c50 00:28:28.993 [2024-07-15 22:57:13.827750] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a49210 00:28:28.993 [2024-07-15 22:57:13.827760] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid 
bdev is created with name Existed_Raid, raid_bdev 0x1a49210 00:28:28.993 [2024-07-15 22:57:13.827833] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:28.993 BaseBdev2 00:28:28.993 22:57:13 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:28:28.993 22:57:13 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:28:28.993 22:57:13 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:28.993 22:57:13 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:28:28.993 22:57:13 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:28.993 22:57:13 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:28.993 22:57:13 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:29.251 22:57:14 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:28:29.509 [ 00:28:29.509 { 00:28:29.509 "name": "BaseBdev2", 00:28:29.509 "aliases": [ 00:28:29.509 "764549e5-3b2c-4665-9380-c1ea66f623cd" 00:28:29.509 ], 00:28:29.509 "product_name": "Malloc disk", 00:28:29.509 "block_size": 4096, 00:28:29.509 "num_blocks": 8192, 00:28:29.509 "uuid": "764549e5-3b2c-4665-9380-c1ea66f623cd", 00:28:29.509 "md_size": 32, 00:28:29.509 "md_interleave": false, 00:28:29.509 "dif_type": 0, 00:28:29.509 "assigned_rate_limits": { 00:28:29.509 "rw_ios_per_sec": 0, 00:28:29.509 "rw_mbytes_per_sec": 0, 00:28:29.509 "r_mbytes_per_sec": 0, 00:28:29.509 
"w_mbytes_per_sec": 0 00:28:29.509 }, 00:28:29.509 "claimed": true, 00:28:29.509 "claim_type": "exclusive_write", 00:28:29.509 "zoned": false, 00:28:29.509 "supported_io_types": { 00:28:29.509 "read": true, 00:28:29.509 "write": true, 00:28:29.509 "unmap": true, 00:28:29.509 "flush": true, 00:28:29.509 "reset": true, 00:28:29.509 "nvme_admin": false, 00:28:29.509 "nvme_io": false, 00:28:29.509 "nvme_io_md": false, 00:28:29.509 "write_zeroes": true, 00:28:29.509 "zcopy": true, 00:28:29.509 "get_zone_info": false, 00:28:29.509 "zone_management": false, 00:28:29.509 "zone_append": false, 00:28:29.509 "compare": false, 00:28:29.509 "compare_and_write": false, 00:28:29.509 "abort": true, 00:28:29.509 "seek_hole": false, 00:28:29.509 "seek_data": false, 00:28:29.509 "copy": true, 00:28:29.509 "nvme_iov_md": false 00:28:29.509 }, 00:28:29.509 "memory_domains": [ 00:28:29.509 { 00:28:29.509 "dma_device_id": "system", 00:28:29.509 "dma_device_type": 1 00:28:29.509 }, 00:28:29.509 { 00:28:29.509 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:29.509 "dma_device_type": 2 00:28:29.509 } 00:28:29.509 ], 00:28:29.509 "driver_specific": {} 00:28:29.509 } 00:28:29.509 ] 00:28:29.509 22:57:14 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:28:29.509 22:57:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:28:29.510 22:57:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:29.510 22:57:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:28:29.510 22:57:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:29.510 22:57:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:29.510 22:57:14 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:29.510 22:57:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:29.510 22:57:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:29.510 22:57:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:29.510 22:57:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:29.510 22:57:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:29.510 22:57:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:29.510 22:57:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:29.510 22:57:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:30.076 22:57:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:30.076 "name": "Existed_Raid", 00:28:30.076 "uuid": "358d5b60-c61a-415b-b7fa-ba87d2df5d00", 00:28:30.076 "strip_size_kb": 0, 00:28:30.076 "state": "online", 00:28:30.076 "raid_level": "raid1", 00:28:30.076 "superblock": true, 00:28:30.076 "num_base_bdevs": 2, 00:28:30.076 "num_base_bdevs_discovered": 2, 00:28:30.076 "num_base_bdevs_operational": 2, 00:28:30.076 "base_bdevs_list": [ 00:28:30.076 { 00:28:30.076 "name": "BaseBdev1", 00:28:30.076 "uuid": "31566d59-d360-479f-b0fd-b882e4f13da6", 00:28:30.076 "is_configured": true, 00:28:30.076 "data_offset": 256, 00:28:30.076 "data_size": 7936 00:28:30.076 }, 00:28:30.076 { 00:28:30.076 "name": 
"BaseBdev2", 00:28:30.076 "uuid": "764549e5-3b2c-4665-9380-c1ea66f623cd", 00:28:30.076 "is_configured": true, 00:28:30.076 "data_offset": 256, 00:28:30.076 "data_size": 7936 00:28:30.076 } 00:28:30.076 ] 00:28:30.076 }' 00:28:30.076 22:57:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:30.076 22:57:14 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:31.010 22:57:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:28:31.010 22:57:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:28:31.010 22:57:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:31.010 22:57:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:31.010 22:57:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:31.010 22:57:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:28:31.010 22:57:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:28:31.010 22:57:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:31.268 [2024-07-15 22:57:15.929304] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:31.268 22:57:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:31.268 "name": "Existed_Raid", 00:28:31.268 "aliases": [ 00:28:31.268 "358d5b60-c61a-415b-b7fa-ba87d2df5d00" 00:28:31.268 ], 00:28:31.268 "product_name": "Raid Volume", 00:28:31.268 "block_size": 4096, 
00:28:31.268 "num_blocks": 7936, 00:28:31.268 "uuid": "358d5b60-c61a-415b-b7fa-ba87d2df5d00", 00:28:31.268 "md_size": 32, 00:28:31.268 "md_interleave": false, 00:28:31.268 "dif_type": 0, 00:28:31.268 "assigned_rate_limits": { 00:28:31.268 "rw_ios_per_sec": 0, 00:28:31.268 "rw_mbytes_per_sec": 0, 00:28:31.268 "r_mbytes_per_sec": 0, 00:28:31.268 "w_mbytes_per_sec": 0 00:28:31.268 }, 00:28:31.268 "claimed": false, 00:28:31.268 "zoned": false, 00:28:31.268 "supported_io_types": { 00:28:31.268 "read": true, 00:28:31.268 "write": true, 00:28:31.268 "unmap": false, 00:28:31.268 "flush": false, 00:28:31.268 "reset": true, 00:28:31.268 "nvme_admin": false, 00:28:31.268 "nvme_io": false, 00:28:31.268 "nvme_io_md": false, 00:28:31.268 "write_zeroes": true, 00:28:31.268 "zcopy": false, 00:28:31.268 "get_zone_info": false, 00:28:31.268 "zone_management": false, 00:28:31.268 "zone_append": false, 00:28:31.268 "compare": false, 00:28:31.268 "compare_and_write": false, 00:28:31.268 "abort": false, 00:28:31.268 "seek_hole": false, 00:28:31.268 "seek_data": false, 00:28:31.268 "copy": false, 00:28:31.268 "nvme_iov_md": false 00:28:31.268 }, 00:28:31.269 "memory_domains": [ 00:28:31.269 { 00:28:31.269 "dma_device_id": "system", 00:28:31.269 "dma_device_type": 1 00:28:31.269 }, 00:28:31.269 { 00:28:31.269 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:31.269 "dma_device_type": 2 00:28:31.269 }, 00:28:31.269 { 00:28:31.269 "dma_device_id": "system", 00:28:31.269 "dma_device_type": 1 00:28:31.269 }, 00:28:31.269 { 00:28:31.269 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:31.269 "dma_device_type": 2 00:28:31.269 } 00:28:31.269 ], 00:28:31.269 "driver_specific": { 00:28:31.269 "raid": { 00:28:31.269 "uuid": "358d5b60-c61a-415b-b7fa-ba87d2df5d00", 00:28:31.269 "strip_size_kb": 0, 00:28:31.269 "state": "online", 00:28:31.269 "raid_level": "raid1", 00:28:31.269 "superblock": true, 00:28:31.269 "num_base_bdevs": 2, 00:28:31.269 "num_base_bdevs_discovered": 2, 00:28:31.269 
"num_base_bdevs_operational": 2, 00:28:31.269 "base_bdevs_list": [ 00:28:31.269 { 00:28:31.269 "name": "BaseBdev1", 00:28:31.269 "uuid": "31566d59-d360-479f-b0fd-b882e4f13da6", 00:28:31.269 "is_configured": true, 00:28:31.269 "data_offset": 256, 00:28:31.269 "data_size": 7936 00:28:31.269 }, 00:28:31.269 { 00:28:31.269 "name": "BaseBdev2", 00:28:31.269 "uuid": "764549e5-3b2c-4665-9380-c1ea66f623cd", 00:28:31.269 "is_configured": true, 00:28:31.269 "data_offset": 256, 00:28:31.269 "data_size": 7936 00:28:31.269 } 00:28:31.269 ] 00:28:31.269 } 00:28:31.269 } 00:28:31.269 }' 00:28:31.269 22:57:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:31.269 22:57:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:28:31.269 BaseBdev2' 00:28:31.269 22:57:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:31.269 22:57:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:28:31.269 22:57:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:31.526 22:57:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:31.526 "name": "BaseBdev1", 00:28:31.526 "aliases": [ 00:28:31.526 "31566d59-d360-479f-b0fd-b882e4f13da6" 00:28:31.526 ], 00:28:31.526 "product_name": "Malloc disk", 00:28:31.526 "block_size": 4096, 00:28:31.526 "num_blocks": 8192, 00:28:31.526 "uuid": "31566d59-d360-479f-b0fd-b882e4f13da6", 00:28:31.526 "md_size": 32, 00:28:31.526 "md_interleave": false, 00:28:31.526 "dif_type": 0, 00:28:31.526 "assigned_rate_limits": { 00:28:31.526 "rw_ios_per_sec": 0, 00:28:31.526 
"rw_mbytes_per_sec": 0, 00:28:31.526 "r_mbytes_per_sec": 0, 00:28:31.526 "w_mbytes_per_sec": 0 00:28:31.526 }, 00:28:31.526 "claimed": true, 00:28:31.526 "claim_type": "exclusive_write", 00:28:31.526 "zoned": false, 00:28:31.526 "supported_io_types": { 00:28:31.526 "read": true, 00:28:31.526 "write": true, 00:28:31.526 "unmap": true, 00:28:31.526 "flush": true, 00:28:31.526 "reset": true, 00:28:31.526 "nvme_admin": false, 00:28:31.526 "nvme_io": false, 00:28:31.526 "nvme_io_md": false, 00:28:31.526 "write_zeroes": true, 00:28:31.526 "zcopy": true, 00:28:31.526 "get_zone_info": false, 00:28:31.526 "zone_management": false, 00:28:31.526 "zone_append": false, 00:28:31.526 "compare": false, 00:28:31.526 "compare_and_write": false, 00:28:31.526 "abort": true, 00:28:31.526 "seek_hole": false, 00:28:31.526 "seek_data": false, 00:28:31.526 "copy": true, 00:28:31.526 "nvme_iov_md": false 00:28:31.526 }, 00:28:31.526 "memory_domains": [ 00:28:31.526 { 00:28:31.526 "dma_device_id": "system", 00:28:31.526 "dma_device_type": 1 00:28:31.526 }, 00:28:31.526 { 00:28:31.526 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:31.526 "dma_device_type": 2 00:28:31.526 } 00:28:31.526 ], 00:28:31.526 "driver_specific": {} 00:28:31.526 }' 00:28:31.526 22:57:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:31.526 22:57:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:31.526 22:57:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:31.526 22:57:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:31.783 22:57:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:31.783 22:57:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:31.783 22:57:16 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:31.783 22:57:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:31.783 22:57:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:31.783 22:57:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:32.041 22:57:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:32.041 22:57:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:32.041 22:57:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:32.041 22:57:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:28:32.041 22:57:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:32.607 22:57:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:32.607 "name": "BaseBdev2", 00:28:32.607 "aliases": [ 00:28:32.607 "764549e5-3b2c-4665-9380-c1ea66f623cd" 00:28:32.607 ], 00:28:32.607 "product_name": "Malloc disk", 00:28:32.607 "block_size": 4096, 00:28:32.607 "num_blocks": 8192, 00:28:32.607 "uuid": "764549e5-3b2c-4665-9380-c1ea66f623cd", 00:28:32.607 "md_size": 32, 00:28:32.607 "md_interleave": false, 00:28:32.607 "dif_type": 0, 00:28:32.607 "assigned_rate_limits": { 00:28:32.607 "rw_ios_per_sec": 0, 00:28:32.607 "rw_mbytes_per_sec": 0, 00:28:32.607 "r_mbytes_per_sec": 0, 00:28:32.607 "w_mbytes_per_sec": 0 00:28:32.607 }, 00:28:32.607 "claimed": true, 00:28:32.607 "claim_type": "exclusive_write", 00:28:32.607 "zoned": false, 00:28:32.607 "supported_io_types": { 
00:28:32.607 "read": true, 00:28:32.607 "write": true, 00:28:32.607 "unmap": true, 00:28:32.607 "flush": true, 00:28:32.607 "reset": true, 00:28:32.607 "nvme_admin": false, 00:28:32.607 "nvme_io": false, 00:28:32.607 "nvme_io_md": false, 00:28:32.607 "write_zeroes": true, 00:28:32.607 "zcopy": true, 00:28:32.607 "get_zone_info": false, 00:28:32.607 "zone_management": false, 00:28:32.607 "zone_append": false, 00:28:32.607 "compare": false, 00:28:32.607 "compare_and_write": false, 00:28:32.607 "abort": true, 00:28:32.607 "seek_hole": false, 00:28:32.607 "seek_data": false, 00:28:32.607 "copy": true, 00:28:32.607 "nvme_iov_md": false 00:28:32.607 }, 00:28:32.607 "memory_domains": [ 00:28:32.607 { 00:28:32.607 "dma_device_id": "system", 00:28:32.607 "dma_device_type": 1 00:28:32.607 }, 00:28:32.607 { 00:28:32.607 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:32.607 "dma_device_type": 2 00:28:32.607 } 00:28:32.607 ], 00:28:32.607 "driver_specific": {} 00:28:32.607 }' 00:28:32.607 22:57:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:32.607 22:57:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:32.607 22:57:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:32.607 22:57:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:32.607 22:57:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:32.866 22:57:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:32.866 22:57:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:32.866 22:57:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:32.866 22:57:17 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:32.866 22:57:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:32.866 22:57:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:32.866 22:57:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:32.866 22:57:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:28:33.433 [2024-07-15 22:57:18.179100] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:33.433 22:57:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:28:33.433 22:57:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:28:33.433 22:57:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:33.433 22:57:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:28:33.433 22:57:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:28:33.433 22:57:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:28:33.433 22:57:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:33.433 22:57:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:33.433 22:57:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:33.433 22:57:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:33.433 
22:57:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:33.433 22:57:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:33.433 22:57:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:33.433 22:57:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:33.433 22:57:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:33.433 22:57:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:33.433 22:57:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:33.999 22:57:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:33.999 "name": "Existed_Raid", 00:28:33.999 "uuid": "358d5b60-c61a-415b-b7fa-ba87d2df5d00", 00:28:33.999 "strip_size_kb": 0, 00:28:33.999 "state": "online", 00:28:33.999 "raid_level": "raid1", 00:28:33.999 "superblock": true, 00:28:33.999 "num_base_bdevs": 2, 00:28:33.999 "num_base_bdevs_discovered": 1, 00:28:33.999 "num_base_bdevs_operational": 1, 00:28:33.999 "base_bdevs_list": [ 00:28:33.999 { 00:28:33.999 "name": null, 00:28:33.999 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:33.999 "is_configured": false, 00:28:33.999 "data_offset": 256, 00:28:33.999 "data_size": 7936 00:28:33.999 }, 00:28:33.999 { 00:28:33.999 "name": "BaseBdev2", 00:28:33.999 "uuid": "764549e5-3b2c-4665-9380-c1ea66f623cd", 00:28:33.999 "is_configured": true, 00:28:33.999 "data_offset": 256, 00:28:33.999 "data_size": 7936 00:28:33.999 } 00:28:33.999 ] 00:28:33.999 }' 00:28:33.999 22:57:18 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:33.999 22:57:18 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:34.937 22:57:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:28:34.937 22:57:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:34.937 22:57:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:28:34.937 22:57:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:35.197 22:57:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:28:35.197 22:57:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:28:35.197 22:57:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:28:35.460 [2024-07-15 22:57:20.349469] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:28:35.460 [2024-07-15 22:57:20.349570] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:35.460 [2024-07-15 22:57:20.363149] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:35.460 [2024-07-15 22:57:20.363185] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:35.460 [2024-07-15 22:57:20.363198] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a49210 name Existed_Raid, state offline 00:28:35.719 22:57:20 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:28:35.719 22:57:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:35.719 22:57:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:35.719 22:57:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:28:35.977 22:57:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:28:35.977 22:57:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:28:35.977 22:57:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:28:35.977 22:57:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 2849809 00:28:35.977 22:57:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 2849809 ']' 00:28:35.977 22:57:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 2849809 00:28:35.977 22:57:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:28:35.977 22:57:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:35.977 22:57:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2849809 00:28:35.977 22:57:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:35.977 22:57:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:35.977 22:57:20 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2849809' 00:28:35.977 killing process with pid 2849809 00:28:35.977 22:57:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 2849809 00:28:35.977 [2024-07-15 22:57:20.693569] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:35.977 22:57:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 2849809 00:28:35.977 [2024-07-15 22:57:20.694556] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:36.237 22:57:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:28:36.237 00:28:36.237 real 0m12.657s 00:28:36.237 user 0m22.655s 00:28:36.237 sys 0m2.237s 00:28:36.237 22:57:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:36.237 22:57:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:36.237 ************************************ 00:28:36.237 END TEST raid_state_function_test_sb_md_separate 00:28:36.237 ************************************ 00:28:36.237 22:57:20 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:36.237 22:57:20 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:28:36.237 22:57:20 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:28:36.237 22:57:20 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:36.237 22:57:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:36.237 ************************************ 00:28:36.237 START TEST raid_superblock_test_md_separate 00:28:36.237 ************************************ 00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 
00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # raid_pid=2851615 00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 2851615 
/var/tmp/spdk-raid.sock 00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@829 -- # '[' -z 2851615 ']' 00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:36.237 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:36.237 22:57:21 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:36.237 [2024-07-15 22:57:21.076250] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:28:36.237 [2024-07-15 22:57:21.076327] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2851615 ] 00:28:36.496 [2024-07-15 22:57:21.207997] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:36.496 [2024-07-15 22:57:21.309101] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:36.496 [2024-07-15 22:57:21.370163] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:36.496 [2024-07-15 22:57:21.370198] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:37.065 22:57:21 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:37.065 22:57:21 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@862 -- # return 0 00:28:37.065 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:28:37.065 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:37.065 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:28:37.065 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:28:37.065 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:28:37.065 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:37.065 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:37.065 22:57:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:37.065 22:57:21 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:28:37.324 malloc1 00:28:37.324 22:57:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:37.583 [2024-07-15 22:57:22.281494] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:37.583 [2024-07-15 22:57:22.281543] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:37.583 [2024-07-15 22:57:22.281562] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x180b830 00:28:37.583 [2024-07-15 22:57:22.281575] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:37.583 [2024-07-15 22:57:22.282993] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:37.583 [2024-07-15 22:57:22.283020] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:37.583 pt1 00:28:37.583 22:57:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:37.583 22:57:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:37.583 22:57:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:28:37.583 22:57:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:28:37.583 22:57:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:28:37.583 22:57:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:37.583 22:57:22 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:37.583 22:57:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:37.583 22:57:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:28:37.583 malloc2 00:28:37.583 22:57:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:37.842 [2024-07-15 22:57:22.708041] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:37.842 [2024-07-15 22:57:22.708087] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:37.842 [2024-07-15 22:57:22.708106] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17fd250 00:28:37.842 [2024-07-15 22:57:22.708119] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:37.842 [2024-07-15 22:57:22.709368] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:37.842 [2024-07-15 22:57:22.709394] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:37.842 pt2 00:28:37.842 22:57:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:37.842 22:57:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:37.842 22:57:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:28:38.101 [2024-07-15 22:57:22.956710] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:38.101 [2024-07-15 22:57:22.957913] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:38.101 [2024-07-15 22:57:22.958060] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17fdd20 00:28:38.101 [2024-07-15 22:57:22.958073] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:38.101 [2024-07-15 22:57:22.958135] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17f1a60 00:28:38.101 [2024-07-15 22:57:22.958243] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17fdd20 00:28:38.101 [2024-07-15 22:57:22.958253] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17fdd20 00:28:38.101 [2024-07-15 22:57:22.958318] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:38.101 22:57:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:38.101 22:57:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:38.101 22:57:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:38.101 22:57:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:38.101 22:57:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:38.101 22:57:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:38.101 22:57:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:38.101 22:57:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:38.101 22:57:22 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:38.101 22:57:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:38.101 22:57:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:38.101 22:57:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:38.360 22:57:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:38.360 "name": "raid_bdev1", 00:28:38.360 "uuid": "ec82a565-351c-497e-9eb7-458dd6f57e28", 00:28:38.360 "strip_size_kb": 0, 00:28:38.360 "state": "online", 00:28:38.360 "raid_level": "raid1", 00:28:38.360 "superblock": true, 00:28:38.360 "num_base_bdevs": 2, 00:28:38.360 "num_base_bdevs_discovered": 2, 00:28:38.360 "num_base_bdevs_operational": 2, 00:28:38.360 "base_bdevs_list": [ 00:28:38.360 { 00:28:38.360 "name": "pt1", 00:28:38.360 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:38.360 "is_configured": true, 00:28:38.360 "data_offset": 256, 00:28:38.360 "data_size": 7936 00:28:38.360 }, 00:28:38.360 { 00:28:38.360 "name": "pt2", 00:28:38.360 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:38.360 "is_configured": true, 00:28:38.360 "data_offset": 256, 00:28:38.360 "data_size": 7936 00:28:38.360 } 00:28:38.360 ] 00:28:38.360 }' 00:28:38.360 22:57:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:38.360 22:57:23 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:39.297 22:57:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:28:39.297 22:57:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=raid_bdev1 00:28:39.297 22:57:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:39.297 22:57:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:39.297 22:57:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:39.297 22:57:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:28:39.297 22:57:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:39.297 22:57:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:39.297 [2024-07-15 22:57:24.063882] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:39.297 22:57:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:39.297 "name": "raid_bdev1", 00:28:39.297 "aliases": [ 00:28:39.297 "ec82a565-351c-497e-9eb7-458dd6f57e28" 00:28:39.297 ], 00:28:39.297 "product_name": "Raid Volume", 00:28:39.297 "block_size": 4096, 00:28:39.297 "num_blocks": 7936, 00:28:39.297 "uuid": "ec82a565-351c-497e-9eb7-458dd6f57e28", 00:28:39.297 "md_size": 32, 00:28:39.297 "md_interleave": false, 00:28:39.297 "dif_type": 0, 00:28:39.297 "assigned_rate_limits": { 00:28:39.297 "rw_ios_per_sec": 0, 00:28:39.297 "rw_mbytes_per_sec": 0, 00:28:39.297 "r_mbytes_per_sec": 0, 00:28:39.297 "w_mbytes_per_sec": 0 00:28:39.297 }, 00:28:39.297 "claimed": false, 00:28:39.297 "zoned": false, 00:28:39.297 "supported_io_types": { 00:28:39.297 "read": true, 00:28:39.297 "write": true, 00:28:39.297 "unmap": false, 00:28:39.297 "flush": false, 00:28:39.297 "reset": true, 00:28:39.297 "nvme_admin": false, 00:28:39.297 "nvme_io": false, 00:28:39.297 "nvme_io_md": false, 00:28:39.297 "write_zeroes": true, 
00:28:39.297 "zcopy": false, 00:28:39.297 "get_zone_info": false, 00:28:39.297 "zone_management": false, 00:28:39.297 "zone_append": false, 00:28:39.297 "compare": false, 00:28:39.297 "compare_and_write": false, 00:28:39.297 "abort": false, 00:28:39.297 "seek_hole": false, 00:28:39.297 "seek_data": false, 00:28:39.297 "copy": false, 00:28:39.297 "nvme_iov_md": false 00:28:39.297 }, 00:28:39.297 "memory_domains": [ 00:28:39.297 { 00:28:39.297 "dma_device_id": "system", 00:28:39.297 "dma_device_type": 1 00:28:39.297 }, 00:28:39.297 { 00:28:39.297 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:39.297 "dma_device_type": 2 00:28:39.297 }, 00:28:39.297 { 00:28:39.297 "dma_device_id": "system", 00:28:39.297 "dma_device_type": 1 00:28:39.297 }, 00:28:39.297 { 00:28:39.297 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:39.297 "dma_device_type": 2 00:28:39.297 } 00:28:39.297 ], 00:28:39.297 "driver_specific": { 00:28:39.297 "raid": { 00:28:39.297 "uuid": "ec82a565-351c-497e-9eb7-458dd6f57e28", 00:28:39.297 "strip_size_kb": 0, 00:28:39.297 "state": "online", 00:28:39.297 "raid_level": "raid1", 00:28:39.297 "superblock": true, 00:28:39.297 "num_base_bdevs": 2, 00:28:39.297 "num_base_bdevs_discovered": 2, 00:28:39.297 "num_base_bdevs_operational": 2, 00:28:39.297 "base_bdevs_list": [ 00:28:39.297 { 00:28:39.297 "name": "pt1", 00:28:39.297 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:39.297 "is_configured": true, 00:28:39.297 "data_offset": 256, 00:28:39.297 "data_size": 7936 00:28:39.297 }, 00:28:39.297 { 00:28:39.297 "name": "pt2", 00:28:39.297 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:39.297 "is_configured": true, 00:28:39.297 "data_offset": 256, 00:28:39.297 "data_size": 7936 00:28:39.297 } 00:28:39.297 ] 00:28:39.297 } 00:28:39.297 } 00:28:39.297 }' 00:28:39.297 22:57:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:39.297 22:57:24 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:39.297 pt2' 00:28:39.297 22:57:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:39.297 22:57:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:39.297 22:57:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:39.556 22:57:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:39.556 "name": "pt1", 00:28:39.556 "aliases": [ 00:28:39.556 "00000000-0000-0000-0000-000000000001" 00:28:39.556 ], 00:28:39.556 "product_name": "passthru", 00:28:39.556 "block_size": 4096, 00:28:39.556 "num_blocks": 8192, 00:28:39.556 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:39.556 "md_size": 32, 00:28:39.556 "md_interleave": false, 00:28:39.556 "dif_type": 0, 00:28:39.556 "assigned_rate_limits": { 00:28:39.556 "rw_ios_per_sec": 0, 00:28:39.556 "rw_mbytes_per_sec": 0, 00:28:39.556 "r_mbytes_per_sec": 0, 00:28:39.556 "w_mbytes_per_sec": 0 00:28:39.556 }, 00:28:39.556 "claimed": true, 00:28:39.556 "claim_type": "exclusive_write", 00:28:39.556 "zoned": false, 00:28:39.556 "supported_io_types": { 00:28:39.556 "read": true, 00:28:39.556 "write": true, 00:28:39.556 "unmap": true, 00:28:39.556 "flush": true, 00:28:39.556 "reset": true, 00:28:39.556 "nvme_admin": false, 00:28:39.556 "nvme_io": false, 00:28:39.556 "nvme_io_md": false, 00:28:39.556 "write_zeroes": true, 00:28:39.556 "zcopy": true, 00:28:39.556 "get_zone_info": false, 00:28:39.556 "zone_management": false, 00:28:39.556 "zone_append": false, 00:28:39.556 "compare": false, 00:28:39.556 "compare_and_write": false, 00:28:39.556 "abort": true, 00:28:39.556 "seek_hole": false, 00:28:39.556 "seek_data": false, 00:28:39.556 "copy": true, 00:28:39.556 
"nvme_iov_md": false 00:28:39.556 }, 00:28:39.556 "memory_domains": [ 00:28:39.556 { 00:28:39.556 "dma_device_id": "system", 00:28:39.556 "dma_device_type": 1 00:28:39.556 }, 00:28:39.556 { 00:28:39.556 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:39.556 "dma_device_type": 2 00:28:39.556 } 00:28:39.556 ], 00:28:39.556 "driver_specific": { 00:28:39.556 "passthru": { 00:28:39.556 "name": "pt1", 00:28:39.556 "base_bdev_name": "malloc1" 00:28:39.556 } 00:28:39.556 } 00:28:39.556 }' 00:28:39.556 22:57:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:39.556 22:57:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:39.884 22:57:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:39.884 22:57:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:39.884 22:57:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:39.884 22:57:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:39.884 22:57:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:39.884 22:57:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:39.884 22:57:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:39.884 22:57:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:39.884 22:57:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:39.884 22:57:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:39.884 22:57:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:39.884 22:57:24 bdev_raid.raid_superblock_test_md_separate 
-- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:39.884 22:57:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:40.143 22:57:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:40.143 "name": "pt2", 00:28:40.143 "aliases": [ 00:28:40.143 "00000000-0000-0000-0000-000000000002" 00:28:40.143 ], 00:28:40.143 "product_name": "passthru", 00:28:40.143 "block_size": 4096, 00:28:40.143 "num_blocks": 8192, 00:28:40.143 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:40.143 "md_size": 32, 00:28:40.143 "md_interleave": false, 00:28:40.143 "dif_type": 0, 00:28:40.143 "assigned_rate_limits": { 00:28:40.143 "rw_ios_per_sec": 0, 00:28:40.143 "rw_mbytes_per_sec": 0, 00:28:40.143 "r_mbytes_per_sec": 0, 00:28:40.143 "w_mbytes_per_sec": 0 00:28:40.143 }, 00:28:40.143 "claimed": true, 00:28:40.143 "claim_type": "exclusive_write", 00:28:40.143 "zoned": false, 00:28:40.143 "supported_io_types": { 00:28:40.143 "read": true, 00:28:40.143 "write": true, 00:28:40.143 "unmap": true, 00:28:40.143 "flush": true, 00:28:40.143 "reset": true, 00:28:40.143 "nvme_admin": false, 00:28:40.143 "nvme_io": false, 00:28:40.143 "nvme_io_md": false, 00:28:40.143 "write_zeroes": true, 00:28:40.143 "zcopy": true, 00:28:40.143 "get_zone_info": false, 00:28:40.143 "zone_management": false, 00:28:40.143 "zone_append": false, 00:28:40.143 "compare": false, 00:28:40.143 "compare_and_write": false, 00:28:40.143 "abort": true, 00:28:40.143 "seek_hole": false, 00:28:40.143 "seek_data": false, 00:28:40.143 "copy": true, 00:28:40.143 "nvme_iov_md": false 00:28:40.143 }, 00:28:40.143 "memory_domains": [ 00:28:40.143 { 00:28:40.143 "dma_device_id": "system", 00:28:40.143 "dma_device_type": 1 00:28:40.143 }, 00:28:40.143 { 00:28:40.143 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:40.143 "dma_device_type": 2 00:28:40.143 } 
00:28:40.143 ], 00:28:40.143 "driver_specific": { 00:28:40.143 "passthru": { 00:28:40.143 "name": "pt2", 00:28:40.143 "base_bdev_name": "malloc2" 00:28:40.143 } 00:28:40.143 } 00:28:40.143 }' 00:28:40.143 22:57:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:40.143 22:57:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:40.402 22:57:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:40.402 22:57:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:40.402 22:57:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:40.402 22:57:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:40.402 22:57:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:40.402 22:57:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:40.402 22:57:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:40.402 22:57:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:40.402 22:57:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:40.661 22:57:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:40.661 22:57:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:40.661 22:57:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:28:40.661 [2024-07-15 22:57:25.555897] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:40.920 22:57:25 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=ec82a565-351c-497e-9eb7-458dd6f57e28 00:28:40.920 22:57:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z ec82a565-351c-497e-9eb7-458dd6f57e28 ']' 00:28:40.920 22:57:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:40.920 [2024-07-15 22:57:25.796267] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:40.920 [2024-07-15 22:57:25.796286] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:40.920 [2024-07-15 22:57:25.796341] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:40.920 [2024-07-15 22:57:25.796395] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:40.920 [2024-07-15 22:57:25.796407] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17fdd20 name raid_bdev1, state offline 00:28:40.920 22:57:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:40.920 22:57:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:28:41.179 22:57:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:28:41.179 22:57:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:28:41.179 22:57:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:41.179 22:57:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:28:41.438 22:57:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:41.438 22:57:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:42.006 22:57:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:28:42.006 22:57:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:28:42.265 22:57:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:28:42.265 22:57:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:42.265 22:57:27 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:28:42.265 22:57:27 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:42.265 22:57:27 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:42.265 22:57:27 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:42.265 22:57:27 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:42.265 22:57:27 
bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:42.265 22:57:27 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:42.265 22:57:27 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:42.265 22:57:27 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:42.266 22:57:27 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:42.266 22:57:27 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:42.525 [2024-07-15 22:57:27.288168] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:28:42.525 [2024-07-15 22:57:27.289578] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:28:42.525 [2024-07-15 22:57:27.289641] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:28:42.525 [2024-07-15 22:57:27.289685] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:28:42.525 [2024-07-15 22:57:27.289703] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:42.525 [2024-07-15 22:57:27.289713] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x166ded0 name raid_bdev1, state configuring 00:28:42.525 request: 00:28:42.525 { 00:28:42.525 "name": "raid_bdev1", 00:28:42.525 "raid_level": "raid1", 00:28:42.525 "base_bdevs": [ 
00:28:42.525 "malloc1",
00:28:42.525 "malloc2"
00:28:42.525 ],
00:28:42.525 "superblock": false,
00:28:42.525 "method": "bdev_raid_create",
00:28:42.525 "req_id": 1
00:28:42.525 }
00:28:42.525 Got JSON-RPC error response
00:28:42.525 response:
00:28:42.525 {
00:28:42.525 "code": -17,
00:28:42.525 "message": "Failed to create RAID bdev raid_bdev1: File exists"
00:28:42.525 }
00:28:42.525 22:57:27 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # es=1
00:28:42.525 22:57:27 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 ))
00:28:42.525 22:57:27 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]]
00:28:42.525 22:57:27 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 ))
00:28:42.525 22:57:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:42.525 22:57:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]'
00:28:42.784 22:57:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev=
00:28:42.784 22:57:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']'
00:28:42.784 22:57:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
00:28:43.043 [2024-07-15 22:57:27.773380] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1
00:28:43.043 [2024-07-15 22:57:27.773423] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:28:43.043 [2024-07-15 22:57:27.773442] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x180bee0
00:28:43.043 [2024-07-15 22:57:27.773455] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:28:43.043 [2024-07-15 22:57:27.774901] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:28:43.043 [2024-07-15 22:57:27.774936] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1
00:28:43.043 [2024-07-15 22:57:27.774985] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1
00:28:43.043 [2024-07-15 22:57:27.775012] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed
00:28:43.043 pt1
00:28:43.043 22:57:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2
00:28:43.043 22:57:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:28:43.043 22:57:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:28:43.043 22:57:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:28:43.043 22:57:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:28:43.043 22:57:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:28:43.043 22:57:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:28:43.043 22:57:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:28:43.043 22:57:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:28:43.043 22:57:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp
00:28:43.043 22:57:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:43.043 22:57:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:43.302 22:57:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:28:43.302 "name": "raid_bdev1",
00:28:43.302 "uuid": "ec82a565-351c-497e-9eb7-458dd6f57e28",
00:28:43.302 "strip_size_kb": 0,
00:28:43.302 "state": "configuring",
00:28:43.302 "raid_level": "raid1",
00:28:43.302 "superblock": true,
00:28:43.302 "num_base_bdevs": 2,
00:28:43.302 "num_base_bdevs_discovered": 1,
00:28:43.302 "num_base_bdevs_operational": 2,
00:28:43.302 "base_bdevs_list": [
00:28:43.302 {
00:28:43.302 "name": "pt1",
00:28:43.302 "uuid": "00000000-0000-0000-0000-000000000001",
00:28:43.302 "is_configured": true,
00:28:43.302 "data_offset": 256,
00:28:43.302 "data_size": 7936
00:28:43.302 },
00:28:43.302 {
00:28:43.302 "name": null,
00:28:43.302 "uuid": "00000000-0000-0000-0000-000000000002",
00:28:43.302 "is_configured": false,
00:28:43.302 "data_offset": 256,
00:28:43.302 "data_size": 7936
00:28:43.302 }
00:28:43.302 ]
00:28:43.302 }'
00:28:43.302 22:57:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:28:43.302 22:57:28 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x
00:28:43.870 22:57:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']'
00:28:43.870 22:57:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 ))
00:28:43.870 22:57:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs ))
00:28:43.870 22:57:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
00:28:44.130 [2024-07-15 22:57:28.864297] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2
00:28:44.130 [2024-07-15 22:57:28.864354] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:28:44.130 [2024-07-15 22:57:28.864374] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x166e490
00:28:44.130 [2024-07-15 22:57:28.864387] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:28:44.130 [2024-07-15 22:57:28.864604] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:28:44.130 [2024-07-15 22:57:28.864621] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2
00:28:44.130 [2024-07-15 22:57:28.864671] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2
00:28:44.130 [2024-07-15 22:57:28.864691] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed
00:28:44.130 [2024-07-15 22:57:28.864786] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17f25d0
00:28:44.130 [2024-07-15 22:57:28.864797] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096
00:28:44.130 [2024-07-15 22:57:28.864851] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17f3800
00:28:44.130 [2024-07-15 22:57:28.864964] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17f25d0
00:28:44.130 [2024-07-15 22:57:28.864975] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17f25d0
00:28:44.130 [2024-07-15 22:57:28.865048] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:28:44.130 pt2
00:28:44.130 22:57:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ ))
00:28:44.130 22:57:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs ))
00:28:44.130 22:57:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2
00:28:44.130 22:57:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:28:44.130 22:57:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:28:44.130 22:57:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:28:44.130 22:57:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:28:44.130 22:57:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:28:44.130 22:57:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:28:44.130 22:57:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:28:44.130 22:57:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:28:44.130 22:57:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp
00:28:44.130 22:57:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:44.130 22:57:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:44.389 22:57:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:28:44.389 "name": "raid_bdev1",
00:28:44.389 "uuid": "ec82a565-351c-497e-9eb7-458dd6f57e28",
00:28:44.389 "strip_size_kb": 0,
00:28:44.389 "state": "online",
00:28:44.389 "raid_level": "raid1",
00:28:44.389 "superblock": true,
00:28:44.389 "num_base_bdevs": 2,
00:28:44.389 "num_base_bdevs_discovered": 2,
00:28:44.389 "num_base_bdevs_operational": 2,
00:28:44.389 "base_bdevs_list": [
00:28:44.389 {
00:28:44.389 "name": "pt1",
00:28:44.389 "uuid": "00000000-0000-0000-0000-000000000001",
00:28:44.389 "is_configured": true,
00:28:44.389 "data_offset": 256,
00:28:44.389 "data_size": 7936
00:28:44.389 },
00:28:44.389 {
00:28:44.389 "name": "pt2",
00:28:44.389 "uuid": "00000000-0000-0000-0000-000000000002",
00:28:44.389 "is_configured": true,
00:28:44.389 "data_offset": 256,
00:28:44.389 "data_size": 7936
00:28:44.389 }
00:28:44.389 ]
00:28:44.389 }'
00:28:44.389 22:57:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:28:44.389 22:57:29 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x
00:28:44.956 22:57:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1
00:28:44.956 22:57:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1
00:28:44.956 22:57:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info
00:28:44.956 22:57:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info
00:28:44.957 22:57:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names
00:28:44.957 22:57:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name
00:28:44.957 22:57:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:28:44.957 22:57:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]'
00:28:45.216 [2024-07-15 22:57:29.959464] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:28:45.216 22:57:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{
00:28:45.216 "name": "raid_bdev1",
00:28:45.216 "aliases": [
00:28:45.216 "ec82a565-351c-497e-9eb7-458dd6f57e28"
00:28:45.216 ],
00:28:45.216 "product_name": "Raid Volume",
00:28:45.216 "block_size": 4096,
00:28:45.216 "num_blocks": 7936,
00:28:45.216 "uuid": "ec82a565-351c-497e-9eb7-458dd6f57e28",
00:28:45.216 "md_size": 32,
00:28:45.216 "md_interleave": false,
00:28:45.216 "dif_type": 0,
00:28:45.216 "assigned_rate_limits": {
00:28:45.216 "rw_ios_per_sec": 0,
00:28:45.216 "rw_mbytes_per_sec": 0,
00:28:45.216 "r_mbytes_per_sec": 0,
00:28:45.216 "w_mbytes_per_sec": 0
00:28:45.216 },
00:28:45.216 "claimed": false,
00:28:45.216 "zoned": false,
00:28:45.216 "supported_io_types": {
00:28:45.216 "read": true,
00:28:45.216 "write": true,
00:28:45.216 "unmap": false,
00:28:45.216 "flush": false,
00:28:45.216 "reset": true,
00:28:45.216 "nvme_admin": false,
00:28:45.216 "nvme_io": false,
00:28:45.216 "nvme_io_md": false,
00:28:45.216 "write_zeroes": true,
00:28:45.216 "zcopy": false,
00:28:45.216 "get_zone_info": false,
00:28:45.216 "zone_management": false,
00:28:45.216 "zone_append": false,
00:28:45.216 "compare": false,
00:28:45.216 "compare_and_write": false,
00:28:45.216 "abort": false,
00:28:45.216 "seek_hole": false,
00:28:45.216 "seek_data": false,
00:28:45.216 "copy": false,
00:28:45.216 "nvme_iov_md": false
00:28:45.216 },
00:28:45.216 "memory_domains": [
00:28:45.216 {
00:28:45.216 "dma_device_id": "system",
00:28:45.216 "dma_device_type": 1
00:28:45.216 },
00:28:45.216 {
00:28:45.216 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:28:45.216 "dma_device_type": 2
00:28:45.216 },
00:28:45.216 {
00:28:45.216 "dma_device_id": "system",
00:28:45.216 "dma_device_type": 1
00:28:45.216 },
00:28:45.216 {
00:28:45.216 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:28:45.216 "dma_device_type": 2
00:28:45.216 }
00:28:45.216 ],
00:28:45.216 "driver_specific": {
00:28:45.216 "raid": {
00:28:45.216 "uuid": "ec82a565-351c-497e-9eb7-458dd6f57e28",
00:28:45.216 "strip_size_kb": 0,
00:28:45.216 "state": "online",
00:28:45.216 "raid_level": "raid1",
00:28:45.216 "superblock": true,
00:28:45.216 "num_base_bdevs": 2,
00:28:45.216 "num_base_bdevs_discovered": 2,
00:28:45.216 "num_base_bdevs_operational": 2,
00:28:45.216 "base_bdevs_list": [
00:28:45.216 {
00:28:45.216 "name": "pt1",
00:28:45.216 "uuid": "00000000-0000-0000-0000-000000000001",
00:28:45.216 "is_configured": true,
00:28:45.216 "data_offset": 256,
00:28:45.216 "data_size": 7936
00:28:45.216 },
00:28:45.216 {
00:28:45.216 "name": "pt2",
00:28:45.216 "uuid": "00000000-0000-0000-0000-000000000002",
00:28:45.216 "is_configured": true,
00:28:45.216 "data_offset": 256,
00:28:45.216 "data_size": 7936
00:28:45.216 }
00:28:45.216 ]
00:28:45.216 }
00:28:45.216 }
00:28:45.216 }'
00:28:45.216 22:57:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'
00:28:45.216 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1
00:28:45.216 pt2'
00:28:45.216 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:28:45.216 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1
00:28:45.216 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:28:45.474 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:28:45.474 "name": "pt1",
00:28:45.474 "aliases": [
00:28:45.474 "00000000-0000-0000-0000-000000000001"
00:28:45.474 ],
00:28:45.474 "product_name": "passthru",
00:28:45.474 "block_size": 4096,
00:28:45.474 "num_blocks": 8192,
00:28:45.474 "uuid": "00000000-0000-0000-0000-000000000001",
00:28:45.474 "md_size": 32,
00:28:45.474 "md_interleave": false,
00:28:45.474 "dif_type": 0,
00:28:45.474 "assigned_rate_limits": {
00:28:45.474 "rw_ios_per_sec": 0,
00:28:45.474 "rw_mbytes_per_sec": 0,
00:28:45.474 "r_mbytes_per_sec": 0,
00:28:45.475 "w_mbytes_per_sec": 0
00:28:45.475 },
00:28:45.475 "claimed": true,
00:28:45.475 "claim_type": "exclusive_write",
00:28:45.475 "zoned": false,
00:28:45.475 "supported_io_types": {
00:28:45.475 "read": true,
00:28:45.475 "write": true,
00:28:45.475 "unmap": true,
00:28:45.475 "flush": true,
00:28:45.475 "reset": true,
00:28:45.475 "nvme_admin": false,
00:28:45.475 "nvme_io": false,
00:28:45.475 "nvme_io_md": false,
00:28:45.475 "write_zeroes": true,
00:28:45.475 "zcopy": true,
00:28:45.475 "get_zone_info": false,
00:28:45.475 "zone_management": false,
00:28:45.475 "zone_append": false,
00:28:45.475 "compare": false,
00:28:45.475 "compare_and_write": false,
00:28:45.475 "abort": true,
00:28:45.475 "seek_hole": false,
00:28:45.475 "seek_data": false,
00:28:45.475 "copy": true,
00:28:45.475 "nvme_iov_md": false
00:28:45.475 },
00:28:45.475 "memory_domains": [
00:28:45.475 {
00:28:45.475 "dma_device_id": "system",
00:28:45.475 "dma_device_type": 1
00:28:45.475 },
00:28:45.475 {
00:28:45.475 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:28:45.475 "dma_device_type": 2
00:28:45.475 }
00:28:45.475 ],
00:28:45.475 "driver_specific": {
00:28:45.475 "passthru": {
00:28:45.475 "name": "pt1",
00:28:45.475 "base_bdev_name": "malloc1"
00:28:45.475 }
00:28:45.475 }
00:28:45.475 }'
00:28:45.475 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:28:45.475 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:28:45.475 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]]
00:28:45.475 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:28:45.733 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:28:45.733 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]]
00:28:45.733 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:28:45.733 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:28:45.733 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]]
00:28:45.733 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:28:45.733 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:28:45.733 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]]
00:28:45.733 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:28:45.733 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2
00:28:45.733 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:28:46.004 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:28:46.004 "name": "pt2",
00:28:46.004 "aliases": [
00:28:46.004 "00000000-0000-0000-0000-000000000002"
00:28:46.004 ],
00:28:46.004 "product_name": "passthru",
00:28:46.004 "block_size": 4096,
00:28:46.004 "num_blocks": 8192,
00:28:46.004 "uuid": "00000000-0000-0000-0000-000000000002",
00:28:46.004 "md_size": 32,
00:28:46.004 "md_interleave": false,
00:28:46.004 "dif_type": 0,
00:28:46.004 "assigned_rate_limits": {
00:28:46.004 "rw_ios_per_sec": 0,
00:28:46.004 "rw_mbytes_per_sec": 0,
00:28:46.004 "r_mbytes_per_sec": 0,
00:28:46.004 "w_mbytes_per_sec": 0
00:28:46.004 },
00:28:46.004 "claimed": true,
00:28:46.004 "claim_type": "exclusive_write",
00:28:46.004 "zoned": false,
00:28:46.004 "supported_io_types": {
00:28:46.004 "read": true,
00:28:46.004 "write": true,
00:28:46.004 "unmap": true,
00:28:46.004 "flush": true,
00:28:46.004 "reset": true,
00:28:46.004 "nvme_admin": false,
00:28:46.004 "nvme_io": false,
00:28:46.004 "nvme_io_md": false,
00:28:46.004 "write_zeroes": true,
00:28:46.004 "zcopy": true,
00:28:46.004 "get_zone_info": false,
00:28:46.004 "zone_management": false,
00:28:46.004 "zone_append": false,
00:28:46.004 "compare": false,
00:28:46.004 "compare_and_write": false,
00:28:46.004 "abort": true,
00:28:46.004 "seek_hole": false,
00:28:46.004 "seek_data": false,
00:28:46.004 "copy": true,
00:28:46.004 "nvme_iov_md": false
00:28:46.004 },
00:28:46.004 "memory_domains": [
00:28:46.004 {
00:28:46.004 "dma_device_id": "system",
00:28:46.004 "dma_device_type": 1
00:28:46.004 },
00:28:46.004 {
00:28:46.004 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:28:46.004 "dma_device_type": 2
00:28:46.004 }
00:28:46.004 ],
00:28:46.004 "driver_specific": {
00:28:46.004 "passthru": {
00:28:46.004 "name": "pt2",
00:28:46.004 "base_bdev_name": "malloc2"
00:28:46.004 }
00:28:46.004 }
00:28:46.004 }'
00:28:46.263 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:28:46.263 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:28:46.263 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]]
00:28:46.263 22:57:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:28:46.263 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:28:46.263 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]]
00:28:46.263 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:28:46.263 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:28:46.263 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]]
00:28:46.263 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:28:46.521 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:28:46.521 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]]
00:28:46.521 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:28:46.521 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid'
00:28:46.781 [2024-07-15 22:57:31.467472] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:28:46.781 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' ec82a565-351c-497e-9eb7-458dd6f57e28 '!=' ec82a565-351c-497e-9eb7-458dd6f57e28 ']'
00:28:46.781 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1
00:28:46.781 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in
00:28:46.781 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0
00:28:46.781 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1
00:28:47.040 [2024-07-15 22:57:31.715884] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1
00:28:47.040 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:28:47.040 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:28:47.040 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:28:47.040 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:28:47.040 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:28:47.040 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:28:47.040 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:28:47.040 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:28:47.040 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:28:47.040 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp
00:28:47.040 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:47.040 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:47.299 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:28:47.299 "name": "raid_bdev1",
00:28:47.299 "uuid": "ec82a565-351c-497e-9eb7-458dd6f57e28",
00:28:47.299 "strip_size_kb": 0,
00:28:47.299 "state": "online",
00:28:47.299 "raid_level": "raid1",
00:28:47.299 "superblock": true,
00:28:47.299 "num_base_bdevs": 2,
00:28:47.299 "num_base_bdevs_discovered": 1,
00:28:47.299 "num_base_bdevs_operational": 1,
00:28:47.299 "base_bdevs_list": [
00:28:47.299 {
00:28:47.299 "name": null,
00:28:47.299 "uuid": "00000000-0000-0000-0000-000000000000",
00:28:47.299 "is_configured": false,
00:28:47.299 "data_offset": 256,
00:28:47.299 "data_size": 7936
00:28:47.299 },
00:28:47.299 {
00:28:47.299 "name": "pt2",
00:28:47.299 "uuid": "00000000-0000-0000-0000-000000000002",
00:28:47.299 "is_configured": true,
00:28:47.299 "data_offset": 256,
00:28:47.299 "data_size": 7936
00:28:47.299 }
00:28:47.299 ]
00:28:47.299 }'
00:28:47.299 22:57:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:28:47.299 22:57:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x
00:28:47.868 22:57:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:28:48.126 [2024-07-15 22:57:32.794731] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:28:48.126 [2024-07-15 22:57:32.794759] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:28:48.126 [2024-07-15 22:57:32.794815] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:28:48.126 [2024-07-15 22:57:32.794859] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:28:48.126 [2024-07-15 22:57:32.794871] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17f25d0 name raid_bdev1, state offline
00:28:48.126 22:57:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:48.126 22:57:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # jq -r '.[]'
00:28:48.126 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # raid_bdev=
00:28:48.126 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']'
00:28:48.126 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 ))
00:28:48.126 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs ))
00:28:48.126 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2
00:28:48.386 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ ))
00:28:48.386 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs ))
00:28:48.386 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 ))
00:28:48.386 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 ))
00:28:48.386 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@518 -- # i=1
00:28:48.386 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
00:28:48.644 [2024-07-15 22:57:33.488540] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2
00:28:48.644 [2024-07-15 22:57:33.488585] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:28:48.644 [2024-07-15 22:57:33.488603] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17f0660
00:28:48.644 [2024-07-15 22:57:33.488616] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:28:48.644 [2024-07-15 22:57:33.490064] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:28:48.644 [2024-07-15 22:57:33.490089] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2
00:28:48.644 [2024-07-15 22:57:33.490136] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2
00:28:48.644 [2024-07-15 22:57:33.490162] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed
00:28:48.644 [2024-07-15 22:57:33.490240] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17f2d10
00:28:48.644 [2024-07-15 22:57:33.490257] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096
00:28:48.644 [2024-07-15 22:57:33.490311] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17f3560
00:28:48.644 [2024-07-15 22:57:33.490407] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17f2d10
00:28:48.644 [2024-07-15 22:57:33.490417] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17f2d10
00:28:48.644 [2024-07-15 22:57:33.490482] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:28:48.644 pt2
00:28:48.644 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:28:48.644 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:28:48.644 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:28:48.644 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:28:48.644 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:28:48.644 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:28:48.644 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:28:48.644 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:28:48.644 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:28:48.644 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp
00:28:48.644 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:48.644 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:48.903 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:28:48.903 "name": "raid_bdev1",
00:28:48.903 "uuid": "ec82a565-351c-497e-9eb7-458dd6f57e28",
00:28:48.903 "strip_size_kb": 0,
00:28:48.903 "state": "online",
00:28:48.903 "raid_level": "raid1",
00:28:48.903 "superblock": true,
00:28:48.903 "num_base_bdevs": 2,
00:28:48.903 "num_base_bdevs_discovered": 1,
00:28:48.903 "num_base_bdevs_operational": 1,
00:28:48.903 "base_bdevs_list": [
00:28:48.903 {
00:28:48.903 "name": null,
00:28:48.903 "uuid": "00000000-0000-0000-0000-000000000000",
00:28:48.903 "is_configured": false,
00:28:48.903 "data_offset": 256,
00:28:48.903 "data_size": 7936
00:28:48.903 },
00:28:48.903 {
00:28:48.903 "name": "pt2",
00:28:48.903 "uuid": "00000000-0000-0000-0000-000000000002",
00:28:48.903 "is_configured": true,
00:28:48.903 "data_offset": 256,
00:28:48.903 "data_size": 7936
00:28:48.903 }
00:28:48.903 ]
00:28:48.903 }'
00:28:48.903 22:57:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:28:48.903 22:57:33 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x
00:28:49.470 22:57:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:28:49.729 [2024-07-15 22:57:34.595456] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:28:49.729 [2024-07-15 22:57:34.595482] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:28:49.729 [2024-07-15 22:57:34.595543] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:28:49.729 [2024-07-15 22:57:34.595587] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:28:49.729 [2024-07-15 22:57:34.595599] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17f2d10 name raid_bdev1, state offline
00:28:49.729 22:57:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:49.729 22:57:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]'
00:28:49.987 22:57:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev=
00:28:49.987 22:57:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']'
00:28:49.987 22:57:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']'
00:28:49.987 22:57:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
00:28:50.246 [2024-07-15 22:57:35.084727] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1
00:28:50.246 [2024-07-15 22:57:35.084773] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:28:50.246 [2024-07-15 22:57:35.084791] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17f1760
00:28:50.246 [2024-07-15 22:57:35.084803] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:28:50.246 [2024-07-15 22:57:35.086246] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:28:50.246 [2024-07-15 22:57:35.086276] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1
00:28:50.246 [2024-07-15 22:57:35.086326] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1
00:28:50.246 [2024-07-15 22:57:35.086351] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed
00:28:50.246 [2024-07-15 22:57:35.086443] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2)
00:28:50.246 [2024-07-15 22:57:35.086456] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:28:50.246 [2024-07-15 22:57:35.086469] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17f3850 name raid_bdev1, state configuring
00:28:50.246 [2024-07-15 22:57:35.086492] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed
00:28:50.246 [2024-07-15 22:57:35.086546] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17f2850
00:28:50.246 [2024-07-15 22:57:35.086556] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096
00:28:50.246 [2024-07-15 22:57:35.086614] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17f33b0
00:28:50.246 [2024-07-15 22:57:35.086712] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17f2850
00:28:50.246 [2024-07-15 22:57:35.086722] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17f2850
00:28:50.246 [2024-07-15 22:57:35.086795] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:28:50.246 pt1
00:28:50.246 22:57:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']'
00:28:50.246 22:57:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:28:50.246 22:57:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:28:50.246 22:57:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:28:50.246 22:57:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:28:50.246 22:57:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:28:50.246 22:57:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:28:50.246 22:57:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:28:50.246 22:57:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:28:50.246 22:57:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:28:50.246 22:57:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp
00:28:50.246 22:57:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:50.246 22:57:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:50.507 22:57:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:28:50.507 "name": "raid_bdev1",
00:28:50.507 "uuid": "ec82a565-351c-497e-9eb7-458dd6f57e28",
00:28:50.507 "strip_size_kb": 0,
00:28:50.507 "state": "online",
00:28:50.507 "raid_level":
"raid1", 00:28:50.507 "superblock": true, 00:28:50.507 "num_base_bdevs": 2, 00:28:50.507 "num_base_bdevs_discovered": 1, 00:28:50.507 "num_base_bdevs_operational": 1, 00:28:50.507 "base_bdevs_list": [ 00:28:50.507 { 00:28:50.507 "name": null, 00:28:50.507 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:50.507 "is_configured": false, 00:28:50.507 "data_offset": 256, 00:28:50.507 "data_size": 7936 00:28:50.507 }, 00:28:50.507 { 00:28:50.507 "name": "pt2", 00:28:50.507 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:50.507 "is_configured": true, 00:28:50.507 "data_offset": 256, 00:28:50.507 "data_size": 7936 00:28:50.507 } 00:28:50.507 ] 00:28:50.507 }' 00:28:50.507 22:57:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:50.507 22:57:35 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:51.442 22:57:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:28:51.442 22:57:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:28:51.442 22:57:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:28:51.442 22:57:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:51.442 22:57:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:28:51.700 [2024-07-15 22:57:36.464639] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:51.700 22:57:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' ec82a565-351c-497e-9eb7-458dd6f57e28 '!=' ec82a565-351c-497e-9eb7-458dd6f57e28 ']' 
00:28:51.700 22:57:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 2851615 00:28:51.700 22:57:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@948 -- # '[' -z 2851615 ']' 00:28:51.700 22:57:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # kill -0 2851615 00:28:51.700 22:57:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # uname 00:28:51.700 22:57:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:51.700 22:57:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2851615 00:28:51.700 22:57:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:51.700 22:57:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:51.700 22:57:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2851615' 00:28:51.700 killing process with pid 2851615 00:28:51.700 22:57:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # kill 2851615 00:28:51.700 [2024-07-15 22:57:36.533034] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:51.700 [2024-07-15 22:57:36.533095] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:51.700 [2024-07-15 22:57:36.533144] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:51.700 [2024-07-15 22:57:36.533156] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17f2850 name raid_bdev1, state offline 00:28:51.700 22:57:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@972 -- # wait 2851615 00:28:51.700 [2024-07-15 22:57:36.559276] 
bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:51.958 22:57:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:28:51.958 00:28:51.958 real 0m15.761s 00:28:51.958 user 0m28.589s 00:28:51.958 sys 0m2.923s 00:28:51.958 22:57:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:51.958 22:57:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:51.958 ************************************ 00:28:51.958 END TEST raid_superblock_test_md_separate 00:28:51.958 ************************************ 00:28:51.958 22:57:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:51.958 22:57:36 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:28:51.958 22:57:36 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:28:51.958 22:57:36 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:28:51.958 22:57:36 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:51.958 22:57:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:51.958 ************************************ 00:28:51.958 START TEST raid_rebuild_test_sb_md_separate 00:28:51.958 ************************************ 00:28:51.958 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:28:51.958 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:28:51.958 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:28:51.958 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:28:51.958 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:28:51.958 22:57:36 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@572 -- # local verify=true 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate 
-- bdev/bdev_raid.sh@588 -- # strip_size=0 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=2853878 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 2853878 /var/tmp/spdk-raid.sock 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 2853878 ']' 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:52.217 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:52.217 22:57:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:52.217 [2024-07-15 22:57:36.932606] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:28:52.217 [2024-07-15 22:57:36.932677] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2853878 ] 00:28:52.217 I/O size of 3145728 is greater than zero copy threshold (65536). 00:28:52.217 Zero copy mechanism will not be used. 00:28:52.217 [2024-07-15 22:57:37.063697] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:52.475 [2024-07-15 22:57:37.166221] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:52.475 [2024-07-15 22:57:37.227857] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:52.475 [2024-07-15 22:57:37.227896] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:53.041 22:57:37 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:53.041 22:57:37 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:28:53.041 22:57:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:53.041 22:57:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:28:53.299 BaseBdev1_malloc 00:28:53.299 22:57:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:53.557 [2024-07-15 22:57:38.347709] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:53.557 [2024-07-15 22:57:38.347757] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:53.557 [2024-07-15 
22:57:38.347780] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22316d0 00:28:53.557 [2024-07-15 22:57:38.347793] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:53.557 [2024-07-15 22:57:38.349141] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:53.557 [2024-07-15 22:57:38.349168] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:53.557 BaseBdev1 00:28:53.557 22:57:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:53.557 22:57:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:28:53.816 BaseBdev2_malloc 00:28:53.816 22:57:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:28:54.074 [2024-07-15 22:57:38.854503] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:28:54.075 [2024-07-15 22:57:38.854549] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:54.075 [2024-07-15 22:57:38.854570] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23891f0 00:28:54.075 [2024-07-15 22:57:38.854583] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:54.075 [2024-07-15 22:57:38.855826] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:54.075 [2024-07-15 22:57:38.855852] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:54.075 BaseBdev2 00:28:54.075 22:57:38 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:28:54.334 spare_malloc 00:28:54.334 22:57:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:28:54.593 spare_delay 00:28:54.593 22:57:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:54.852 [2024-07-15 22:57:39.577770] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:54.852 [2024-07-15 22:57:39.577819] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:54.852 [2024-07-15 22:57:39.577841] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23857a0 00:28:54.852 [2024-07-15 22:57:39.577853] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:54.852 [2024-07-15 22:57:39.579154] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:54.852 [2024-07-15 22:57:39.579181] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:54.852 spare 00:28:54.852 22:57:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:28:55.110 [2024-07-15 22:57:39.830475] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:55.110 [2024-07-15 22:57:39.831751] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:55.110 [2024-07-15 22:57:39.831916] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x23861c0 00:28:55.110 [2024-07-15 22:57:39.831938] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:55.110 [2024-07-15 22:57:39.832013] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2297360 00:28:55.110 [2024-07-15 22:57:39.832140] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23861c0 00:28:55.110 [2024-07-15 22:57:39.832151] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23861c0 00:28:55.110 [2024-07-15 22:57:39.832220] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:55.110 22:57:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:55.110 22:57:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:55.110 22:57:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:55.110 22:57:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:55.110 22:57:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:55.110 22:57:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:55.110 22:57:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:55.110 22:57:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:55.111 22:57:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:55.111 22:57:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:55.111 22:57:39 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:55.111 22:57:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:55.368 22:57:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:55.368 "name": "raid_bdev1", 00:28:55.368 "uuid": "f733a47e-c56d-45e1-8f2f-ab7ccbda659d", 00:28:55.368 "strip_size_kb": 0, 00:28:55.368 "state": "online", 00:28:55.368 "raid_level": "raid1", 00:28:55.368 "superblock": true, 00:28:55.368 "num_base_bdevs": 2, 00:28:55.368 "num_base_bdevs_discovered": 2, 00:28:55.368 "num_base_bdevs_operational": 2, 00:28:55.368 "base_bdevs_list": [ 00:28:55.368 { 00:28:55.368 "name": "BaseBdev1", 00:28:55.368 "uuid": "dde37320-9bea-5231-8054-3927f352c367", 00:28:55.368 "is_configured": true, 00:28:55.368 "data_offset": 256, 00:28:55.368 "data_size": 7936 00:28:55.368 }, 00:28:55.368 { 00:28:55.368 "name": "BaseBdev2", 00:28:55.368 "uuid": "d647777c-050f-5bad-a72c-15ab71800294", 00:28:55.368 "is_configured": true, 00:28:55.368 "data_offset": 256, 00:28:55.368 "data_size": 7936 00:28:55.368 } 00:28:55.368 ] 00:28:55.368 }' 00:28:55.368 22:57:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:55.368 22:57:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:55.934 22:57:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:55.934 22:57:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:28:56.192 [2024-07-15 22:57:40.937688] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:56.192 
22:57:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:28:56.192 22:57:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:56.192 22:57:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:28:56.450 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:28:56.450 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:28:56.450 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:28:56.450 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:28:56.450 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:28:56.450 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:56.450 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:28:56.450 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:56.450 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:28:56.450 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:56.450 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:28:56.450 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:56.450 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:56.450 22:57:41 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:28:56.708 [2024-07-15 22:57:41.438800] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2297360 00:28:56.708 /dev/nbd0 00:28:56.708 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:56.708 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:56.708 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:28:56.708 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:28:56.708 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:56.708 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:56.708 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:28:56.708 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:28:56.708 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:56.708 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:56.708 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:56.708 1+0 records in 00:28:56.708 1+0 records out 00:28:56.708 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000258928 s, 15.8 MB/s 00:28:56.708 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:56.708 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:28:56.708 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:56.708 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:56.708 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:28:56.708 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:56.708 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:56.708 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:28:56.708 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:28:56.708 22:57:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:28:57.643 7936+0 records in 00:28:57.643 7936+0 records out 00:28:57.643 32505856 bytes (33 MB, 31 MiB) copied, 0.757098 s, 42.9 MB/s 00:28:57.643 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:28:57.643 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:57.643 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:28:57.643 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:57.643 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:28:57.643 22:57:42 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:57.643 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:57.643 [2024-07-15 22:57:42.530323] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:57.643 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:57.643 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:57.643 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:57.643 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:57.643 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:57.643 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:57.902 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:28:57.902 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:28:57.902 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:28:57.902 [2024-07-15 22:57:42.771004] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:57.902 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:57.902 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:57.902 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:57.902 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:57.902 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:57.902 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:57.902 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:57.902 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:57.902 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:57.902 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:57.902 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:57.902 22:57:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:58.161 22:57:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:58.161 "name": "raid_bdev1", 00:28:58.161 "uuid": "f733a47e-c56d-45e1-8f2f-ab7ccbda659d", 00:28:58.161 "strip_size_kb": 0, 00:28:58.161 "state": "online", 00:28:58.161 "raid_level": "raid1", 00:28:58.161 "superblock": true, 00:28:58.161 "num_base_bdevs": 2, 00:28:58.161 "num_base_bdevs_discovered": 1, 00:28:58.161 "num_base_bdevs_operational": 1, 00:28:58.161 "base_bdevs_list": [ 00:28:58.161 { 00:28:58.161 "name": null, 00:28:58.161 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:58.161 "is_configured": false, 00:28:58.161 "data_offset": 256, 00:28:58.161 "data_size": 7936 00:28:58.161 }, 00:28:58.161 { 00:28:58.161 "name": "BaseBdev2", 
00:28:58.161 "uuid": "d647777c-050f-5bad-a72c-15ab71800294", 00:28:58.161 "is_configured": true, 00:28:58.161 "data_offset": 256, 00:28:58.161 "data_size": 7936 00:28:58.161 } 00:28:58.161 ] 00:28:58.161 }' 00:28:58.161 22:57:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:58.161 22:57:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:58.727 22:57:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:58.984 [2024-07-15 22:57:43.853876] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:58.984 [2024-07-15 22:57:43.856225] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2230350 00:28:58.984 [2024-07-15 22:57:43.858557] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:58.984 22:57:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:29:00.356 22:57:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:00.356 22:57:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:00.356 22:57:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:00.356 22:57:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:00.356 22:57:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:00.356 22:57:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:00.356 22:57:44 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:00.356 22:57:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:00.356 "name": "raid_bdev1", 00:29:00.356 "uuid": "f733a47e-c56d-45e1-8f2f-ab7ccbda659d", 00:29:00.356 "strip_size_kb": 0, 00:29:00.356 "state": "online", 00:29:00.356 "raid_level": "raid1", 00:29:00.356 "superblock": true, 00:29:00.356 "num_base_bdevs": 2, 00:29:00.356 "num_base_bdevs_discovered": 2, 00:29:00.357 "num_base_bdevs_operational": 2, 00:29:00.357 "process": { 00:29:00.357 "type": "rebuild", 00:29:00.357 "target": "spare", 00:29:00.357 "progress": { 00:29:00.357 "blocks": 3072, 00:29:00.357 "percent": 38 00:29:00.357 } 00:29:00.357 }, 00:29:00.357 "base_bdevs_list": [ 00:29:00.357 { 00:29:00.357 "name": "spare", 00:29:00.357 "uuid": "01c9f154-b0cd-53e4-b33d-6f6d28b99918", 00:29:00.357 "is_configured": true, 00:29:00.357 "data_offset": 256, 00:29:00.357 "data_size": 7936 00:29:00.357 }, 00:29:00.357 { 00:29:00.357 "name": "BaseBdev2", 00:29:00.357 "uuid": "d647777c-050f-5bad-a72c-15ab71800294", 00:29:00.357 "is_configured": true, 00:29:00.357 "data_offset": 256, 00:29:00.357 "data_size": 7936 00:29:00.357 } 00:29:00.357 ] 00:29:00.357 }' 00:29:00.357 22:57:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:00.357 22:57:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:00.357 22:57:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:00.357 22:57:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:00.357 22:57:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev spare 00:29:00.615 [2024-07-15 22:57:45.460131] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:00.615 [2024-07-15 22:57:45.470821] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:00.615 [2024-07-15 22:57:45.470867] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:00.615 [2024-07-15 22:57:45.470883] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:00.615 [2024-07-15 22:57:45.470892] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:00.615 22:57:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:00.615 22:57:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:00.615 22:57:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:00.615 22:57:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:00.615 22:57:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:00.615 22:57:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:00.615 22:57:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:00.615 22:57:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:00.615 22:57:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:00.615 22:57:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:00.615 22:57:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:00.615 22:57:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:00.873 22:57:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:00.873 "name": "raid_bdev1", 00:29:00.873 "uuid": "f733a47e-c56d-45e1-8f2f-ab7ccbda659d", 00:29:00.873 "strip_size_kb": 0, 00:29:00.873 "state": "online", 00:29:00.873 "raid_level": "raid1", 00:29:00.873 "superblock": true, 00:29:00.873 "num_base_bdevs": 2, 00:29:00.873 "num_base_bdevs_discovered": 1, 00:29:00.873 "num_base_bdevs_operational": 1, 00:29:00.873 "base_bdevs_list": [ 00:29:00.873 { 00:29:00.873 "name": null, 00:29:00.873 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:00.873 "is_configured": false, 00:29:00.873 "data_offset": 256, 00:29:00.873 "data_size": 7936 00:29:00.873 }, 00:29:00.873 { 00:29:00.873 "name": "BaseBdev2", 00:29:00.873 "uuid": "d647777c-050f-5bad-a72c-15ab71800294", 00:29:00.873 "is_configured": true, 00:29:00.873 "data_offset": 256, 00:29:00.873 "data_size": 7936 00:29:00.873 } 00:29:00.873 ] 00:29:00.873 }' 00:29:00.873 22:57:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:00.873 22:57:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:01.451 22:57:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:01.451 22:57:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:01.451 22:57:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:01.451 22:57:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:01.451 22:57:46 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:01.452 22:57:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:01.452 22:57:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:01.709 22:57:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:01.709 "name": "raid_bdev1", 00:29:01.709 "uuid": "f733a47e-c56d-45e1-8f2f-ab7ccbda659d", 00:29:01.709 "strip_size_kb": 0, 00:29:01.709 "state": "online", 00:29:01.709 "raid_level": "raid1", 00:29:01.709 "superblock": true, 00:29:01.709 "num_base_bdevs": 2, 00:29:01.709 "num_base_bdevs_discovered": 1, 00:29:01.709 "num_base_bdevs_operational": 1, 00:29:01.709 "base_bdevs_list": [ 00:29:01.709 { 00:29:01.709 "name": null, 00:29:01.709 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:01.709 "is_configured": false, 00:29:01.709 "data_offset": 256, 00:29:01.709 "data_size": 7936 00:29:01.709 }, 00:29:01.709 { 00:29:01.709 "name": "BaseBdev2", 00:29:01.709 "uuid": "d647777c-050f-5bad-a72c-15ab71800294", 00:29:01.710 "is_configured": true, 00:29:01.710 "data_offset": 256, 00:29:01.710 "data_size": 7936 00:29:01.710 } 00:29:01.710 ] 00:29:01.710 }' 00:29:01.710 22:57:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:01.968 22:57:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:01.968 22:57:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:01.968 22:57:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:01.968 22:57:46 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:02.226 [2024-07-15 22:57:46.917701] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:02.226 [2024-07-15 22:57:46.920320] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2231280 00:29:02.226 [2024-07-15 22:57:46.921940] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:02.226 22:57:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:29:03.160 22:57:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:03.160 22:57:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:03.160 22:57:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:03.160 22:57:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:03.160 22:57:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:03.160 22:57:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:03.160 22:57:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:03.418 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:03.418 "name": "raid_bdev1", 00:29:03.418 "uuid": "f733a47e-c56d-45e1-8f2f-ab7ccbda659d", 00:29:03.418 "strip_size_kb": 0, 00:29:03.418 "state": "online", 00:29:03.418 "raid_level": "raid1", 00:29:03.418 "superblock": true, 00:29:03.418 "num_base_bdevs": 2, 
00:29:03.418 "num_base_bdevs_discovered": 2, 00:29:03.418 "num_base_bdevs_operational": 2, 00:29:03.418 "process": { 00:29:03.418 "type": "rebuild", 00:29:03.418 "target": "spare", 00:29:03.418 "progress": { 00:29:03.418 "blocks": 3072, 00:29:03.418 "percent": 38 00:29:03.418 } 00:29:03.418 }, 00:29:03.418 "base_bdevs_list": [ 00:29:03.418 { 00:29:03.418 "name": "spare", 00:29:03.418 "uuid": "01c9f154-b0cd-53e4-b33d-6f6d28b99918", 00:29:03.418 "is_configured": true, 00:29:03.418 "data_offset": 256, 00:29:03.418 "data_size": 7936 00:29:03.418 }, 00:29:03.418 { 00:29:03.418 "name": "BaseBdev2", 00:29:03.418 "uuid": "d647777c-050f-5bad-a72c-15ab71800294", 00:29:03.418 "is_configured": true, 00:29:03.418 "data_offset": 256, 00:29:03.418 "data_size": 7936 00:29:03.418 } 00:29:03.418 ] 00:29:03.418 }' 00:29:03.418 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:03.418 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:03.418 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:03.418 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:03.418 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:29:03.418 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:29:03.418 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:29:03.418 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:29:03.418 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:29:03.418 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 
-- # '[' 2 -gt 2 ']' 00:29:03.418 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=1116 00:29:03.418 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:03.418 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:03.418 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:03.418 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:03.418 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:03.418 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:03.418 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:03.418 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:03.676 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:03.676 "name": "raid_bdev1", 00:29:03.676 "uuid": "f733a47e-c56d-45e1-8f2f-ab7ccbda659d", 00:29:03.676 "strip_size_kb": 0, 00:29:03.676 "state": "online", 00:29:03.676 "raid_level": "raid1", 00:29:03.676 "superblock": true, 00:29:03.676 "num_base_bdevs": 2, 00:29:03.676 "num_base_bdevs_discovered": 2, 00:29:03.676 "num_base_bdevs_operational": 2, 00:29:03.676 "process": { 00:29:03.676 "type": "rebuild", 00:29:03.676 "target": "spare", 00:29:03.676 "progress": { 00:29:03.676 "blocks": 3840, 00:29:03.676 "percent": 48 00:29:03.676 } 00:29:03.676 }, 00:29:03.676 "base_bdevs_list": [ 00:29:03.676 { 00:29:03.676 "name": "spare", 00:29:03.676 "uuid": 
"01c9f154-b0cd-53e4-b33d-6f6d28b99918", 00:29:03.676 "is_configured": true, 00:29:03.676 "data_offset": 256, 00:29:03.676 "data_size": 7936 00:29:03.676 }, 00:29:03.676 { 00:29:03.676 "name": "BaseBdev2", 00:29:03.676 "uuid": "d647777c-050f-5bad-a72c-15ab71800294", 00:29:03.676 "is_configured": true, 00:29:03.676 "data_offset": 256, 00:29:03.676 "data_size": 7936 00:29:03.676 } 00:29:03.676 ] 00:29:03.676 }' 00:29:03.676 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:03.676 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:03.934 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:03.934 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:03.934 22:57:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:29:04.868 22:57:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:04.868 22:57:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:04.868 22:57:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:04.868 22:57:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:04.868 22:57:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:04.868 22:57:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:04.868 22:57:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:04.868 22:57:49 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:05.127 22:57:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:05.127 "name": "raid_bdev1", 00:29:05.127 "uuid": "f733a47e-c56d-45e1-8f2f-ab7ccbda659d", 00:29:05.127 "strip_size_kb": 0, 00:29:05.127 "state": "online", 00:29:05.127 "raid_level": "raid1", 00:29:05.127 "superblock": true, 00:29:05.127 "num_base_bdevs": 2, 00:29:05.127 "num_base_bdevs_discovered": 2, 00:29:05.127 "num_base_bdevs_operational": 2, 00:29:05.127 "process": { 00:29:05.127 "type": "rebuild", 00:29:05.127 "target": "spare", 00:29:05.127 "progress": { 00:29:05.127 "blocks": 7424, 00:29:05.127 "percent": 93 00:29:05.127 } 00:29:05.127 }, 00:29:05.127 "base_bdevs_list": [ 00:29:05.127 { 00:29:05.127 "name": "spare", 00:29:05.127 "uuid": "01c9f154-b0cd-53e4-b33d-6f6d28b99918", 00:29:05.127 "is_configured": true, 00:29:05.127 "data_offset": 256, 00:29:05.127 "data_size": 7936 00:29:05.127 }, 00:29:05.127 { 00:29:05.127 "name": "BaseBdev2", 00:29:05.127 "uuid": "d647777c-050f-5bad-a72c-15ab71800294", 00:29:05.127 "is_configured": true, 00:29:05.127 "data_offset": 256, 00:29:05.127 "data_size": 7936 00:29:05.127 } 00:29:05.127 ] 00:29:05.127 }' 00:29:05.127 22:57:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:05.127 22:57:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:05.127 22:57:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:05.127 22:57:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:05.127 22:57:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:29:05.385 [2024-07-15 22:57:50.046418] bdev_raid.c:2789:raid_bdev_process_thread_run: 
*DEBUG*: process completed on raid_bdev1 00:29:05.385 [2024-07-15 22:57:50.046478] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:29:05.385 [2024-07-15 22:57:50.046565] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:06.318 22:57:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:06.318 22:57:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:06.318 22:57:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:06.318 22:57:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:06.318 22:57:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:06.318 22:57:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:06.318 22:57:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:06.318 22:57:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:06.577 22:57:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:06.577 "name": "raid_bdev1", 00:29:06.577 "uuid": "f733a47e-c56d-45e1-8f2f-ab7ccbda659d", 00:29:06.577 "strip_size_kb": 0, 00:29:06.577 "state": "online", 00:29:06.577 "raid_level": "raid1", 00:29:06.577 "superblock": true, 00:29:06.577 "num_base_bdevs": 2, 00:29:06.577 "num_base_bdevs_discovered": 2, 00:29:06.577 "num_base_bdevs_operational": 2, 00:29:06.577 "base_bdevs_list": [ 00:29:06.577 { 00:29:06.577 "name": "spare", 00:29:06.577 "uuid": "01c9f154-b0cd-53e4-b33d-6f6d28b99918", 
00:29:06.577 "is_configured": true, 00:29:06.577 "data_offset": 256, 00:29:06.577 "data_size": 7936 00:29:06.577 }, 00:29:06.577 { 00:29:06.577 "name": "BaseBdev2", 00:29:06.577 "uuid": "d647777c-050f-5bad-a72c-15ab71800294", 00:29:06.577 "is_configured": true, 00:29:06.577 "data_offset": 256, 00:29:06.577 "data_size": 7936 00:29:06.577 } 00:29:06.577 ] 00:29:06.577 }' 00:29:06.577 22:57:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:06.577 22:57:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:29:06.577 22:57:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:06.577 22:57:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:29:06.577 22:57:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:29:06.577 22:57:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:06.577 22:57:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:06.577 22:57:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:06.577 22:57:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:06.577 22:57:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:06.577 22:57:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:06.577 22:57:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:07.145 22:57:51 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:07.145 "name": "raid_bdev1", 00:29:07.145 "uuid": "f733a47e-c56d-45e1-8f2f-ab7ccbda659d", 00:29:07.145 "strip_size_kb": 0, 00:29:07.145 "state": "online", 00:29:07.145 "raid_level": "raid1", 00:29:07.145 "superblock": true, 00:29:07.145 "num_base_bdevs": 2, 00:29:07.145 "num_base_bdevs_discovered": 2, 00:29:07.145 "num_base_bdevs_operational": 2, 00:29:07.145 "base_bdevs_list": [ 00:29:07.145 { 00:29:07.145 "name": "spare", 00:29:07.145 "uuid": "01c9f154-b0cd-53e4-b33d-6f6d28b99918", 00:29:07.145 "is_configured": true, 00:29:07.145 "data_offset": 256, 00:29:07.145 "data_size": 7936 00:29:07.145 }, 00:29:07.145 { 00:29:07.145 "name": "BaseBdev2", 00:29:07.145 "uuid": "d647777c-050f-5bad-a72c-15ab71800294", 00:29:07.145 "is_configured": true, 00:29:07.145 "data_offset": 256, 00:29:07.145 "data_size": 7936 00:29:07.145 } 00:29:07.145 ] 00:29:07.145 }' 00:29:07.145 22:57:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:07.145 22:57:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:07.145 22:57:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:07.145 22:57:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:07.145 22:57:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:07.145 22:57:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:07.145 22:57:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:07.145 22:57:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:07.145 22:57:52 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:07.145 22:57:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:07.145 22:57:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:07.145 22:57:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:07.145 22:57:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:07.145 22:57:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:07.145 22:57:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:07.145 22:57:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:07.405 22:57:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:07.405 "name": "raid_bdev1", 00:29:07.405 "uuid": "f733a47e-c56d-45e1-8f2f-ab7ccbda659d", 00:29:07.405 "strip_size_kb": 0, 00:29:07.405 "state": "online", 00:29:07.405 "raid_level": "raid1", 00:29:07.405 "superblock": true, 00:29:07.405 "num_base_bdevs": 2, 00:29:07.405 "num_base_bdevs_discovered": 2, 00:29:07.405 "num_base_bdevs_operational": 2, 00:29:07.405 "base_bdevs_list": [ 00:29:07.405 { 00:29:07.405 "name": "spare", 00:29:07.405 "uuid": "01c9f154-b0cd-53e4-b33d-6f6d28b99918", 00:29:07.405 "is_configured": true, 00:29:07.405 "data_offset": 256, 00:29:07.405 "data_size": 7936 00:29:07.405 }, 00:29:07.405 { 00:29:07.405 "name": "BaseBdev2", 00:29:07.405 "uuid": "d647777c-050f-5bad-a72c-15ab71800294", 00:29:07.405 "is_configured": true, 00:29:07.405 "data_offset": 256, 00:29:07.405 "data_size": 7936 00:29:07.405 } 00:29:07.405 ] 
00:29:07.405 }' 00:29:07.405 22:57:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:07.405 22:57:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:07.970 22:57:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:08.228 [2024-07-15 22:57:53.041772] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:08.228 [2024-07-15 22:57:53.041803] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:08.228 [2024-07-15 22:57:53.041864] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:08.228 [2024-07-15 22:57:53.041923] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:08.228 [2024-07-15 22:57:53.041942] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23861c0 name raid_bdev1, state offline 00:29:08.228 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:08.228 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:29:08.487 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:29:08.487 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:29:08.487 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:29:08.487 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:29:08.487 22:57:53 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:08.487 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:29:08.487 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:29:08.487 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:08.487 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:29:08.487 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:29:08.487 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:29:08.487 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:08.487 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:29:08.746 /dev/nbd0 00:29:08.746 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:29:08.746 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:29:08.746 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:29:08.746 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:29:08.746 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:08.746 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:08.746 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:29:08.746 22:57:53 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:29:08.746 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:08.746 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:08.746 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:08.746 1+0 records in 00:29:08.746 1+0 records out 00:29:08.746 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274911 s, 14.9 MB/s 00:29:08.746 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:08.746 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:29:08.746 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:08.746 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:08.746 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:29:08.746 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:08.746 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:08.746 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:29:09.005 /dev/nbd1 00:29:09.005 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:29:09.005 22:57:53 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:29:09.005 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:29:09.005 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:29:09.005 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:09.005 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:09.005 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:29:09.005 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:29:09.005 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:09.005 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:09.005 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:09.005 1+0 records in 00:29:09.005 1+0 records out 00:29:09.005 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000353249 s, 11.6 MB/s 00:29:09.005 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:09.005 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:29:09.005 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:09.005 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:09.005 22:57:53 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:29:09.005 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:09.005 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:09.005 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:29:09.005 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:29:09.005 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:09.005 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:09.005 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:09.005 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:29:09.005 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:09.005 22:57:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:29:09.264 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:09.264 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:09.264 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:09.264 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:09.264 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:09.264 22:57:54 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:09.522 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:29:09.522 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:29:09.522 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:09.522 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:29:09.780 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:29:09.780 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:29:09.780 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:29:09.780 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:09.780 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:09.780 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:29:09.780 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:29:09.780 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:29:09.780 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:29:09.780 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:10.038 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:10.038 [2024-07-15 22:57:54.924421] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:10.038 [2024-07-15 22:57:54.924466] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:10.038 [2024-07-15 22:57:54.924489] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23859d0 00:29:10.038 [2024-07-15 22:57:54.924502] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:10.038 [2024-07-15 22:57:54.925979] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:10.038 [2024-07-15 22:57:54.926007] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:10.038 [2024-07-15 22:57:54.926070] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:10.038 [2024-07-15 22:57:54.926100] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:10.038 [2024-07-15 22:57:54.926196] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:10.038 spare 00:29:10.296 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:10.296 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:10.296 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:10.296 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:10.296 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:10.296 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:29:10.296 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:10.296 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:10.296 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:10.296 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:10.296 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:10.296 22:57:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:10.296 [2024-07-15 22:57:55.026506] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22977c0 00:29:10.296 [2024-07-15 22:57:55.026526] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:10.296 [2024-07-15 22:57:55.026604] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2387cd0 00:29:10.296 [2024-07-15 22:57:55.026730] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22977c0 00:29:10.296 [2024-07-15 22:57:55.026741] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x22977c0 00:29:10.296 [2024-07-15 22:57:55.026819] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:10.296 22:57:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:10.296 "name": "raid_bdev1", 00:29:10.296 "uuid": "f733a47e-c56d-45e1-8f2f-ab7ccbda659d", 00:29:10.296 "strip_size_kb": 0, 00:29:10.296 "state": "online", 00:29:10.296 "raid_level": "raid1", 00:29:10.296 "superblock": true, 00:29:10.296 "num_base_bdevs": 2, 00:29:10.296 
"num_base_bdevs_discovered": 2, 00:29:10.296 "num_base_bdevs_operational": 2, 00:29:10.296 "base_bdevs_list": [ 00:29:10.296 { 00:29:10.296 "name": "spare", 00:29:10.296 "uuid": "01c9f154-b0cd-53e4-b33d-6f6d28b99918", 00:29:10.296 "is_configured": true, 00:29:10.296 "data_offset": 256, 00:29:10.296 "data_size": 7936 00:29:10.296 }, 00:29:10.296 { 00:29:10.296 "name": "BaseBdev2", 00:29:10.296 "uuid": "d647777c-050f-5bad-a72c-15ab71800294", 00:29:10.296 "is_configured": true, 00:29:10.296 "data_offset": 256, 00:29:10.296 "data_size": 7936 00:29:10.296 } 00:29:10.296 ] 00:29:10.296 }' 00:29:10.296 22:57:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:10.296 22:57:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:11.231 22:57:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:11.231 22:57:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:11.231 22:57:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:11.231 22:57:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:11.231 22:57:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:11.231 22:57:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:11.231 22:57:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:11.231 22:57:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:11.231 "name": "raid_bdev1", 00:29:11.231 "uuid": "f733a47e-c56d-45e1-8f2f-ab7ccbda659d", 00:29:11.231 
"strip_size_kb": 0, 00:29:11.231 "state": "online", 00:29:11.231 "raid_level": "raid1", 00:29:11.231 "superblock": true, 00:29:11.231 "num_base_bdevs": 2, 00:29:11.231 "num_base_bdevs_discovered": 2, 00:29:11.231 "num_base_bdevs_operational": 2, 00:29:11.231 "base_bdevs_list": [ 00:29:11.231 { 00:29:11.231 "name": "spare", 00:29:11.231 "uuid": "01c9f154-b0cd-53e4-b33d-6f6d28b99918", 00:29:11.231 "is_configured": true, 00:29:11.231 "data_offset": 256, 00:29:11.231 "data_size": 7936 00:29:11.231 }, 00:29:11.231 { 00:29:11.231 "name": "BaseBdev2", 00:29:11.231 "uuid": "d647777c-050f-5bad-a72c-15ab71800294", 00:29:11.231 "is_configured": true, 00:29:11.231 "data_offset": 256, 00:29:11.231 "data_size": 7936 00:29:11.231 } 00:29:11.231 ] 00:29:11.231 }' 00:29:11.231 22:57:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:11.231 22:57:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:11.231 22:57:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:11.488 22:57:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:11.488 22:57:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:11.488 22:57:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:29:12.054 22:57:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:29:12.054 22:57:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:29:12.054 [2024-07-15 22:57:56.933867] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:12.054 22:57:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:12.054 22:57:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:12.055 22:57:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:12.055 22:57:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:12.055 22:57:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:12.055 22:57:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:12.055 22:57:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:12.055 22:57:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:12.055 22:57:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:12.055 22:57:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:12.055 22:57:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:12.055 22:57:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:12.619 22:57:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:12.619 "name": "raid_bdev1", 00:29:12.620 "uuid": "f733a47e-c56d-45e1-8f2f-ab7ccbda659d", 00:29:12.620 "strip_size_kb": 0, 00:29:12.620 "state": "online", 00:29:12.620 "raid_level": "raid1", 00:29:12.620 "superblock": true, 00:29:12.620 
"num_base_bdevs": 2, 00:29:12.620 "num_base_bdevs_discovered": 1, 00:29:12.620 "num_base_bdevs_operational": 1, 00:29:12.620 "base_bdevs_list": [ 00:29:12.620 { 00:29:12.620 "name": null, 00:29:12.620 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:12.620 "is_configured": false, 00:29:12.620 "data_offset": 256, 00:29:12.620 "data_size": 7936 00:29:12.620 }, 00:29:12.620 { 00:29:12.620 "name": "BaseBdev2", 00:29:12.620 "uuid": "d647777c-050f-5bad-a72c-15ab71800294", 00:29:12.620 "is_configured": true, 00:29:12.620 "data_offset": 256, 00:29:12.620 "data_size": 7936 00:29:12.620 } 00:29:12.620 ] 00:29:12.620 }' 00:29:12.620 22:57:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:12.620 22:57:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:13.557 22:57:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:13.816 [2024-07-15 22:57:58.698578] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:13.816 [2024-07-15 22:57:58.698744] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:13.816 [2024-07-15 22:57:58.698762] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:29:13.816 [2024-07-15 22:57:58.698790] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:13.816 [2024-07-15 22:57:58.701000] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22301d0 00:29:13.816 [2024-07-15 22:57:58.702365] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:14.075 22:57:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:29:15.013 22:57:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:15.013 22:57:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:15.013 22:57:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:15.013 22:57:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:15.013 22:57:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:15.013 22:57:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:15.013 22:57:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:15.272 22:57:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:15.272 "name": "raid_bdev1", 00:29:15.272 "uuid": "f733a47e-c56d-45e1-8f2f-ab7ccbda659d", 00:29:15.272 "strip_size_kb": 0, 00:29:15.272 "state": "online", 00:29:15.272 "raid_level": "raid1", 00:29:15.272 "superblock": true, 00:29:15.272 "num_base_bdevs": 2, 00:29:15.272 "num_base_bdevs_discovered": 2, 00:29:15.272 "num_base_bdevs_operational": 2, 00:29:15.272 "process": { 00:29:15.272 "type": "rebuild", 00:29:15.272 
"target": "spare", 00:29:15.272 "progress": { 00:29:15.272 "blocks": 3072, 00:29:15.272 "percent": 38 00:29:15.272 } 00:29:15.272 }, 00:29:15.272 "base_bdevs_list": [ 00:29:15.272 { 00:29:15.272 "name": "spare", 00:29:15.272 "uuid": "01c9f154-b0cd-53e4-b33d-6f6d28b99918", 00:29:15.272 "is_configured": true, 00:29:15.272 "data_offset": 256, 00:29:15.272 "data_size": 7936 00:29:15.272 }, 00:29:15.272 { 00:29:15.272 "name": "BaseBdev2", 00:29:15.272 "uuid": "d647777c-050f-5bad-a72c-15ab71800294", 00:29:15.272 "is_configured": true, 00:29:15.272 "data_offset": 256, 00:29:15.272 "data_size": 7936 00:29:15.272 } 00:29:15.272 ] 00:29:15.272 }' 00:29:15.272 22:57:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:15.272 22:58:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:15.272 22:58:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:15.272 22:58:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:15.272 22:58:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:15.840 [2024-07-15 22:58:00.565047] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:15.840 [2024-07-15 22:58:00.617603] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:15.840 [2024-07-15 22:58:00.617652] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:15.840 [2024-07-15 22:58:00.617673] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:15.840 [2024-07-15 22:58:00.617682] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 
00:29:15.840 22:58:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:15.840 22:58:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:15.840 22:58:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:15.840 22:58:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:15.840 22:58:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:15.840 22:58:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:15.840 22:58:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:15.840 22:58:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:15.840 22:58:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:15.840 22:58:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:15.840 22:58:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:15.840 22:58:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:16.099 22:58:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:16.099 "name": "raid_bdev1", 00:29:16.099 "uuid": "f733a47e-c56d-45e1-8f2f-ab7ccbda659d", 00:29:16.099 "strip_size_kb": 0, 00:29:16.099 "state": "online", 00:29:16.099 "raid_level": "raid1", 00:29:16.099 "superblock": true, 00:29:16.099 "num_base_bdevs": 2, 00:29:16.099 "num_base_bdevs_discovered": 1, 
00:29:16.099 "num_base_bdevs_operational": 1, 00:29:16.099 "base_bdevs_list": [ 00:29:16.099 { 00:29:16.099 "name": null, 00:29:16.099 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:16.099 "is_configured": false, 00:29:16.099 "data_offset": 256, 00:29:16.099 "data_size": 7936 00:29:16.099 }, 00:29:16.099 { 00:29:16.099 "name": "BaseBdev2", 00:29:16.099 "uuid": "d647777c-050f-5bad-a72c-15ab71800294", 00:29:16.099 "is_configured": true, 00:29:16.099 "data_offset": 256, 00:29:16.099 "data_size": 7936 00:29:16.099 } 00:29:16.099 ] 00:29:16.099 }' 00:29:16.099 22:58:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:16.099 22:58:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:17.036 22:58:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:17.295 [2024-07-15 22:58:02.029704] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:17.295 [2024-07-15 22:58:02.029762] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:17.295 [2024-07-15 22:58:02.029785] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23ca810 00:29:17.295 [2024-07-15 22:58:02.029799] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:17.295 [2024-07-15 22:58:02.030052] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:17.295 [2024-07-15 22:58:02.030070] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:17.295 [2024-07-15 22:58:02.030135] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:17.295 [2024-07-15 22:58:02.030148] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) 
smaller than existing raid bdev raid_bdev1 (5) 00:29:17.295 [2024-07-15 22:58:02.030160] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:29:17.295 [2024-07-15 22:58:02.030179] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:17.295 [2024-07-15 22:58:02.032731] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2230980 00:29:17.295 [2024-07-15 22:58:02.034151] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:17.295 spare 00:29:17.295 22:58:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:29:18.228 22:58:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:18.228 22:58:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:18.228 22:58:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:18.228 22:58:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:18.228 22:58:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:18.228 22:58:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:18.228 22:58:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:18.491 22:58:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:18.491 "name": "raid_bdev1", 00:29:18.491 "uuid": "f733a47e-c56d-45e1-8f2f-ab7ccbda659d", 00:29:18.491 "strip_size_kb": 0, 00:29:18.491 "state": "online", 00:29:18.491 "raid_level": "raid1", 00:29:18.491 "superblock": 
true, 00:29:18.491 "num_base_bdevs": 2, 00:29:18.491 "num_base_bdevs_discovered": 2, 00:29:18.491 "num_base_bdevs_operational": 2, 00:29:18.491 "process": { 00:29:18.491 "type": "rebuild", 00:29:18.491 "target": "spare", 00:29:18.491 "progress": { 00:29:18.491 "blocks": 2816, 00:29:18.491 "percent": 35 00:29:18.491 } 00:29:18.491 }, 00:29:18.491 "base_bdevs_list": [ 00:29:18.491 { 00:29:18.491 "name": "spare", 00:29:18.491 "uuid": "01c9f154-b0cd-53e4-b33d-6f6d28b99918", 00:29:18.491 "is_configured": true, 00:29:18.491 "data_offset": 256, 00:29:18.491 "data_size": 7936 00:29:18.491 }, 00:29:18.491 { 00:29:18.491 "name": "BaseBdev2", 00:29:18.491 "uuid": "d647777c-050f-5bad-a72c-15ab71800294", 00:29:18.491 "is_configured": true, 00:29:18.491 "data_offset": 256, 00:29:18.491 "data_size": 7936 00:29:18.491 } 00:29:18.491 ] 00:29:18.491 }' 00:29:18.491 22:58:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:18.491 22:58:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:18.491 22:58:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:18.491 22:58:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:18.491 22:58:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:19.429 [2024-07-15 22:58:04.048790] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:19.429 [2024-07-15 22:58:04.049738] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:19.429 [2024-07-15 22:58:04.049783] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:19.429 [2024-07-15 22:58:04.049799] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:19.429 [2024-07-15 22:58:04.049808] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:19.429 22:58:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:19.429 22:58:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:19.429 22:58:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:19.429 22:58:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:19.429 22:58:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:19.429 22:58:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:19.429 22:58:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:19.429 22:58:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:19.429 22:58:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:19.429 22:58:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:19.429 22:58:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:19.429 22:58:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:19.429 22:58:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:19.429 "name": "raid_bdev1", 00:29:19.429 "uuid": "f733a47e-c56d-45e1-8f2f-ab7ccbda659d", 
00:29:19.429 "strip_size_kb": 0, 00:29:19.429 "state": "online", 00:29:19.429 "raid_level": "raid1", 00:29:19.429 "superblock": true, 00:29:19.429 "num_base_bdevs": 2, 00:29:19.429 "num_base_bdevs_discovered": 1, 00:29:19.429 "num_base_bdevs_operational": 1, 00:29:19.429 "base_bdevs_list": [ 00:29:19.429 { 00:29:19.429 "name": null, 00:29:19.429 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:19.429 "is_configured": false, 00:29:19.429 "data_offset": 256, 00:29:19.429 "data_size": 7936 00:29:19.429 }, 00:29:19.429 { 00:29:19.429 "name": "BaseBdev2", 00:29:19.429 "uuid": "d647777c-050f-5bad-a72c-15ab71800294", 00:29:19.429 "is_configured": true, 00:29:19.429 "data_offset": 256, 00:29:19.429 "data_size": 7936 00:29:19.429 } 00:29:19.429 ] 00:29:19.429 }' 00:29:19.429 22:58:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:19.429 22:58:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:20.365 22:58:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:20.365 22:58:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:20.365 22:58:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:20.365 22:58:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:20.365 22:58:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:20.365 22:58:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:20.365 22:58:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:20.365 22:58:05 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:20.365 "name": "raid_bdev1", 00:29:20.365 "uuid": "f733a47e-c56d-45e1-8f2f-ab7ccbda659d", 00:29:20.365 "strip_size_kb": 0, 00:29:20.365 "state": "online", 00:29:20.365 "raid_level": "raid1", 00:29:20.365 "superblock": true, 00:29:20.365 "num_base_bdevs": 2, 00:29:20.365 "num_base_bdevs_discovered": 1, 00:29:20.365 "num_base_bdevs_operational": 1, 00:29:20.365 "base_bdevs_list": [ 00:29:20.365 { 00:29:20.365 "name": null, 00:29:20.365 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:20.365 "is_configured": false, 00:29:20.365 "data_offset": 256, 00:29:20.365 "data_size": 7936 00:29:20.365 }, 00:29:20.365 { 00:29:20.365 "name": "BaseBdev2", 00:29:20.365 "uuid": "d647777c-050f-5bad-a72c-15ab71800294", 00:29:20.365 "is_configured": true, 00:29:20.365 "data_offset": 256, 00:29:20.365 "data_size": 7936 00:29:20.365 } 00:29:20.365 ] 00:29:20.365 }' 00:29:20.365 22:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:20.365 22:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:20.365 22:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:20.622 22:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:20.622 22:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:29:20.904 22:58:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:21.470 [2024-07-15 22:58:06.283245] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on BaseBdev1_malloc 00:29:21.470 [2024-07-15 22:58:06.283300] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:21.470 [2024-07-15 22:58:06.283328] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2231900 00:29:21.470 [2024-07-15 22:58:06.283340] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:21.470 [2024-07-15 22:58:06.283552] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:21.470 [2024-07-15 22:58:06.283568] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:21.470 [2024-07-15 22:58:06.283618] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:29:21.470 [2024-07-15 22:58:06.283631] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:21.470 [2024-07-15 22:58:06.283642] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:21.470 BaseBdev1 00:29:21.470 22:58:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:29:22.841 22:58:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:22.841 22:58:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:22.841 22:58:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:22.841 22:58:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:22.841 22:58:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:22.841 22:58:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:22.841 
22:58:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:22.841 22:58:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:22.841 22:58:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:22.841 22:58:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:22.841 22:58:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:22.841 22:58:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:23.100 22:58:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:23.100 "name": "raid_bdev1", 00:29:23.100 "uuid": "f733a47e-c56d-45e1-8f2f-ab7ccbda659d", 00:29:23.100 "strip_size_kb": 0, 00:29:23.100 "state": "online", 00:29:23.100 "raid_level": "raid1", 00:29:23.100 "superblock": true, 00:29:23.100 "num_base_bdevs": 2, 00:29:23.100 "num_base_bdevs_discovered": 1, 00:29:23.100 "num_base_bdevs_operational": 1, 00:29:23.100 "base_bdevs_list": [ 00:29:23.100 { 00:29:23.100 "name": null, 00:29:23.100 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:23.100 "is_configured": false, 00:29:23.100 "data_offset": 256, 00:29:23.100 "data_size": 7936 00:29:23.100 }, 00:29:23.100 { 00:29:23.100 "name": "BaseBdev2", 00:29:23.100 "uuid": "d647777c-050f-5bad-a72c-15ab71800294", 00:29:23.100 "is_configured": true, 00:29:23.100 "data_offset": 256, 00:29:23.100 "data_size": 7936 00:29:23.100 } 00:29:23.100 ] 00:29:23.100 }' 00:29:23.100 22:58:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:23.100 22:58:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 
00:29:23.666 22:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:23.666 22:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:23.666 22:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:23.666 22:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:23.666 22:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:23.666 22:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:23.666 22:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:23.925 22:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:23.925 "name": "raid_bdev1", 00:29:23.925 "uuid": "f733a47e-c56d-45e1-8f2f-ab7ccbda659d", 00:29:23.925 "strip_size_kb": 0, 00:29:23.925 "state": "online", 00:29:23.925 "raid_level": "raid1", 00:29:23.925 "superblock": true, 00:29:23.925 "num_base_bdevs": 2, 00:29:23.925 "num_base_bdevs_discovered": 1, 00:29:23.925 "num_base_bdevs_operational": 1, 00:29:23.925 "base_bdevs_list": [ 00:29:23.925 { 00:29:23.925 "name": null, 00:29:23.925 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:23.925 "is_configured": false, 00:29:23.925 "data_offset": 256, 00:29:23.925 "data_size": 7936 00:29:23.925 }, 00:29:23.925 { 00:29:23.925 "name": "BaseBdev2", 00:29:23.925 "uuid": "d647777c-050f-5bad-a72c-15ab71800294", 00:29:23.925 "is_configured": true, 00:29:23.925 "data_offset": 256, 00:29:23.925 "data_size": 7936 00:29:23.925 } 00:29:23.925 ] 00:29:23.925 }' 00:29:23.925 22:58:08 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:23.925 22:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:23.925 22:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:24.184 22:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:24.184 22:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:24.184 22:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:29:24.184 22:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:24.184 22:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:24.184 22:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:24.184 22:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:24.184 22:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:24.184 22:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:24.184 22:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:29:24.184 22:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:24.184 22:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:24.185 22:58:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:24.185 [2024-07-15 22:58:09.066646] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:24.185 [2024-07-15 22:58:09.066779] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:24.185 [2024-07-15 22:58:09.066796] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:24.185 request: 00:29:24.185 { 00:29:24.185 "base_bdev": "BaseBdev1", 00:29:24.185 "raid_bdev": "raid_bdev1", 00:29:24.185 "method": "bdev_raid_add_base_bdev", 00:29:24.185 "req_id": 1 00:29:24.185 } 00:29:24.185 Got JSON-RPC error response 00:29:24.185 response: 00:29:24.185 { 00:29:24.185 "code": -22, 00:29:24.185 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:29:24.185 } 00:29:24.185 22:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # es=1 00:29:24.185 22:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:29:24.185 22:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:29:24.185 22:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:29:24.185 22:58:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # 
sleep 1 00:29:25.560 22:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:25.560 22:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:25.560 22:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:25.560 22:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:25.560 22:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:25.560 22:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:25.560 22:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:25.560 22:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:25.560 22:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:25.560 22:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:25.560 22:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:25.560 22:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:25.560 22:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:25.560 "name": "raid_bdev1", 00:29:25.560 "uuid": "f733a47e-c56d-45e1-8f2f-ab7ccbda659d", 00:29:25.560 "strip_size_kb": 0, 00:29:25.560 "state": "online", 00:29:25.560 "raid_level": "raid1", 00:29:25.560 "superblock": true, 00:29:25.560 "num_base_bdevs": 2, 00:29:25.560 "num_base_bdevs_discovered": 1, 
00:29:25.560 "num_base_bdevs_operational": 1, 00:29:25.560 "base_bdevs_list": [ 00:29:25.560 { 00:29:25.560 "name": null, 00:29:25.560 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:25.560 "is_configured": false, 00:29:25.560 "data_offset": 256, 00:29:25.560 "data_size": 7936 00:29:25.560 }, 00:29:25.560 { 00:29:25.560 "name": "BaseBdev2", 00:29:25.560 "uuid": "d647777c-050f-5bad-a72c-15ab71800294", 00:29:25.560 "is_configured": true, 00:29:25.560 "data_offset": 256, 00:29:25.560 "data_size": 7936 00:29:25.560 } 00:29:25.560 ] 00:29:25.560 }' 00:29:25.560 22:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:25.560 22:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:26.127 22:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:26.127 22:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:26.127 22:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:26.127 22:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:26.127 22:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:26.127 22:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:26.127 22:58:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:26.384 22:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:26.384 "name": "raid_bdev1", 00:29:26.384 "uuid": "f733a47e-c56d-45e1-8f2f-ab7ccbda659d", 00:29:26.384 "strip_size_kb": 0, 00:29:26.384 
"state": "online", 00:29:26.384 "raid_level": "raid1", 00:29:26.384 "superblock": true, 00:29:26.384 "num_base_bdevs": 2, 00:29:26.384 "num_base_bdevs_discovered": 1, 00:29:26.384 "num_base_bdevs_operational": 1, 00:29:26.384 "base_bdevs_list": [ 00:29:26.384 { 00:29:26.384 "name": null, 00:29:26.384 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:26.384 "is_configured": false, 00:29:26.384 "data_offset": 256, 00:29:26.384 "data_size": 7936 00:29:26.384 }, 00:29:26.384 { 00:29:26.384 "name": "BaseBdev2", 00:29:26.384 "uuid": "d647777c-050f-5bad-a72c-15ab71800294", 00:29:26.384 "is_configured": true, 00:29:26.384 "data_offset": 256, 00:29:26.384 "data_size": 7936 00:29:26.384 } 00:29:26.384 ] 00:29:26.384 }' 00:29:26.384 22:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:26.384 22:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:26.384 22:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:26.384 22:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:26.384 22:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 2853878 00:29:26.384 22:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 2853878 ']' 00:29:26.384 22:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 2853878 00:29:26.384 22:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:29:26.384 22:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:26.384 22:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2853878 00:29:26.642 22:58:11 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:26.642 22:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:26.642 22:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2853878' 00:29:26.642 killing process with pid 2853878 00:29:26.642 22:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 2853878 00:29:26.642 Received shutdown signal, test time was about 60.000000 seconds 00:29:26.642 00:29:26.642 Latency(us) 00:29:26.642 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:26.642 =================================================================================================================== 00:29:26.642 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:29:26.642 [2024-07-15 22:58:11.322578] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:26.642 [2024-07-15 22:58:11.322668] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:26.642 [2024-07-15 22:58:11.322714] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:26.642 22:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 2853878 00:29:26.642 [2024-07-15 22:58:11.322727] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22977c0 name raid_bdev1, state offline 00:29:26.642 [2024-07-15 22:58:11.355986] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:26.930 22:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:29:26.930 00:29:26.930 real 0m34.699s 00:29:26.930 user 0m55.142s 00:29:26.930 sys 0m5.559s 00:29:26.930 22:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 
00:29:26.930 22:58:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:26.930 ************************************ 00:29:26.930 END TEST raid_rebuild_test_sb_md_separate 00:29:26.930 ************************************ 00:29:26.930 22:58:11 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:26.930 22:58:11 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:29:26.930 22:58:11 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:29:26.930 22:58:11 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:29:26.930 22:58:11 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:26.930 22:58:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:26.930 ************************************ 00:29:26.930 START TEST raid_state_function_test_sb_md_interleaved 00:29:26.930 ************************************ 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # local strip_size 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:29:26.930 22:58:11 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # raid_pid=2858865 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2858865' 00:29:26.930 Process raid pid: 2858865 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 2858865 /var/tmp/spdk-raid.sock 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 2858865 ']' 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:26.930 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:26.930 22:58:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:26.930 [2024-07-15 22:58:11.723436] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:29:26.930 [2024-07-15 22:58:11.723502] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:29:27.187 [2024-07-15 22:58:11.853775] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:27.187 [2024-07-15 22:58:11.960024] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:27.187 [2024-07-15 22:58:12.024862] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:27.187 [2024-07-15 22:58:12.024899] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:27.752 22:58:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:27.752 22:58:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:29:27.752 22:58:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:28.011 [2024-07-15 22:58:12.880300] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:28.011 [2024-07-15 22:58:12.880340] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:28.011 [2024-07-15 22:58:12.880351] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:28.011 [2024-07-15 22:58:12.880363] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:28.011 22:58:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:28.011 22:58:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:28.011 22:58:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:28.011 22:58:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:28.011 22:58:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:28.011 22:58:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:28.011 22:58:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:28.011 22:58:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:28.011 22:58:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:28.011 22:58:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:28.011 22:58:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:28.011 22:58:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:28.269 22:58:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:28.269 "name": "Existed_Raid", 00:29:28.269 "uuid": "f89afbf6-6ae7-4178-8f82-f6501a58f998", 00:29:28.269 "strip_size_kb": 0, 00:29:28.269 "state": "configuring", 00:29:28.269 "raid_level": "raid1", 00:29:28.269 "superblock": true, 00:29:28.269 "num_base_bdevs": 2, 00:29:28.269 "num_base_bdevs_discovered": 0, 00:29:28.269 "num_base_bdevs_operational": 2, 00:29:28.269 "base_bdevs_list": [ 00:29:28.269 { 
00:29:28.269 "name": "BaseBdev1", 00:29:28.269 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:28.269 "is_configured": false, 00:29:28.269 "data_offset": 0, 00:29:28.269 "data_size": 0 00:29:28.269 }, 00:29:28.269 { 00:29:28.269 "name": "BaseBdev2", 00:29:28.269 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:28.269 "is_configured": false, 00:29:28.269 "data_offset": 0, 00:29:28.269 "data_size": 0 00:29:28.269 } 00:29:28.269 ] 00:29:28.269 }' 00:29:28.269 22:58:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:28.269 22:58:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:29.201 22:58:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:29.201 [2024-07-15 22:58:13.975059] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:29.201 [2024-07-15 22:58:13.975089] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe04a80 name Existed_Raid, state configuring 00:29:29.201 22:58:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:29.459 [2024-07-15 22:58:14.223734] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:29.459 [2024-07-15 22:58:14.223761] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:29.459 [2024-07-15 22:58:14.223770] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:29.459 [2024-07-15 22:58:14.223781] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:29.459 
22:58:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:29:29.718 [2024-07-15 22:58:14.482273] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:29.718 BaseBdev1 00:29:29.718 22:58:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:29:29.718 22:58:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:29:29.718 22:58:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:29.718 22:58:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:29:29.718 22:58:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:29.718 22:58:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:29.718 22:58:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:29.977 22:58:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:29:30.236 [ 00:29:30.236 { 00:29:30.236 "name": "BaseBdev1", 00:29:30.236 "aliases": [ 00:29:30.236 "81ebc727-dc44-4792-9fde-1f28435823d5" 00:29:30.236 ], 00:29:30.236 "product_name": "Malloc disk", 00:29:30.236 "block_size": 4128, 00:29:30.236 "num_blocks": 8192, 00:29:30.236 "uuid": "81ebc727-dc44-4792-9fde-1f28435823d5", 00:29:30.236 "md_size": 32, 00:29:30.236 
"md_interleave": true, 00:29:30.236 "dif_type": 0, 00:29:30.236 "assigned_rate_limits": { 00:29:30.236 "rw_ios_per_sec": 0, 00:29:30.236 "rw_mbytes_per_sec": 0, 00:29:30.236 "r_mbytes_per_sec": 0, 00:29:30.236 "w_mbytes_per_sec": 0 00:29:30.236 }, 00:29:30.236 "claimed": true, 00:29:30.236 "claim_type": "exclusive_write", 00:29:30.236 "zoned": false, 00:29:30.236 "supported_io_types": { 00:29:30.236 "read": true, 00:29:30.236 "write": true, 00:29:30.236 "unmap": true, 00:29:30.236 "flush": true, 00:29:30.236 "reset": true, 00:29:30.236 "nvme_admin": false, 00:29:30.236 "nvme_io": false, 00:29:30.236 "nvme_io_md": false, 00:29:30.236 "write_zeroes": true, 00:29:30.236 "zcopy": true, 00:29:30.236 "get_zone_info": false, 00:29:30.236 "zone_management": false, 00:29:30.236 "zone_append": false, 00:29:30.236 "compare": false, 00:29:30.236 "compare_and_write": false, 00:29:30.236 "abort": true, 00:29:30.236 "seek_hole": false, 00:29:30.236 "seek_data": false, 00:29:30.236 "copy": true, 00:29:30.236 "nvme_iov_md": false 00:29:30.236 }, 00:29:30.236 "memory_domains": [ 00:29:30.236 { 00:29:30.236 "dma_device_id": "system", 00:29:30.236 "dma_device_type": 1 00:29:30.236 }, 00:29:30.236 { 00:29:30.236 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:30.236 "dma_device_type": 2 00:29:30.236 } 00:29:30.236 ], 00:29:30.236 "driver_specific": {} 00:29:30.236 } 00:29:30.236 ] 00:29:30.236 22:58:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:29:30.236 22:58:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:30.236 22:58:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:30.236 22:58:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:30.236 22:58:14 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:30.236 22:58:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:30.236 22:58:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:30.236 22:58:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:30.236 22:58:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:30.236 22:58:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:30.236 22:58:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:30.236 22:58:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:30.236 22:58:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:30.494 22:58:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:30.494 "name": "Existed_Raid", 00:29:30.494 "uuid": "00d2f518-4828-4ade-9f28-a168036da273", 00:29:30.494 "strip_size_kb": 0, 00:29:30.494 "state": "configuring", 00:29:30.494 "raid_level": "raid1", 00:29:30.494 "superblock": true, 00:29:30.494 "num_base_bdevs": 2, 00:29:30.494 "num_base_bdevs_discovered": 1, 00:29:30.494 "num_base_bdevs_operational": 2, 00:29:30.494 "base_bdevs_list": [ 00:29:30.494 { 00:29:30.494 "name": "BaseBdev1", 00:29:30.494 "uuid": "81ebc727-dc44-4792-9fde-1f28435823d5", 00:29:30.494 "is_configured": true, 00:29:30.494 "data_offset": 256, 00:29:30.494 "data_size": 7936 00:29:30.494 }, 
00:29:30.494 { 00:29:30.494 "name": "BaseBdev2", 00:29:30.494 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:30.494 "is_configured": false, 00:29:30.494 "data_offset": 0, 00:29:30.494 "data_size": 0 00:29:30.494 } 00:29:30.494 ] 00:29:30.494 }' 00:29:30.494 22:58:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:30.494 22:58:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:31.060 22:58:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:31.318 [2024-07-15 22:58:16.066492] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:31.318 [2024-07-15 22:58:16.066530] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe04350 name Existed_Raid, state configuring 00:29:31.318 22:58:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:31.578 [2024-07-15 22:58:16.319188] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:31.578 [2024-07-15 22:58:16.320653] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:31.578 [2024-07-15 22:58:16.320683] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:31.578 22:58:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:29:31.578 22:58:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:31.578 22:58:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:31.578 22:58:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:31.578 22:58:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:31.578 22:58:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:31.578 22:58:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:31.578 22:58:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:31.578 22:58:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:31.578 22:58:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:31.578 22:58:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:31.578 22:58:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:31.578 22:58:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:31.578 22:58:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:31.836 22:58:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:31.836 "name": "Existed_Raid", 00:29:31.836 "uuid": "877e5376-4a8d-47c7-865c-cb276f08504e", 00:29:31.836 "strip_size_kb": 0, 00:29:31.836 "state": "configuring", 00:29:31.836 "raid_level": "raid1", 00:29:31.836 "superblock": true, 00:29:31.836 "num_base_bdevs": 2, 
00:29:31.836 "num_base_bdevs_discovered": 1, 00:29:31.836 "num_base_bdevs_operational": 2, 00:29:31.836 "base_bdevs_list": [ 00:29:31.836 { 00:29:31.836 "name": "BaseBdev1", 00:29:31.836 "uuid": "81ebc727-dc44-4792-9fde-1f28435823d5", 00:29:31.836 "is_configured": true, 00:29:31.836 "data_offset": 256, 00:29:31.836 "data_size": 7936 00:29:31.836 }, 00:29:31.836 { 00:29:31.836 "name": "BaseBdev2", 00:29:31.836 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:31.836 "is_configured": false, 00:29:31.836 "data_offset": 0, 00:29:31.836 "data_size": 0 00:29:31.836 } 00:29:31.836 ] 00:29:31.836 }' 00:29:31.836 22:58:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:31.836 22:58:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:32.403 22:58:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:29:32.702 [2024-07-15 22:58:17.429748] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:32.702 [2024-07-15 22:58:17.429880] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe06180 00:29:32.702 [2024-07-15 22:58:17.429893] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:32.702 [2024-07-15 22:58:17.429959] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe06150 00:29:32.702 [2024-07-15 22:58:17.430034] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe06180 00:29:32.702 [2024-07-15 22:58:17.430044] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xe06180 00:29:32.702 [2024-07-15 22:58:17.430098] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:32.702 BaseBdev2 
00:29:32.702 22:58:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:29:32.702 22:58:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:29:32.702 22:58:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:32.702 22:58:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:29:32.702 22:58:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:32.702 22:58:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:32.702 22:58:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:32.963 22:58:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:29:33.221 [ 00:29:33.221 { 00:29:33.221 "name": "BaseBdev2", 00:29:33.221 "aliases": [ 00:29:33.221 "0fba9176-376f-4e2d-ae1a-8fcfbc595a5f" 00:29:33.221 ], 00:29:33.221 "product_name": "Malloc disk", 00:29:33.221 "block_size": 4128, 00:29:33.221 "num_blocks": 8192, 00:29:33.221 "uuid": "0fba9176-376f-4e2d-ae1a-8fcfbc595a5f", 00:29:33.221 "md_size": 32, 00:29:33.221 "md_interleave": true, 00:29:33.221 "dif_type": 0, 00:29:33.221 "assigned_rate_limits": { 00:29:33.221 "rw_ios_per_sec": 0, 00:29:33.221 "rw_mbytes_per_sec": 0, 00:29:33.221 "r_mbytes_per_sec": 0, 00:29:33.221 "w_mbytes_per_sec": 0 00:29:33.221 }, 00:29:33.221 "claimed": true, 00:29:33.221 "claim_type": "exclusive_write", 00:29:33.221 "zoned": false, 00:29:33.221 "supported_io_types": { 
00:29:33.221 "read": true, 00:29:33.221 "write": true, 00:29:33.221 "unmap": true, 00:29:33.221 "flush": true, 00:29:33.221 "reset": true, 00:29:33.221 "nvme_admin": false, 00:29:33.221 "nvme_io": false, 00:29:33.221 "nvme_io_md": false, 00:29:33.221 "write_zeroes": true, 00:29:33.221 "zcopy": true, 00:29:33.221 "get_zone_info": false, 00:29:33.221 "zone_management": false, 00:29:33.221 "zone_append": false, 00:29:33.221 "compare": false, 00:29:33.221 "compare_and_write": false, 00:29:33.221 "abort": true, 00:29:33.221 "seek_hole": false, 00:29:33.221 "seek_data": false, 00:29:33.221 "copy": true, 00:29:33.221 "nvme_iov_md": false 00:29:33.221 }, 00:29:33.221 "memory_domains": [ 00:29:33.221 { 00:29:33.221 "dma_device_id": "system", 00:29:33.221 "dma_device_type": 1 00:29:33.221 }, 00:29:33.221 { 00:29:33.221 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:33.221 "dma_device_type": 2 00:29:33.221 } 00:29:33.221 ], 00:29:33.221 "driver_specific": {} 00:29:33.221 } 00:29:33.221 ] 00:29:33.221 22:58:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:29:33.221 22:58:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:29:33.221 22:58:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:33.221 22:58:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:29:33.221 22:58:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:33.221 22:58:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:33.221 22:58:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:33.221 22:58:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:33.221 22:58:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:33.221 22:58:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:33.221 22:58:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:33.221 22:58:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:33.221 22:58:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:33.221 22:58:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:33.221 22:58:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:33.480 22:58:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:33.480 "name": "Existed_Raid", 00:29:33.480 "uuid": "877e5376-4a8d-47c7-865c-cb276f08504e", 00:29:33.480 "strip_size_kb": 0, 00:29:33.480 "state": "online", 00:29:33.480 "raid_level": "raid1", 00:29:33.480 "superblock": true, 00:29:33.480 "num_base_bdevs": 2, 00:29:33.480 "num_base_bdevs_discovered": 2, 00:29:33.480 "num_base_bdevs_operational": 2, 00:29:33.480 "base_bdevs_list": [ 00:29:33.480 { 00:29:33.480 "name": "BaseBdev1", 00:29:33.480 "uuid": "81ebc727-dc44-4792-9fde-1f28435823d5", 00:29:33.480 "is_configured": true, 00:29:33.480 "data_offset": 256, 00:29:33.480 "data_size": 7936 00:29:33.480 }, 00:29:33.480 { 00:29:33.480 "name": "BaseBdev2", 00:29:33.480 "uuid": "0fba9176-376f-4e2d-ae1a-8fcfbc595a5f", 00:29:33.480 "is_configured": true, 00:29:33.480 "data_offset": 256, 00:29:33.480 
"data_size": 7936 00:29:33.480 } 00:29:33.480 ] 00:29:33.480 }' 00:29:33.480 22:58:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:33.480 22:58:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:34.046 22:58:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:29:34.046 22:58:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:29:34.046 22:58:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:34.046 22:58:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:34.046 22:58:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:34.046 22:58:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:29:34.046 22:58:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:34.046 22:58:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:29:34.304 [2024-07-15 22:58:19.022271] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:34.304 22:58:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:34.304 "name": "Existed_Raid", 00:29:34.304 "aliases": [ 00:29:34.304 "877e5376-4a8d-47c7-865c-cb276f08504e" 00:29:34.304 ], 00:29:34.304 "product_name": "Raid Volume", 00:29:34.304 "block_size": 4128, 00:29:34.304 "num_blocks": 7936, 00:29:34.304 "uuid": "877e5376-4a8d-47c7-865c-cb276f08504e", 00:29:34.304 "md_size": 32, 
00:29:34.304 "md_interleave": true, 00:29:34.304 "dif_type": 0, 00:29:34.304 "assigned_rate_limits": { 00:29:34.304 "rw_ios_per_sec": 0, 00:29:34.304 "rw_mbytes_per_sec": 0, 00:29:34.304 "r_mbytes_per_sec": 0, 00:29:34.304 "w_mbytes_per_sec": 0 00:29:34.304 }, 00:29:34.304 "claimed": false, 00:29:34.304 "zoned": false, 00:29:34.304 "supported_io_types": { 00:29:34.304 "read": true, 00:29:34.304 "write": true, 00:29:34.304 "unmap": false, 00:29:34.304 "flush": false, 00:29:34.304 "reset": true, 00:29:34.304 "nvme_admin": false, 00:29:34.304 "nvme_io": false, 00:29:34.304 "nvme_io_md": false, 00:29:34.304 "write_zeroes": true, 00:29:34.304 "zcopy": false, 00:29:34.304 "get_zone_info": false, 00:29:34.304 "zone_management": false, 00:29:34.304 "zone_append": false, 00:29:34.304 "compare": false, 00:29:34.304 "compare_and_write": false, 00:29:34.304 "abort": false, 00:29:34.304 "seek_hole": false, 00:29:34.304 "seek_data": false, 00:29:34.304 "copy": false, 00:29:34.304 "nvme_iov_md": false 00:29:34.304 }, 00:29:34.304 "memory_domains": [ 00:29:34.304 { 00:29:34.304 "dma_device_id": "system", 00:29:34.304 "dma_device_type": 1 00:29:34.304 }, 00:29:34.304 { 00:29:34.304 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:34.304 "dma_device_type": 2 00:29:34.304 }, 00:29:34.304 { 00:29:34.304 "dma_device_id": "system", 00:29:34.304 "dma_device_type": 1 00:29:34.304 }, 00:29:34.304 { 00:29:34.304 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:34.304 "dma_device_type": 2 00:29:34.304 } 00:29:34.304 ], 00:29:34.304 "driver_specific": { 00:29:34.304 "raid": { 00:29:34.304 "uuid": "877e5376-4a8d-47c7-865c-cb276f08504e", 00:29:34.304 "strip_size_kb": 0, 00:29:34.304 "state": "online", 00:29:34.304 "raid_level": "raid1", 00:29:34.304 "superblock": true, 00:29:34.304 "num_base_bdevs": 2, 00:29:34.304 "num_base_bdevs_discovered": 2, 00:29:34.304 "num_base_bdevs_operational": 2, 00:29:34.304 "base_bdevs_list": [ 00:29:34.304 { 00:29:34.304 "name": "BaseBdev1", 00:29:34.304 "uuid": 
"81ebc727-dc44-4792-9fde-1f28435823d5", 00:29:34.304 "is_configured": true, 00:29:34.304 "data_offset": 256, 00:29:34.304 "data_size": 7936 00:29:34.304 }, 00:29:34.304 { 00:29:34.304 "name": "BaseBdev2", 00:29:34.304 "uuid": "0fba9176-376f-4e2d-ae1a-8fcfbc595a5f", 00:29:34.304 "is_configured": true, 00:29:34.304 "data_offset": 256, 00:29:34.304 "data_size": 7936 00:29:34.304 } 00:29:34.304 ] 00:29:34.304 } 00:29:34.304 } 00:29:34.304 }' 00:29:34.304 22:58:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:34.304 22:58:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:29:34.304 BaseBdev2' 00:29:34.304 22:58:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:34.304 22:58:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:29:34.304 22:58:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:34.562 22:58:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:34.562 "name": "BaseBdev1", 00:29:34.562 "aliases": [ 00:29:34.562 "81ebc727-dc44-4792-9fde-1f28435823d5" 00:29:34.562 ], 00:29:34.562 "product_name": "Malloc disk", 00:29:34.562 "block_size": 4128, 00:29:34.562 "num_blocks": 8192, 00:29:34.562 "uuid": "81ebc727-dc44-4792-9fde-1f28435823d5", 00:29:34.562 "md_size": 32, 00:29:34.562 "md_interleave": true, 00:29:34.562 "dif_type": 0, 00:29:34.562 "assigned_rate_limits": { 00:29:34.562 "rw_ios_per_sec": 0, 00:29:34.562 "rw_mbytes_per_sec": 0, 00:29:34.562 "r_mbytes_per_sec": 0, 00:29:34.562 "w_mbytes_per_sec": 0 00:29:34.562 }, 00:29:34.562 "claimed": 
true, 00:29:34.562 "claim_type": "exclusive_write", 00:29:34.563 "zoned": false, 00:29:34.563 "supported_io_types": { 00:29:34.563 "read": true, 00:29:34.563 "write": true, 00:29:34.563 "unmap": true, 00:29:34.563 "flush": true, 00:29:34.563 "reset": true, 00:29:34.563 "nvme_admin": false, 00:29:34.563 "nvme_io": false, 00:29:34.563 "nvme_io_md": false, 00:29:34.563 "write_zeroes": true, 00:29:34.563 "zcopy": true, 00:29:34.563 "get_zone_info": false, 00:29:34.563 "zone_management": false, 00:29:34.563 "zone_append": false, 00:29:34.563 "compare": false, 00:29:34.563 "compare_and_write": false, 00:29:34.563 "abort": true, 00:29:34.563 "seek_hole": false, 00:29:34.563 "seek_data": false, 00:29:34.563 "copy": true, 00:29:34.563 "nvme_iov_md": false 00:29:34.563 }, 00:29:34.563 "memory_domains": [ 00:29:34.563 { 00:29:34.563 "dma_device_id": "system", 00:29:34.563 "dma_device_type": 1 00:29:34.563 }, 00:29:34.563 { 00:29:34.563 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:34.563 "dma_device_type": 2 00:29:34.563 } 00:29:34.563 ], 00:29:34.563 "driver_specific": {} 00:29:34.563 }' 00:29:34.563 22:58:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:34.563 22:58:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:34.563 22:58:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:34.563 22:58:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:34.821 22:58:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:34.821 22:58:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:34.821 22:58:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:34.821 22:58:19 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:34.821 22:58:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:34.821 22:58:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:34.821 22:58:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:34.821 22:58:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:34.821 22:58:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:34.821 22:58:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:34.821 22:58:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:29:35.079 22:58:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:35.079 "name": "BaseBdev2", 00:29:35.079 "aliases": [ 00:29:35.079 "0fba9176-376f-4e2d-ae1a-8fcfbc595a5f" 00:29:35.079 ], 00:29:35.079 "product_name": "Malloc disk", 00:29:35.079 "block_size": 4128, 00:29:35.079 "num_blocks": 8192, 00:29:35.079 "uuid": "0fba9176-376f-4e2d-ae1a-8fcfbc595a5f", 00:29:35.079 "md_size": 32, 00:29:35.079 "md_interleave": true, 00:29:35.079 "dif_type": 0, 00:29:35.079 "assigned_rate_limits": { 00:29:35.079 "rw_ios_per_sec": 0, 00:29:35.079 "rw_mbytes_per_sec": 0, 00:29:35.079 "r_mbytes_per_sec": 0, 00:29:35.079 "w_mbytes_per_sec": 0 00:29:35.079 }, 00:29:35.079 "claimed": true, 00:29:35.079 "claim_type": "exclusive_write", 00:29:35.079 "zoned": false, 00:29:35.079 "supported_io_types": { 00:29:35.079 "read": true, 00:29:35.079 "write": true, 00:29:35.079 "unmap": true, 00:29:35.079 
"flush": true, 00:29:35.079 "reset": true, 00:29:35.079 "nvme_admin": false, 00:29:35.079 "nvme_io": false, 00:29:35.079 "nvme_io_md": false, 00:29:35.079 "write_zeroes": true, 00:29:35.079 "zcopy": true, 00:29:35.079 "get_zone_info": false, 00:29:35.079 "zone_management": false, 00:29:35.079 "zone_append": false, 00:29:35.079 "compare": false, 00:29:35.079 "compare_and_write": false, 00:29:35.079 "abort": true, 00:29:35.079 "seek_hole": false, 00:29:35.079 "seek_data": false, 00:29:35.079 "copy": true, 00:29:35.079 "nvme_iov_md": false 00:29:35.079 }, 00:29:35.079 "memory_domains": [ 00:29:35.079 { 00:29:35.079 "dma_device_id": "system", 00:29:35.079 "dma_device_type": 1 00:29:35.079 }, 00:29:35.079 { 00:29:35.079 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:35.079 "dma_device_type": 2 00:29:35.079 } 00:29:35.079 ], 00:29:35.079 "driver_specific": {} 00:29:35.079 }' 00:29:35.079 22:58:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:35.336 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:35.336 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:35.336 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:35.336 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:35.336 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:35.336 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:35.336 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:35.336 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:35.336 22:58:20 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:35.594 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:35.594 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:35.594 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:29:35.594 [2024-07-15 22:58:20.465882] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:35.594 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:29:35.594 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:29:35.594 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:35.594 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:29:35.594 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:29:35.594 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:29:35.594 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:35.594 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:35.594 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:35.594 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:35.594 22:58:20 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:35.594 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:35.594 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:35.594 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:35.594 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:35.594 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:35.594 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:35.851 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:35.851 "name": "Existed_Raid", 00:29:35.851 "uuid": "877e5376-4a8d-47c7-865c-cb276f08504e", 00:29:35.851 "strip_size_kb": 0, 00:29:35.851 "state": "online", 00:29:35.851 "raid_level": "raid1", 00:29:35.851 "superblock": true, 00:29:35.851 "num_base_bdevs": 2, 00:29:35.851 "num_base_bdevs_discovered": 1, 00:29:35.851 "num_base_bdevs_operational": 1, 00:29:35.851 "base_bdevs_list": [ 00:29:35.851 { 00:29:35.851 "name": null, 00:29:35.851 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:35.851 "is_configured": false, 00:29:35.851 "data_offset": 256, 00:29:35.851 "data_size": 7936 00:29:35.851 }, 00:29:35.851 { 00:29:35.851 "name": "BaseBdev2", 00:29:35.851 "uuid": "0fba9176-376f-4e2d-ae1a-8fcfbc595a5f", 00:29:35.851 "is_configured": true, 00:29:35.851 "data_offset": 256, 00:29:35.851 "data_size": 7936 00:29:35.851 } 00:29:35.851 ] 00:29:35.851 }' 00:29:35.851 
22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:35.851 22:58:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:36.784 22:58:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:29:36.784 22:58:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:36.784 22:58:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:36.784 22:58:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:29:37.042 22:58:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:29:37.042 22:58:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:29:37.042 22:58:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:29:37.042 [2024-07-15 22:58:21.918806] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:29:37.042 [2024-07-15 22:58:21.918890] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:37.042 [2024-07-15 22:58:21.930165] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:37.042 [2024-07-15 22:58:21.930197] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:37.042 [2024-07-15 22:58:21.930208] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe06180 name Existed_Raid, state offline 00:29:37.300 22:58:21 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:29:37.300 22:58:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:37.300 22:58:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:37.300 22:58:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:29:37.559 22:58:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:29:37.559 22:58:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:29:37.559 22:58:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:29:37.559 22:58:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 2858865 00:29:37.559 22:58:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 2858865 ']' 00:29:37.559 22:58:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 2858865 00:29:37.559 22:58:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:29:37.559 22:58:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:37.559 22:58:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2858865 00:29:37.559 22:58:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:37.559 22:58:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 
00:29:37.559 22:58:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2858865' 00:29:37.559 killing process with pid 2858865 00:29:37.559 22:58:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 2858865 00:29:37.559 [2024-07-15 22:58:22.258944] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:37.559 22:58:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 2858865 00:29:37.559 [2024-07-15 22:58:22.259816] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:37.818 22:58:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:29:37.818 00:29:37.818 real 0m10.826s 00:29:37.818 user 0m19.213s 00:29:37.818 sys 0m2.033s 00:29:37.818 22:58:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:37.818 22:58:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:37.818 ************************************ 00:29:37.818 END TEST raid_state_function_test_sb_md_interleaved 00:29:37.818 ************************************ 00:29:37.818 22:58:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:37.818 22:58:22 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:29:37.818 22:58:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:29:37.818 22:58:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:37.818 22:58:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:37.818 ************************************ 00:29:37.818 START TEST raid_superblock_test_md_interleaved 00:29:37.818 ************************************ 00:29:37.818 22:58:22 bdev_raid.raid_superblock_test_md_interleaved -- 
common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:29:37.818 22:58:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:29:37.818 22:58:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:29:37.818 22:58:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:29:37.818 22:58:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:29:37.818 22:58:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:29:37.818 22:58:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:29:37.818 22:58:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:29:37.818 22:58:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:29:37.818 22:58:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:29:37.818 22:58:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:29:37.818 22:58:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:29:37.818 22:58:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:29:37.818 22:58:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:29:37.818 22:58:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:29:37.818 22:58:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:29:37.818 22:58:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # raid_pid=2860423 00:29:37.818 22:58:22 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 2860423 /var/tmp/spdk-raid.sock 00:29:37.818 22:58:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:29:37.818 22:58:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 2860423 ']' 00:29:37.818 22:58:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:37.818 22:58:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:37.818 22:58:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:37.818 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:37.818 22:58:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:37.818 22:58:22 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:37.818 [2024-07-15 22:58:22.632641] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:29:37.818 [2024-07-15 22:58:22.632715] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2860423 ] 00:29:38.077 [2024-07-15 22:58:22.763074] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:38.077 [2024-07-15 22:58:22.867656] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:38.077 [2024-07-15 22:58:22.927712] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:38.077 [2024-07-15 22:58:22.927747] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:39.014 22:58:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:39.014 22:58:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:29:39.014 22:58:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:29:39.014 22:58:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:39.014 22:58:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:29:39.014 22:58:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:29:39.014 22:58:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:29:39.014 22:58:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:29:39.014 22:58:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:29:39.014 22:58:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 
00:29:39.014 22:58:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:29:39.014 malloc1 00:29:39.014 22:58:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:39.273 [2024-07-15 22:58:24.049164] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:39.273 [2024-07-15 22:58:24.049215] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:39.273 [2024-07-15 22:58:24.049236] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb184e0 00:29:39.273 [2024-07-15 22:58:24.049248] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:39.273 [2024-07-15 22:58:24.050684] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:39.273 [2024-07-15 22:58:24.050717] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:39.273 pt1 00:29:39.273 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:29:39.273 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:39.273 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:29:39.273 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:29:39.273 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:29:39.273 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:29:39.273 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:29:39.273 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:29:39.273 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:29:39.532 malloc2 00:29:39.532 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:39.792 [2024-07-15 22:58:24.547362] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:39.792 [2024-07-15 22:58:24.547405] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:39.792 [2024-07-15 22:58:24.547422] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xafd570 00:29:39.792 [2024-07-15 22:58:24.547434] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:39.792 [2024-07-15 22:58:24.548782] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:39.792 [2024-07-15 22:58:24.548807] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:39.792 pt2 00:29:39.792 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:29:39.792 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:39.792 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 
'pt1 pt2' -n raid_bdev1 -s 00:29:40.051 [2024-07-15 22:58:24.796030] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:40.051 [2024-07-15 22:58:24.797337] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:40.051 [2024-07-15 22:58:24.797480] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xafef20 00:29:40.051 [2024-07-15 22:58:24.797493] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:40.051 [2024-07-15 22:58:24.797550] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x97b050 00:29:40.051 [2024-07-15 22:58:24.797634] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xafef20 00:29:40.051 [2024-07-15 22:58:24.797644] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xafef20 00:29:40.051 [2024-07-15 22:58:24.797697] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:40.051 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:40.051 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:40.051 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:40.051 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:40.051 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:40.051 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:40.051 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:40.051 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:40.051 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:40.051 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:40.051 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:40.051 22:58:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:40.320 22:58:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:40.320 "name": "raid_bdev1", 00:29:40.320 "uuid": "8247a2fb-b04c-48e1-b4c5-ab445cc4c6c2", 00:29:40.320 "strip_size_kb": 0, 00:29:40.320 "state": "online", 00:29:40.320 "raid_level": "raid1", 00:29:40.320 "superblock": true, 00:29:40.320 "num_base_bdevs": 2, 00:29:40.320 "num_base_bdevs_discovered": 2, 00:29:40.320 "num_base_bdevs_operational": 2, 00:29:40.320 "base_bdevs_list": [ 00:29:40.320 { 00:29:40.320 "name": "pt1", 00:29:40.320 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:40.320 "is_configured": true, 00:29:40.320 "data_offset": 256, 00:29:40.320 "data_size": 7936 00:29:40.320 }, 00:29:40.320 { 00:29:40.320 "name": "pt2", 00:29:40.320 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:40.320 "is_configured": true, 00:29:40.320 "data_offset": 256, 00:29:40.320 "data_size": 7936 00:29:40.320 } 00:29:40.320 ] 00:29:40.320 }' 00:29:40.320 22:58:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:40.320 22:58:25 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:40.886 22:58:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:29:40.886 22:58:25 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:29:40.886 22:58:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:40.886 22:58:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:40.886 22:58:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:40.886 22:58:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:29:40.886 22:58:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:40.886 22:58:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:41.145 [2024-07-15 22:58:25.907236] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:41.145 22:58:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:41.145 "name": "raid_bdev1", 00:29:41.145 "aliases": [ 00:29:41.145 "8247a2fb-b04c-48e1-b4c5-ab445cc4c6c2" 00:29:41.145 ], 00:29:41.145 "product_name": "Raid Volume", 00:29:41.145 "block_size": 4128, 00:29:41.145 "num_blocks": 7936, 00:29:41.145 "uuid": "8247a2fb-b04c-48e1-b4c5-ab445cc4c6c2", 00:29:41.145 "md_size": 32, 00:29:41.145 "md_interleave": true, 00:29:41.145 "dif_type": 0, 00:29:41.145 "assigned_rate_limits": { 00:29:41.145 "rw_ios_per_sec": 0, 00:29:41.145 "rw_mbytes_per_sec": 0, 00:29:41.145 "r_mbytes_per_sec": 0, 00:29:41.145 "w_mbytes_per_sec": 0 00:29:41.145 }, 00:29:41.145 "claimed": false, 00:29:41.145 "zoned": false, 00:29:41.145 "supported_io_types": { 00:29:41.145 "read": true, 00:29:41.145 "write": true, 00:29:41.145 "unmap": false, 00:29:41.145 "flush": false, 00:29:41.145 "reset": true, 00:29:41.145 "nvme_admin": false, 
00:29:41.145 "nvme_io": false, 00:29:41.145 "nvme_io_md": false, 00:29:41.145 "write_zeroes": true, 00:29:41.145 "zcopy": false, 00:29:41.145 "get_zone_info": false, 00:29:41.145 "zone_management": false, 00:29:41.145 "zone_append": false, 00:29:41.145 "compare": false, 00:29:41.145 "compare_and_write": false, 00:29:41.145 "abort": false, 00:29:41.145 "seek_hole": false, 00:29:41.145 "seek_data": false, 00:29:41.145 "copy": false, 00:29:41.145 "nvme_iov_md": false 00:29:41.145 }, 00:29:41.145 "memory_domains": [ 00:29:41.145 { 00:29:41.145 "dma_device_id": "system", 00:29:41.145 "dma_device_type": 1 00:29:41.145 }, 00:29:41.145 { 00:29:41.145 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:41.145 "dma_device_type": 2 00:29:41.145 }, 00:29:41.145 { 00:29:41.145 "dma_device_id": "system", 00:29:41.145 "dma_device_type": 1 00:29:41.145 }, 00:29:41.145 { 00:29:41.145 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:41.145 "dma_device_type": 2 00:29:41.145 } 00:29:41.145 ], 00:29:41.145 "driver_specific": { 00:29:41.145 "raid": { 00:29:41.145 "uuid": "8247a2fb-b04c-48e1-b4c5-ab445cc4c6c2", 00:29:41.145 "strip_size_kb": 0, 00:29:41.145 "state": "online", 00:29:41.145 "raid_level": "raid1", 00:29:41.145 "superblock": true, 00:29:41.145 "num_base_bdevs": 2, 00:29:41.145 "num_base_bdevs_discovered": 2, 00:29:41.145 "num_base_bdevs_operational": 2, 00:29:41.145 "base_bdevs_list": [ 00:29:41.145 { 00:29:41.145 "name": "pt1", 00:29:41.145 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:41.145 "is_configured": true, 00:29:41.145 "data_offset": 256, 00:29:41.145 "data_size": 7936 00:29:41.145 }, 00:29:41.145 { 00:29:41.145 "name": "pt2", 00:29:41.145 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:41.145 "is_configured": true, 00:29:41.145 "data_offset": 256, 00:29:41.145 "data_size": 7936 00:29:41.145 } 00:29:41.145 ] 00:29:41.145 } 00:29:41.145 } 00:29:41.145 }' 00:29:41.145 22:58:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:41.145 22:58:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:29:41.145 pt2' 00:29:41.145 22:58:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:41.145 22:58:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:29:41.145 22:58:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:41.404 22:58:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:41.404 "name": "pt1", 00:29:41.404 "aliases": [ 00:29:41.404 "00000000-0000-0000-0000-000000000001" 00:29:41.404 ], 00:29:41.404 "product_name": "passthru", 00:29:41.404 "block_size": 4128, 00:29:41.404 "num_blocks": 8192, 00:29:41.404 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:41.404 "md_size": 32, 00:29:41.404 "md_interleave": true, 00:29:41.404 "dif_type": 0, 00:29:41.404 "assigned_rate_limits": { 00:29:41.404 "rw_ios_per_sec": 0, 00:29:41.404 "rw_mbytes_per_sec": 0, 00:29:41.404 "r_mbytes_per_sec": 0, 00:29:41.404 "w_mbytes_per_sec": 0 00:29:41.404 }, 00:29:41.404 "claimed": true, 00:29:41.404 "claim_type": "exclusive_write", 00:29:41.404 "zoned": false, 00:29:41.404 "supported_io_types": { 00:29:41.404 "read": true, 00:29:41.404 "write": true, 00:29:41.404 "unmap": true, 00:29:41.404 "flush": true, 00:29:41.404 "reset": true, 00:29:41.404 "nvme_admin": false, 00:29:41.404 "nvme_io": false, 00:29:41.404 "nvme_io_md": false, 00:29:41.404 "write_zeroes": true, 00:29:41.404 "zcopy": true, 00:29:41.404 "get_zone_info": false, 00:29:41.404 "zone_management": false, 00:29:41.404 "zone_append": false, 00:29:41.404 "compare": false, 00:29:41.404 "compare_and_write": false, 00:29:41.404 
"abort": true, 00:29:41.404 "seek_hole": false, 00:29:41.404 "seek_data": false, 00:29:41.404 "copy": true, 00:29:41.404 "nvme_iov_md": false 00:29:41.404 }, 00:29:41.404 "memory_domains": [ 00:29:41.404 { 00:29:41.404 "dma_device_id": "system", 00:29:41.404 "dma_device_type": 1 00:29:41.404 }, 00:29:41.404 { 00:29:41.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:41.404 "dma_device_type": 2 00:29:41.404 } 00:29:41.404 ], 00:29:41.404 "driver_specific": { 00:29:41.404 "passthru": { 00:29:41.404 "name": "pt1", 00:29:41.404 "base_bdev_name": "malloc1" 00:29:41.404 } 00:29:41.404 } 00:29:41.404 }' 00:29:41.404 22:58:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:41.404 22:58:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:41.404 22:58:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:41.404 22:58:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:41.664 22:58:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:41.664 22:58:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:41.664 22:58:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:41.664 22:58:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:41.664 22:58:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:41.664 22:58:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:41.664 22:58:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:41.923 22:58:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:41.923 22:58:26 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:41.923 22:58:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:29:41.923 22:58:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:41.923 22:58:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:41.923 "name": "pt2", 00:29:41.923 "aliases": [ 00:29:41.923 "00000000-0000-0000-0000-000000000002" 00:29:41.923 ], 00:29:41.923 "product_name": "passthru", 00:29:41.923 "block_size": 4128, 00:29:41.923 "num_blocks": 8192, 00:29:41.923 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:41.923 "md_size": 32, 00:29:41.923 "md_interleave": true, 00:29:41.923 "dif_type": 0, 00:29:41.923 "assigned_rate_limits": { 00:29:41.923 "rw_ios_per_sec": 0, 00:29:41.923 "rw_mbytes_per_sec": 0, 00:29:41.923 "r_mbytes_per_sec": 0, 00:29:41.923 "w_mbytes_per_sec": 0 00:29:41.923 }, 00:29:41.923 "claimed": true, 00:29:41.923 "claim_type": "exclusive_write", 00:29:41.923 "zoned": false, 00:29:41.923 "supported_io_types": { 00:29:41.923 "read": true, 00:29:41.923 "write": true, 00:29:41.923 "unmap": true, 00:29:41.923 "flush": true, 00:29:41.923 "reset": true, 00:29:41.923 "nvme_admin": false, 00:29:41.923 "nvme_io": false, 00:29:41.923 "nvme_io_md": false, 00:29:41.923 "write_zeroes": true, 00:29:41.923 "zcopy": true, 00:29:41.923 "get_zone_info": false, 00:29:41.923 "zone_management": false, 00:29:41.923 "zone_append": false, 00:29:41.923 "compare": false, 00:29:41.923 "compare_and_write": false, 00:29:41.924 "abort": true, 00:29:41.924 "seek_hole": false, 00:29:41.924 "seek_data": false, 00:29:41.924 "copy": true, 00:29:41.924 "nvme_iov_md": false 00:29:41.924 }, 00:29:41.924 "memory_domains": [ 00:29:41.924 { 00:29:41.924 "dma_device_id": 
"system", 00:29:41.924 "dma_device_type": 1 00:29:41.924 }, 00:29:41.924 { 00:29:41.924 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:41.924 "dma_device_type": 2 00:29:41.924 } 00:29:41.924 ], 00:29:41.924 "driver_specific": { 00:29:41.924 "passthru": { 00:29:41.924 "name": "pt2", 00:29:41.924 "base_bdev_name": "malloc2" 00:29:41.924 } 00:29:41.924 } 00:29:41.924 }' 00:29:41.924 22:58:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:42.182 22:58:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:42.182 22:58:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:42.182 22:58:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:42.182 22:58:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:42.182 22:58:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:42.182 22:58:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:42.182 22:58:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:42.442 22:58:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:42.442 22:58:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:42.442 22:58:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:42.442 22:58:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:42.442 22:58:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:29:42.442 22:58:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:42.700 [2024-07-15 22:58:27.403178] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:42.700 22:58:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=8247a2fb-b04c-48e1-b4c5-ab445cc4c6c2 00:29:42.700 22:58:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z 8247a2fb-b04c-48e1-b4c5-ab445cc4c6c2 ']' 00:29:42.700 22:58:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:42.960 [2024-07-15 22:58:27.651592] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:42.960 [2024-07-15 22:58:27.651619] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:42.960 [2024-07-15 22:58:27.651675] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:42.960 [2024-07-15 22:58:27.651731] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:42.960 [2024-07-15 22:58:27.651742] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xafef20 name raid_bdev1, state offline 00:29:42.960 22:58:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:29:42.960 22:58:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:43.219 22:58:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:29:43.219 22:58:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:29:43.219 22:58:27 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:29:43.219 22:58:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:29:43.479 22:58:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:29:43.479 22:58:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:29:43.738 22:58:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:29:43.738 22:58:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:29:43.997 22:58:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:29:43.997 22:58:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:43.997 22:58:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:29:43.997 22:58:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:43.997 22:58:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:43.997 22:58:28 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:43.997 22:58:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:43.997 22:58:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:43.997 22:58:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:43.997 22:58:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:43.997 22:58:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:43.997 22:58:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:43.997 22:58:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:43.997 [2024-07-15 22:58:28.886819] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:29:43.997 [2024-07-15 22:58:28.888236] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:29:43.997 [2024-07-15 22:58:28.888295] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:29:43.997 [2024-07-15 22:58:28.888337] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:29:43.997 [2024-07-15 22:58:28.888356] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:43.997 [2024-07-15 22:58:28.888366] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb09260 name raid_bdev1, state configuring 00:29:43.997 request: 00:29:43.997 { 00:29:43.997 "name": "raid_bdev1", 00:29:43.997 "raid_level": "raid1", 00:29:43.997 "base_bdevs": [ 00:29:43.997 "malloc1", 00:29:43.997 "malloc2" 00:29:43.997 ], 00:29:43.997 "superblock": false, 00:29:43.997 "method": "bdev_raid_create", 00:29:43.997 "req_id": 1 00:29:43.997 } 00:29:43.997 Got JSON-RPC error response 00:29:43.997 response: 00:29:43.997 { 00:29:43.997 "code": -17, 00:29:43.997 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:29:43.997 } 00:29:44.256 22:58:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:29:44.256 22:58:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:29:44.256 22:58:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:29:44.256 22:58:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:29:44.256 22:58:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:44.256 22:58:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:29:44.256 22:58:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:29:44.256 22:58:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:29:44.256 22:58:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 
00000000-0000-0000-0000-000000000001 00:29:44.515 [2024-07-15 22:58:29.380060] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:44.515 [2024-07-15 22:58:29.380100] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:44.515 [2024-07-15 22:58:29.380116] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb00000 00:29:44.515 [2024-07-15 22:58:29.380128] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:44.515 [2024-07-15 22:58:29.381566] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:44.515 [2024-07-15 22:58:29.381594] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:44.515 [2024-07-15 22:58:29.381639] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:29:44.515 [2024-07-15 22:58:29.381669] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:44.515 pt1 00:29:44.515 22:58:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:29:44.515 22:58:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:44.515 22:58:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:44.515 22:58:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:44.515 22:58:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:44.515 22:58:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:44.515 22:58:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:44.515 22:58:29 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:44.515 22:58:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:44.515 22:58:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:44.515 22:58:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:44.515 22:58:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:44.773 22:58:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:44.773 "name": "raid_bdev1", 00:29:44.773 "uuid": "8247a2fb-b04c-48e1-b4c5-ab445cc4c6c2", 00:29:44.773 "strip_size_kb": 0, 00:29:44.773 "state": "configuring", 00:29:44.773 "raid_level": "raid1", 00:29:44.773 "superblock": true, 00:29:44.773 "num_base_bdevs": 2, 00:29:44.773 "num_base_bdevs_discovered": 1, 00:29:44.773 "num_base_bdevs_operational": 2, 00:29:44.773 "base_bdevs_list": [ 00:29:44.773 { 00:29:44.773 "name": "pt1", 00:29:44.773 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:44.773 "is_configured": true, 00:29:44.773 "data_offset": 256, 00:29:44.773 "data_size": 7936 00:29:44.773 }, 00:29:44.773 { 00:29:44.773 "name": null, 00:29:44.773 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:44.773 "is_configured": false, 00:29:44.773 "data_offset": 256, 00:29:44.773 "data_size": 7936 00:29:44.773 } 00:29:44.773 ] 00:29:44.773 }' 00:29:44.773 22:58:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:44.773 22:58:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:45.705 22:58:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:29:45.705 22:58:30 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:29:45.705 22:58:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:29:45.705 22:58:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:45.705 [2024-07-15 22:58:30.495062] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:45.705 [2024-07-15 22:58:30.495120] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:45.705 [2024-07-15 22:58:30.495141] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb02270 00:29:45.706 [2024-07-15 22:58:30.495154] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:45.706 [2024-07-15 22:58:30.495347] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:45.706 [2024-07-15 22:58:30.495363] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:45.706 [2024-07-15 22:58:30.495414] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:29:45.706 [2024-07-15 22:58:30.495434] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:45.706 [2024-07-15 22:58:30.495523] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x97bc10 00:29:45.706 [2024-07-15 22:58:30.495534] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:45.706 [2024-07-15 22:58:30.495593] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xafdd40 00:29:45.706 [2024-07-15 22:58:30.495667] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x97bc10 00:29:45.706 [2024-07-15 22:58:30.495676] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x97bc10 00:29:45.706 [2024-07-15 22:58:30.495735] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:45.706 pt2 00:29:45.706 22:58:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:29:45.706 22:58:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:29:45.706 22:58:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:45.706 22:58:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:45.706 22:58:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:45.706 22:58:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:45.706 22:58:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:45.706 22:58:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:45.706 22:58:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:45.706 22:58:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:45.706 22:58:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:45.706 22:58:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:45.706 22:58:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:45.706 22:58:30 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:45.964 22:58:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:45.964 "name": "raid_bdev1", 00:29:45.964 "uuid": "8247a2fb-b04c-48e1-b4c5-ab445cc4c6c2", 00:29:45.964 "strip_size_kb": 0, 00:29:45.964 "state": "online", 00:29:45.964 "raid_level": "raid1", 00:29:45.964 "superblock": true, 00:29:45.964 "num_base_bdevs": 2, 00:29:45.964 "num_base_bdevs_discovered": 2, 00:29:45.964 "num_base_bdevs_operational": 2, 00:29:45.964 "base_bdevs_list": [ 00:29:45.964 { 00:29:45.964 "name": "pt1", 00:29:45.964 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:45.964 "is_configured": true, 00:29:45.964 "data_offset": 256, 00:29:45.964 "data_size": 7936 00:29:45.964 }, 00:29:45.964 { 00:29:45.964 "name": "pt2", 00:29:45.964 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:45.964 "is_configured": true, 00:29:45.964 "data_offset": 256, 00:29:45.964 "data_size": 7936 00:29:45.964 } 00:29:45.964 ] 00:29:45.964 }' 00:29:45.964 22:58:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:45.964 22:58:30 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:46.529 22:58:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:29:46.529 22:58:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:29:46.529 22:58:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:46.529 22:58:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:46.529 22:58:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:46.529 22:58:31 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:29:46.529 22:58:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:46.529 22:58:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:46.788 [2024-07-15 22:58:31.594228] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:46.788 22:58:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:46.788 "name": "raid_bdev1", 00:29:46.788 "aliases": [ 00:29:46.788 "8247a2fb-b04c-48e1-b4c5-ab445cc4c6c2" 00:29:46.788 ], 00:29:46.788 "product_name": "Raid Volume", 00:29:46.788 "block_size": 4128, 00:29:46.788 "num_blocks": 7936, 00:29:46.788 "uuid": "8247a2fb-b04c-48e1-b4c5-ab445cc4c6c2", 00:29:46.788 "md_size": 32, 00:29:46.788 "md_interleave": true, 00:29:46.788 "dif_type": 0, 00:29:46.788 "assigned_rate_limits": { 00:29:46.788 "rw_ios_per_sec": 0, 00:29:46.788 "rw_mbytes_per_sec": 0, 00:29:46.788 "r_mbytes_per_sec": 0, 00:29:46.788 "w_mbytes_per_sec": 0 00:29:46.788 }, 00:29:46.788 "claimed": false, 00:29:46.788 "zoned": false, 00:29:46.788 "supported_io_types": { 00:29:46.788 "read": true, 00:29:46.788 "write": true, 00:29:46.788 "unmap": false, 00:29:46.788 "flush": false, 00:29:46.788 "reset": true, 00:29:46.788 "nvme_admin": false, 00:29:46.788 "nvme_io": false, 00:29:46.788 "nvme_io_md": false, 00:29:46.788 "write_zeroes": true, 00:29:46.788 "zcopy": false, 00:29:46.788 "get_zone_info": false, 00:29:46.788 "zone_management": false, 00:29:46.788 "zone_append": false, 00:29:46.788 "compare": false, 00:29:46.788 "compare_and_write": false, 00:29:46.788 "abort": false, 00:29:46.788 "seek_hole": false, 00:29:46.788 "seek_data": false, 00:29:46.788 "copy": false, 00:29:46.788 "nvme_iov_md": false 00:29:46.788 }, 
00:29:46.788 "memory_domains": [ 00:29:46.788 { 00:29:46.788 "dma_device_id": "system", 00:29:46.788 "dma_device_type": 1 00:29:46.788 }, 00:29:46.788 { 00:29:46.788 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:46.788 "dma_device_type": 2 00:29:46.788 }, 00:29:46.788 { 00:29:46.788 "dma_device_id": "system", 00:29:46.788 "dma_device_type": 1 00:29:46.788 }, 00:29:46.788 { 00:29:46.788 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:46.788 "dma_device_type": 2 00:29:46.788 } 00:29:46.788 ], 00:29:46.788 "driver_specific": { 00:29:46.788 "raid": { 00:29:46.788 "uuid": "8247a2fb-b04c-48e1-b4c5-ab445cc4c6c2", 00:29:46.788 "strip_size_kb": 0, 00:29:46.788 "state": "online", 00:29:46.788 "raid_level": "raid1", 00:29:46.788 "superblock": true, 00:29:46.788 "num_base_bdevs": 2, 00:29:46.788 "num_base_bdevs_discovered": 2, 00:29:46.788 "num_base_bdevs_operational": 2, 00:29:46.788 "base_bdevs_list": [ 00:29:46.788 { 00:29:46.788 "name": "pt1", 00:29:46.788 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:46.788 "is_configured": true, 00:29:46.788 "data_offset": 256, 00:29:46.788 "data_size": 7936 00:29:46.788 }, 00:29:46.788 { 00:29:46.788 "name": "pt2", 00:29:46.788 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:46.788 "is_configured": true, 00:29:46.788 "data_offset": 256, 00:29:46.788 "data_size": 7936 00:29:46.788 } 00:29:46.788 ] 00:29:46.788 } 00:29:46.788 } 00:29:46.788 }' 00:29:46.788 22:58:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:46.788 22:58:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:29:46.788 pt2' 00:29:46.788 22:58:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:46.788 22:58:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:29:46.788 22:58:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:47.046 22:58:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:47.046 "name": "pt1", 00:29:47.046 "aliases": [ 00:29:47.046 "00000000-0000-0000-0000-000000000001" 00:29:47.046 ], 00:29:47.046 "product_name": "passthru", 00:29:47.046 "block_size": 4128, 00:29:47.046 "num_blocks": 8192, 00:29:47.046 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:47.046 "md_size": 32, 00:29:47.046 "md_interleave": true, 00:29:47.046 "dif_type": 0, 00:29:47.046 "assigned_rate_limits": { 00:29:47.046 "rw_ios_per_sec": 0, 00:29:47.046 "rw_mbytes_per_sec": 0, 00:29:47.046 "r_mbytes_per_sec": 0, 00:29:47.046 "w_mbytes_per_sec": 0 00:29:47.046 }, 00:29:47.046 "claimed": true, 00:29:47.046 "claim_type": "exclusive_write", 00:29:47.046 "zoned": false, 00:29:47.046 "supported_io_types": { 00:29:47.046 "read": true, 00:29:47.046 "write": true, 00:29:47.046 "unmap": true, 00:29:47.046 "flush": true, 00:29:47.046 "reset": true, 00:29:47.046 "nvme_admin": false, 00:29:47.046 "nvme_io": false, 00:29:47.046 "nvme_io_md": false, 00:29:47.046 "write_zeroes": true, 00:29:47.046 "zcopy": true, 00:29:47.046 "get_zone_info": false, 00:29:47.046 "zone_management": false, 00:29:47.046 "zone_append": false, 00:29:47.046 "compare": false, 00:29:47.046 "compare_and_write": false, 00:29:47.046 "abort": true, 00:29:47.046 "seek_hole": false, 00:29:47.046 "seek_data": false, 00:29:47.046 "copy": true, 00:29:47.046 "nvme_iov_md": false 00:29:47.046 }, 00:29:47.046 "memory_domains": [ 00:29:47.046 { 00:29:47.046 "dma_device_id": "system", 00:29:47.046 "dma_device_type": 1 00:29:47.046 }, 00:29:47.046 { 00:29:47.046 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:47.046 "dma_device_type": 2 00:29:47.046 } 00:29:47.046 ], 00:29:47.046 
"driver_specific": { 00:29:47.046 "passthru": { 00:29:47.046 "name": "pt1", 00:29:47.046 "base_bdev_name": "malloc1" 00:29:47.046 } 00:29:47.046 } 00:29:47.046 }' 00:29:47.046 22:58:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:47.046 22:58:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:47.304 22:58:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:47.304 22:58:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:47.304 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:47.304 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:47.304 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:47.304 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:47.304 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:47.304 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:47.562 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:47.562 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:47.562 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:47.562 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:29:47.562 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:47.821 22:58:32 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:47.821 "name": "pt2", 00:29:47.821 "aliases": [ 00:29:47.821 "00000000-0000-0000-0000-000000000002" 00:29:47.821 ], 00:29:47.821 "product_name": "passthru", 00:29:47.821 "block_size": 4128, 00:29:47.821 "num_blocks": 8192, 00:29:47.821 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:47.821 "md_size": 32, 00:29:47.821 "md_interleave": true, 00:29:47.821 "dif_type": 0, 00:29:47.821 "assigned_rate_limits": { 00:29:47.821 "rw_ios_per_sec": 0, 00:29:47.821 "rw_mbytes_per_sec": 0, 00:29:47.821 "r_mbytes_per_sec": 0, 00:29:47.821 "w_mbytes_per_sec": 0 00:29:47.821 }, 00:29:47.821 "claimed": true, 00:29:47.821 "claim_type": "exclusive_write", 00:29:47.821 "zoned": false, 00:29:47.821 "supported_io_types": { 00:29:47.821 "read": true, 00:29:47.821 "write": true, 00:29:47.821 "unmap": true, 00:29:47.821 "flush": true, 00:29:47.821 "reset": true, 00:29:47.821 "nvme_admin": false, 00:29:47.821 "nvme_io": false, 00:29:47.821 "nvme_io_md": false, 00:29:47.821 "write_zeroes": true, 00:29:47.821 "zcopy": true, 00:29:47.821 "get_zone_info": false, 00:29:47.821 "zone_management": false, 00:29:47.821 "zone_append": false, 00:29:47.821 "compare": false, 00:29:47.821 "compare_and_write": false, 00:29:47.821 "abort": true, 00:29:47.821 "seek_hole": false, 00:29:47.821 "seek_data": false, 00:29:47.821 "copy": true, 00:29:47.821 "nvme_iov_md": false 00:29:47.821 }, 00:29:47.821 "memory_domains": [ 00:29:47.821 { 00:29:47.821 "dma_device_id": "system", 00:29:47.821 "dma_device_type": 1 00:29:47.821 }, 00:29:47.821 { 00:29:47.821 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:47.821 "dma_device_type": 2 00:29:47.821 } 00:29:47.821 ], 00:29:47.821 "driver_specific": { 00:29:47.821 "passthru": { 00:29:47.821 "name": "pt2", 00:29:47.821 "base_bdev_name": "malloc2" 00:29:47.821 } 00:29:47.821 } 00:29:47.821 }' 00:29:47.821 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:47.821 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:47.821 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:47.821 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:47.821 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:47.821 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:47.821 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:47.821 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:48.079 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:48.079 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:48.079 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:48.079 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:48.079 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:48.079 22:58:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:29:48.338 [2024-07-15 22:58:33.074141] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:48.338 22:58:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' 8247a2fb-b04c-48e1-b4c5-ab445cc4c6c2 '!=' 8247a2fb-b04c-48e1-b4c5-ab445cc4c6c2 ']' 00:29:48.338 22:58:33 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:29:48.338 22:58:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:48.338 22:58:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:29:48.338 22:58:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:29:48.596 [2024-07-15 22:58:33.326582] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:29:48.596 22:58:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:48.596 22:58:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:48.596 22:58:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:48.596 22:58:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:48.596 22:58:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:48.596 22:58:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:48.596 22:58:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:48.596 22:58:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:48.596 22:58:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:48.596 22:58:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:48.596 22:58:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:48.596 22:58:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:48.854 22:58:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:48.854 "name": "raid_bdev1", 00:29:48.854 "uuid": "8247a2fb-b04c-48e1-b4c5-ab445cc4c6c2", 00:29:48.854 "strip_size_kb": 0, 00:29:48.854 "state": "online", 00:29:48.854 "raid_level": "raid1", 00:29:48.854 "superblock": true, 00:29:48.854 "num_base_bdevs": 2, 00:29:48.854 "num_base_bdevs_discovered": 1, 00:29:48.854 "num_base_bdevs_operational": 1, 00:29:48.854 "base_bdevs_list": [ 00:29:48.854 { 00:29:48.854 "name": null, 00:29:48.854 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:48.854 "is_configured": false, 00:29:48.854 "data_offset": 256, 00:29:48.854 "data_size": 7936 00:29:48.854 }, 00:29:48.854 { 00:29:48.854 "name": "pt2", 00:29:48.854 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:48.854 "is_configured": true, 00:29:48.854 "data_offset": 256, 00:29:48.854 "data_size": 7936 00:29:48.854 } 00:29:48.854 ] 00:29:48.854 }' 00:29:48.854 22:58:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:48.854 22:58:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:49.820 22:58:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:49.820 [2024-07-15 22:58:34.698256] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:49.820 [2024-07-15 22:58:34.698285] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:49.820 [2024-07-15 22:58:34.698340] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:49.820 [2024-07-15 
22:58:34.698386] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:49.820 [2024-07-15 22:58:34.698398] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x97bc10 name raid_bdev1, state offline 00:29:49.820 22:58:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:49.820 22:58:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:29:50.078 22:58:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:29:50.078 22:58:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:29:50.078 22:58:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:29:50.078 22:58:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:29:50.078 22:58:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:29:50.646 22:58:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:29:50.646 22:58:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:29:50.646 22:58:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:29:50.646 22:58:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:29:50.646 22:58:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:29:50.646 22:58:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:50.904 [2024-07-15 22:58:35.708881] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:50.904 [2024-07-15 22:58:35.708931] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:50.904 [2024-07-15 22:58:35.708948] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb009f0 00:29:50.904 [2024-07-15 22:58:35.708961] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:50.904 [2024-07-15 22:58:35.710414] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:50.904 [2024-07-15 22:58:35.710440] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:50.904 [2024-07-15 22:58:35.710487] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:29:50.904 [2024-07-15 22:58:35.710518] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:50.904 [2024-07-15 22:58:35.710588] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb01ea0 00:29:50.904 [2024-07-15 22:58:35.710599] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:50.904 [2024-07-15 22:58:35.710657] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xaffbc0 00:29:50.904 [2024-07-15 22:58:35.710728] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb01ea0 00:29:50.904 [2024-07-15 22:58:35.710738] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb01ea0 00:29:50.904 [2024-07-15 22:58:35.710791] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:50.904 pt2 00:29:50.904 22:58:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:50.904 22:58:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:50.904 22:58:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:50.904 22:58:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:50.904 22:58:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:50.904 22:58:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:50.904 22:58:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:50.905 22:58:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:50.905 22:58:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:50.905 22:58:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:50.905 22:58:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:50.905 22:58:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:51.164 22:58:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:51.164 "name": "raid_bdev1", 00:29:51.164 "uuid": "8247a2fb-b04c-48e1-b4c5-ab445cc4c6c2", 00:29:51.164 "strip_size_kb": 0, 00:29:51.164 "state": "online", 00:29:51.164 "raid_level": "raid1", 00:29:51.164 "superblock": true, 00:29:51.164 "num_base_bdevs": 2, 00:29:51.164 "num_base_bdevs_discovered": 1, 00:29:51.164 "num_base_bdevs_operational": 1, 00:29:51.164 
"base_bdevs_list": [ 00:29:51.164 { 00:29:51.164 "name": null, 00:29:51.164 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:51.164 "is_configured": false, 00:29:51.164 "data_offset": 256, 00:29:51.164 "data_size": 7936 00:29:51.164 }, 00:29:51.164 { 00:29:51.164 "name": "pt2", 00:29:51.164 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:51.164 "is_configured": true, 00:29:51.164 "data_offset": 256, 00:29:51.164 "data_size": 7936 00:29:51.164 } 00:29:51.164 ] 00:29:51.164 }' 00:29:51.164 22:58:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:51.164 22:58:35 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:52.100 22:58:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:52.100 [2024-07-15 22:58:37.000297] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:52.100 [2024-07-15 22:58:37.000324] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:52.100 [2024-07-15 22:58:37.000380] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:52.100 [2024-07-15 22:58:37.000427] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:52.100 [2024-07-15 22:58:37.000439] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb01ea0 name raid_bdev1, state offline 00:29:52.359 22:58:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:52.359 22:58:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:29:52.927 22:58:37 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@526 -- # raid_bdev= 00:29:52.927 22:58:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:29:52.927 22:58:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:29:52.927 22:58:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:52.927 [2024-07-15 22:58:37.758273] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:52.927 [2024-07-15 22:58:37.758320] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:52.927 [2024-07-15 22:58:37.758338] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb00620 00:29:52.927 [2024-07-15 22:58:37.758351] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:52.927 [2024-07-15 22:58:37.759816] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:52.927 [2024-07-15 22:58:37.759844] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:52.927 [2024-07-15 22:58:37.759892] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:29:52.927 [2024-07-15 22:58:37.759932] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:52.927 [2024-07-15 22:58:37.760016] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:29:52.927 [2024-07-15 22:58:37.760030] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:52.927 [2024-07-15 22:58:37.760046] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb02640 name raid_bdev1, state configuring 00:29:52.927 [2024-07-15 22:58:37.760069] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:52.927 [2024-07-15 22:58:37.760132] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb02640 00:29:52.927 [2024-07-15 22:58:37.760143] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:52.927 [2024-07-15 22:58:37.760200] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb01810 00:29:52.927 [2024-07-15 22:58:37.760272] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb02640 00:29:52.927 [2024-07-15 22:58:37.760282] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb02640 00:29:52.927 [2024-07-15 22:58:37.760340] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:52.927 pt1 00:29:52.927 22:58:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:29:52.927 22:58:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:52.927 22:58:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:52.927 22:58:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:52.927 22:58:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:52.927 22:58:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:52.927 22:58:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:52.927 22:58:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:52.927 22:58:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:52.927 22:58:37 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:52.927 22:58:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:52.927 22:58:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:52.927 22:58:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:53.495 22:58:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:53.495 "name": "raid_bdev1", 00:29:53.495 "uuid": "8247a2fb-b04c-48e1-b4c5-ab445cc4c6c2", 00:29:53.495 "strip_size_kb": 0, 00:29:53.495 "state": "online", 00:29:53.495 "raid_level": "raid1", 00:29:53.495 "superblock": true, 00:29:53.495 "num_base_bdevs": 2, 00:29:53.495 "num_base_bdevs_discovered": 1, 00:29:53.495 "num_base_bdevs_operational": 1, 00:29:53.495 "base_bdevs_list": [ 00:29:53.495 { 00:29:53.495 "name": null, 00:29:53.495 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:53.495 "is_configured": false, 00:29:53.495 "data_offset": 256, 00:29:53.495 "data_size": 7936 00:29:53.495 }, 00:29:53.495 { 00:29:53.495 "name": "pt2", 00:29:53.495 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:53.495 "is_configured": true, 00:29:53.495 "data_offset": 256, 00:29:53.495 "data_size": 7936 00:29:53.495 } 00:29:53.495 ] 00:29:53.495 }' 00:29:53.495 22:58:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:53.495 22:58:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:54.062 22:58:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:29:54.062 22:58:38 bdev_raid.raid_superblock_test_md_interleaved -- 
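The `verify_raid_bdev_state` calls traced above fetch the raid bdev info via `rpc.py ... bdev_raid_get_bdevs all`, select the entry named `raid_bdev1` with `jq`, and compare the reported fields against the expected state. A minimal standalone sketch of that comparison in Python (the JSON shape and field names are taken from the dump in this log; the helper itself is an illustration of the check, not SPDK code):

```python
import json

# JSON shape as dumped by `rpc.py ... bdev_raid_get_bdevs all` in this log
raid_bdev_info = json.loads("""{
  "name": "raid_bdev1",
  "state": "online",
  "raid_level": "raid1",
  "strip_size_kb": 0,
  "num_base_bdevs": 2,
  "num_base_bdevs_discovered": 1,
  "num_base_bdevs_operational": 1
}""")

def verify_raid_bdev_state(info, expected_state, raid_level,
                           strip_size, num_operational):
    # Mirrors the shell helper: each reported field must match the
    # value passed on the command line (here: online raid1 0 1)
    assert info["state"] == expected_state
    assert info["raid_level"] == raid_level
    assert info["strip_size_kb"] == strip_size
    assert info["num_base_bdevs_operational"] == num_operational
    return True

ok = verify_raid_bdev_state(raid_bdev_info, "online", "raid1", 0, 1)
print("verified:", ok)
```

The shell version additionally counts configured entries in `base_bdevs_list` to derive `num_base_bdevs_discovered`; the dump above shows one configured base bdev (`pt2`) and one removed slot with the all-zero UUID placeholder.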
bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:29:54.320 22:58:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:29:54.320 22:58:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:54.320 22:58:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:29:54.580 [2024-07-15 22:58:39.414917] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:54.580 22:58:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' 8247a2fb-b04c-48e1-b4c5-ab445cc4c6c2 '!=' 8247a2fb-b04c-48e1-b4c5-ab445cc4c6c2 ']' 00:29:54.580 22:58:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 2860423 00:29:54.580 22:58:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 2860423 ']' 00:29:54.580 22:58:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 2860423 00:29:54.580 22:58:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:29:54.580 22:58:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:54.580 22:58:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2860423 00:29:54.580 22:58:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:54.580 22:58:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:54.580 22:58:39 bdev_raid.raid_superblock_test_md_interleaved -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 2860423' 00:29:54.580 killing process with pid 2860423 00:29:54.580 22:58:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # kill 2860423 00:29:54.580 [2024-07-15 22:58:39.487700] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:54.580 [2024-07-15 22:58:39.487761] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:54.580 [2024-07-15 22:58:39.487811] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:54.580 [2024-07-15 22:58:39.487828] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb02640 name raid_bdev1, state offline 00:29:54.580 22:58:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@972 -- # wait 2860423 00:29:54.839 [2024-07-15 22:58:39.507162] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:54.839 22:58:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:29:54.839 00:29:54.839 real 0m17.167s 00:29:54.839 user 0m31.212s 00:29:54.839 sys 0m3.073s 00:29:54.839 22:58:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:54.839 22:58:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:54.839 ************************************ 00:29:54.839 END TEST raid_superblock_test_md_interleaved 00:29:54.839 ************************************ 00:29:55.099 22:58:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:55.099 22:58:39 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:29:55.099 22:58:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:29:55.099 22:58:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:55.099 
22:58:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:55.099 ************************************ 00:29:55.099 START TEST raid_rebuild_test_sb_md_interleaved 00:29:55.099 ************************************ 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false false 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # 
base_bdevs=('BaseBdev1' 'BaseBdev2') 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=2862915 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 2862915 /var/tmp/spdk-raid.sock 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 2862915 ']' 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:55.099 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:55.099 22:58:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:55.099 [2024-07-15 22:58:39.891315] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:29:55.099 [2024-07-15 22:58:39.891380] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2862915 ] 00:29:55.099 I/O size of 3145728 is greater than zero copy threshold (65536). 00:29:55.099 Zero copy mechanism will not be used. 
00:29:55.358 [2024-07-15 22:58:40.010962] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:55.359 [2024-07-15 22:58:40.121079] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:55.359 [2024-07-15 22:58:40.183273] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:55.359 [2024-07-15 22:58:40.183312] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:56.307 22:58:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:56.307 22:58:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:29:56.307 22:58:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:29:56.307 22:58:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:29:56.307 BaseBdev1_malloc 00:29:56.307 22:58:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:56.566 [2024-07-15 22:58:41.328402] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:29:56.566 [2024-07-15 22:58:41.328452] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:56.566 [2024-07-15 22:58:41.328476] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1165ce0 00:29:56.566 [2024-07-15 22:58:41.328488] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:56.566 [2024-07-15 22:58:41.329941] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:56.566 [2024-07-15 22:58:41.329969] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:56.566 BaseBdev1 00:29:56.566 22:58:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:29:56.566 22:58:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:29:56.825 BaseBdev2_malloc 00:29:56.825 22:58:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:29:57.084 [2024-07-15 22:58:41.826753] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:29:57.084 [2024-07-15 22:58:41.826800] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:57.084 [2024-07-15 22:58:41.826823] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x115d2d0 00:29:57.084 [2024-07-15 22:58:41.826835] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:57.084 [2024-07-15 22:58:41.828537] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:57.084 [2024-07-15 22:58:41.828564] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:29:57.084 BaseBdev2 00:29:57.084 22:58:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:29:57.343 spare_malloc 00:29:57.343 22:58:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay 
-r 0 -t 0 -w 100000 -n 100000 00:29:57.602 spare_delay 00:29:57.602 22:58:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:57.861 [2024-07-15 22:58:42.565500] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:57.861 [2024-07-15 22:58:42.565544] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:57.861 [2024-07-15 22:58:42.565565] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1160070 00:29:57.861 [2024-07-15 22:58:42.565578] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:57.861 [2024-07-15 22:58:42.566838] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:57.861 [2024-07-15 22:58:42.566864] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:57.861 spare 00:29:57.861 22:58:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:29:58.121 [2024-07-15 22:58:42.814188] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:58.121 [2024-07-15 22:58:42.815430] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:58.121 [2024-07-15 22:58:42.815595] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1162370 00:29:58.121 [2024-07-15 22:58:42.815608] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:58.121 [2024-07-15 22:58:42.815676] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfc89c0 00:29:58.121 [2024-07-15 22:58:42.815761] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: 
raid bdev generic 0x1162370 00:29:58.121 [2024-07-15 22:58:42.815772] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1162370 00:29:58.121 [2024-07-15 22:58:42.815827] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:58.121 22:58:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:58.121 22:58:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:58.121 22:58:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:58.121 22:58:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:58.121 22:58:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:58.121 22:58:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:58.121 22:58:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:58.121 22:58:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:58.121 22:58:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:58.121 22:58:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:58.121 22:58:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:58.121 22:58:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:58.386 22:58:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:29:58.386 "name": "raid_bdev1", 00:29:58.386 "uuid": "cde447ce-dd50-4a8e-8e4a-bc915b20295b", 00:29:58.386 "strip_size_kb": 0, 00:29:58.386 "state": "online", 00:29:58.386 "raid_level": "raid1", 00:29:58.386 "superblock": true, 00:29:58.386 "num_base_bdevs": 2, 00:29:58.386 "num_base_bdevs_discovered": 2, 00:29:58.386 "num_base_bdevs_operational": 2, 00:29:58.386 "base_bdevs_list": [ 00:29:58.386 { 00:29:58.386 "name": "BaseBdev1", 00:29:58.386 "uuid": "c9fa7fd9-6621-5d0e-a1a8-8a2a5773b5b3", 00:29:58.386 "is_configured": true, 00:29:58.386 "data_offset": 256, 00:29:58.386 "data_size": 7936 00:29:58.386 }, 00:29:58.386 { 00:29:58.386 "name": "BaseBdev2", 00:29:58.386 "uuid": "3d6abf49-1869-531f-99f9-27bb295efbcf", 00:29:58.386 "is_configured": true, 00:29:58.386 "data_offset": 256, 00:29:58.386 "data_size": 7936 00:29:58.386 } 00:29:58.386 ] 00:29:58.386 }' 00:29:58.386 22:58:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:58.386 22:58:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:58.953 22:58:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:58.953 22:58:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:29:59.211 [2024-07-15 22:58:43.913349] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:59.211 22:58:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:29:59.211 22:58:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:59.212 22:58:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- 
# jq -r '.[].base_bdevs_list[0].data_offset' 00:29:59.470 22:58:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:29:59.470 22:58:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:29:59.470 22:58:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']' 00:29:59.470 22:58:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:29:59.730 [2024-07-15 22:58:44.406383] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:59.730 22:58:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:59.730 22:58:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:59.730 22:58:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:59.730 22:58:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:59.730 22:58:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:59.730 22:58:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:59.730 22:58:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:59.730 22:58:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:59.730 22:58:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:59.730 22:58:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:59.730 
22:58:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:59.730 22:58:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:59.989 22:58:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:59.989 "name": "raid_bdev1", 00:29:59.989 "uuid": "cde447ce-dd50-4a8e-8e4a-bc915b20295b", 00:29:59.989 "strip_size_kb": 0, 00:29:59.989 "state": "online", 00:29:59.989 "raid_level": "raid1", 00:29:59.989 "superblock": true, 00:29:59.989 "num_base_bdevs": 2, 00:29:59.989 "num_base_bdevs_discovered": 1, 00:29:59.989 "num_base_bdevs_operational": 1, 00:29:59.989 "base_bdevs_list": [ 00:29:59.989 { 00:29:59.989 "name": null, 00:29:59.989 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:59.989 "is_configured": false, 00:29:59.989 "data_offset": 256, 00:29:59.989 "data_size": 7936 00:29:59.989 }, 00:29:59.989 { 00:29:59.989 "name": "BaseBdev2", 00:29:59.989 "uuid": "3d6abf49-1869-531f-99f9-27bb295efbcf", 00:29:59.989 "is_configured": true, 00:29:59.989 "data_offset": 256, 00:29:59.989 "data_size": 7936 00:29:59.989 } 00:29:59.989 ] 00:29:59.989 }' 00:29:59.989 22:58:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:59.989 22:58:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:00.557 22:58:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:30:00.557 [2024-07-15 22:58:45.429099] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:00.557 [2024-07-15 22:58:45.432725] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x1162250 00:30:00.557 [2024-07-15 22:58:45.434735] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:00.557 22:58:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1 00:30:01.934 22:58:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:01.934 22:58:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:01.934 22:58:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:01.934 22:58:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:01.934 22:58:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:01.934 22:58:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:01.934 22:58:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:01.934 22:58:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:01.934 "name": "raid_bdev1", 00:30:01.934 "uuid": "cde447ce-dd50-4a8e-8e4a-bc915b20295b", 00:30:01.934 "strip_size_kb": 0, 00:30:01.934 "state": "online", 00:30:01.934 "raid_level": "raid1", 00:30:01.934 "superblock": true, 00:30:01.934 "num_base_bdevs": 2, 00:30:01.934 "num_base_bdevs_discovered": 2, 00:30:01.934 "num_base_bdevs_operational": 2, 00:30:01.934 "process": { 00:30:01.934 "type": "rebuild", 00:30:01.934 "target": "spare", 00:30:01.934 "progress": { 00:30:01.934 "blocks": 2816, 00:30:01.934 "percent": 35 00:30:01.934 } 00:30:01.934 }, 00:30:01.934 "base_bdevs_list": [ 00:30:01.934 { 
00:30:01.934 "name": "spare", 00:30:01.934 "uuid": "01225df0-642c-5637-9c98-b61432b24a48", 00:30:01.934 "is_configured": true, 00:30:01.934 "data_offset": 256, 00:30:01.934 "data_size": 7936 00:30:01.934 }, 00:30:01.934 { 00:30:01.934 "name": "BaseBdev2", 00:30:01.934 "uuid": "3d6abf49-1869-531f-99f9-27bb295efbcf", 00:30:01.934 "is_configured": true, 00:30:01.934 "data_offset": 256, 00:30:01.934 "data_size": 7936 00:30:01.934 } 00:30:01.934 ] 00:30:01.934 }' 00:30:01.934 22:58:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:01.934 22:58:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:01.934 22:58:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:01.934 22:58:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:01.934 22:58:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:30:02.194 [2024-07-15 22:58:46.956060] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:02.194 [2024-07-15 22:58:47.047099] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:30:02.194 [2024-07-15 22:58:47.047148] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:02.194 [2024-07-15 22:58:47.047164] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:02.194 [2024-07-15 22:58:47.047173] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:30:02.194 22:58:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:02.194 22:58:47 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:02.194 22:58:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:02.194 22:58:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:02.194 22:58:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:02.194 22:58:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:02.194 22:58:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:02.194 22:58:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:02.194 22:58:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:02.194 22:58:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:02.194 22:58:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:02.194 22:58:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:02.453 22:58:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:02.453 "name": "raid_bdev1", 00:30:02.453 "uuid": "cde447ce-dd50-4a8e-8e4a-bc915b20295b", 00:30:02.453 "strip_size_kb": 0, 00:30:02.453 "state": "online", 00:30:02.453 "raid_level": "raid1", 00:30:02.453 "superblock": true, 00:30:02.453 "num_base_bdevs": 2, 00:30:02.453 "num_base_bdevs_discovered": 1, 00:30:02.453 "num_base_bdevs_operational": 1, 00:30:02.453 "base_bdevs_list": [ 00:30:02.453 { 00:30:02.453 "name": null, 00:30:02.453 
"uuid": "00000000-0000-0000-0000-000000000000", 00:30:02.453 "is_configured": false, 00:30:02.453 "data_offset": 256, 00:30:02.453 "data_size": 7936 00:30:02.453 }, 00:30:02.453 { 00:30:02.453 "name": "BaseBdev2", 00:30:02.453 "uuid": "3d6abf49-1869-531f-99f9-27bb295efbcf", 00:30:02.453 "is_configured": true, 00:30:02.453 "data_offset": 256, 00:30:02.453 "data_size": 7936 00:30:02.453 } 00:30:02.453 ] 00:30:02.453 }' 00:30:02.453 22:58:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:02.453 22:58:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:03.391 22:58:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:03.391 22:58:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:03.391 22:58:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:03.391 22:58:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:03.391 22:58:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:03.391 22:58:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:03.391 22:58:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:03.391 22:58:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:03.391 "name": "raid_bdev1", 00:30:03.391 "uuid": "cde447ce-dd50-4a8e-8e4a-bc915b20295b", 00:30:03.391 "strip_size_kb": 0, 00:30:03.391 "state": "online", 00:30:03.391 "raid_level": "raid1", 00:30:03.391 "superblock": true, 00:30:03.391 
"num_base_bdevs": 2, 00:30:03.391 "num_base_bdevs_discovered": 1, 00:30:03.391 "num_base_bdevs_operational": 1, 00:30:03.391 "base_bdevs_list": [ 00:30:03.391 { 00:30:03.391 "name": null, 00:30:03.391 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:03.391 "is_configured": false, 00:30:03.391 "data_offset": 256, 00:30:03.391 "data_size": 7936 00:30:03.391 }, 00:30:03.391 { 00:30:03.391 "name": "BaseBdev2", 00:30:03.391 "uuid": "3d6abf49-1869-531f-99f9-27bb295efbcf", 00:30:03.391 "is_configured": true, 00:30:03.391 "data_offset": 256, 00:30:03.391 "data_size": 7936 00:30:03.391 } 00:30:03.391 ] 00:30:03.391 }' 00:30:03.391 22:58:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:03.391 22:58:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:03.391 22:58:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:03.391 22:58:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:03.392 22:58:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:30:03.651 [2024-07-15 22:58:48.495231] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:03.651 [2024-07-15 22:58:48.499071] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x115e270 00:30:03.651 [2024-07-15 22:58:48.500511] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:03.651 22:58:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:05.029 "name": "raid_bdev1", 00:30:05.029 "uuid": "cde447ce-dd50-4a8e-8e4a-bc915b20295b", 00:30:05.029 "strip_size_kb": 0, 00:30:05.029 "state": "online", 00:30:05.029 "raid_level": "raid1", 00:30:05.029 "superblock": true, 00:30:05.029 "num_base_bdevs": 2, 00:30:05.029 "num_base_bdevs_discovered": 2, 00:30:05.029 "num_base_bdevs_operational": 2, 00:30:05.029 "process": { 00:30:05.029 "type": "rebuild", 00:30:05.029 "target": "spare", 00:30:05.029 "progress": { 00:30:05.029 "blocks": 3072, 00:30:05.029 "percent": 38 00:30:05.029 } 00:30:05.029 }, 00:30:05.029 "base_bdevs_list": [ 00:30:05.029 { 00:30:05.029 "name": "spare", 00:30:05.029 "uuid": "01225df0-642c-5637-9c98-b61432b24a48", 00:30:05.029 "is_configured": true, 00:30:05.029 "data_offset": 256, 00:30:05.029 "data_size": 7936 00:30:05.029 }, 00:30:05.029 { 00:30:05.029 "name": "BaseBdev2", 00:30:05.029 "uuid": "3d6abf49-1869-531f-99f9-27bb295efbcf", 00:30:05.029 "is_configured": true, 00:30:05.029 "data_offset": 256, 00:30:05.029 "data_size": 7936 00:30:05.029 
} 00:30:05.029 ] 00:30:05.029 }' 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:30:05.029 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=1177 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@184 -- # local target=spare 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:05.029 22:58:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:05.288 22:58:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:05.288 "name": "raid_bdev1", 00:30:05.288 "uuid": "cde447ce-dd50-4a8e-8e4a-bc915b20295b", 00:30:05.288 "strip_size_kb": 0, 00:30:05.288 "state": "online", 00:30:05.288 "raid_level": "raid1", 00:30:05.288 "superblock": true, 00:30:05.288 "num_base_bdevs": 2, 00:30:05.288 "num_base_bdevs_discovered": 2, 00:30:05.288 "num_base_bdevs_operational": 2, 00:30:05.288 "process": { 00:30:05.288 "type": "rebuild", 00:30:05.288 "target": "spare", 00:30:05.288 "progress": { 00:30:05.288 "blocks": 3840, 00:30:05.288 "percent": 48 00:30:05.288 } 00:30:05.288 }, 00:30:05.288 "base_bdevs_list": [ 00:30:05.288 { 00:30:05.288 "name": "spare", 00:30:05.288 "uuid": "01225df0-642c-5637-9c98-b61432b24a48", 00:30:05.288 "is_configured": true, 00:30:05.288 "data_offset": 256, 00:30:05.288 "data_size": 7936 00:30:05.288 }, 00:30:05.288 { 00:30:05.288 "name": "BaseBdev2", 00:30:05.288 "uuid": "3d6abf49-1869-531f-99f9-27bb295efbcf", 00:30:05.288 "is_configured": true, 00:30:05.288 "data_offset": 256, 00:30:05.288 "data_size": 7936 00:30:05.288 } 00:30:05.288 ] 00:30:05.288 }' 00:30:05.288 22:58:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:05.545 22:58:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:05.545 22:58:50 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:05.545 22:58:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:05.545 22:58:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:30:06.477 22:58:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:30:06.478 22:58:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:06.478 22:58:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:06.478 22:58:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:06.478 22:58:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:06.478 22:58:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:06.478 22:58:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:06.478 22:58:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:06.736 22:58:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:06.736 "name": "raid_bdev1", 00:30:06.736 "uuid": "cde447ce-dd50-4a8e-8e4a-bc915b20295b", 00:30:06.736 "strip_size_kb": 0, 00:30:06.736 "state": "online", 00:30:06.736 "raid_level": "raid1", 00:30:06.736 "superblock": true, 00:30:06.736 "num_base_bdevs": 2, 00:30:06.736 "num_base_bdevs_discovered": 2, 00:30:06.736 "num_base_bdevs_operational": 2, 00:30:06.736 "process": { 00:30:06.736 "type": "rebuild", 00:30:06.736 
"target": "spare", 00:30:06.736 "progress": { 00:30:06.736 "blocks": 7424, 00:30:06.736 "percent": 93 00:30:06.736 } 00:30:06.736 }, 00:30:06.736 "base_bdevs_list": [ 00:30:06.736 { 00:30:06.736 "name": "spare", 00:30:06.736 "uuid": "01225df0-642c-5637-9c98-b61432b24a48", 00:30:06.736 "is_configured": true, 00:30:06.736 "data_offset": 256, 00:30:06.736 "data_size": 7936 00:30:06.736 }, 00:30:06.736 { 00:30:06.736 "name": "BaseBdev2", 00:30:06.736 "uuid": "3d6abf49-1869-531f-99f9-27bb295efbcf", 00:30:06.736 "is_configured": true, 00:30:06.736 "data_offset": 256, 00:30:06.736 "data_size": 7936 00:30:06.736 } 00:30:06.736 ] 00:30:06.736 }' 00:30:06.736 22:58:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:06.736 22:58:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:06.736 22:58:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:06.736 22:58:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:06.736 22:58:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:30:06.736 [2024-07-15 22:58:51.624693] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:30:06.736 [2024-07-15 22:58:51.624755] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:30:06.736 [2024-07-15 22:58:51.624845] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:07.696 22:58:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:30:07.696 22:58:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:07.696 22:58:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:07.696 22:58:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:07.696 22:58:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:07.696 22:58:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:07.696 22:58:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:07.696 22:58:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:07.962 22:58:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:07.962 "name": "raid_bdev1", 00:30:07.962 "uuid": "cde447ce-dd50-4a8e-8e4a-bc915b20295b", 00:30:07.962 "strip_size_kb": 0, 00:30:07.962 "state": "online", 00:30:07.962 "raid_level": "raid1", 00:30:07.962 "superblock": true, 00:30:07.962 "num_base_bdevs": 2, 00:30:07.962 "num_base_bdevs_discovered": 2, 00:30:07.962 "num_base_bdevs_operational": 2, 00:30:07.962 "base_bdevs_list": [ 00:30:07.962 { 00:30:07.962 "name": "spare", 00:30:07.962 "uuid": "01225df0-642c-5637-9c98-b61432b24a48", 00:30:07.962 "is_configured": true, 00:30:07.962 "data_offset": 256, 00:30:07.962 "data_size": 7936 00:30:07.962 }, 00:30:07.962 { 00:30:07.962 "name": "BaseBdev2", 00:30:07.962 "uuid": "3d6abf49-1869-531f-99f9-27bb295efbcf", 00:30:07.962 "is_configured": true, 00:30:07.962 "data_offset": 256, 00:30:07.962 "data_size": 7936 00:30:07.962 } 00:30:07.962 ] 00:30:07.962 }' 00:30:07.962 22:58:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:08.219 22:58:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == 
\r\e\b\u\i\l\d ]] 00:30:08.219 22:58:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:08.219 22:58:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:30:08.219 22:58:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:30:08.219 22:58:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:08.219 22:58:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:08.219 22:58:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:08.219 22:58:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:08.219 22:58:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:08.219 22:58:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:08.219 22:58:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:08.476 22:58:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:08.476 "name": "raid_bdev1", 00:30:08.476 "uuid": "cde447ce-dd50-4a8e-8e4a-bc915b20295b", 00:30:08.476 "strip_size_kb": 0, 00:30:08.476 "state": "online", 00:30:08.476 "raid_level": "raid1", 00:30:08.476 "superblock": true, 00:30:08.476 "num_base_bdevs": 2, 00:30:08.476 "num_base_bdevs_discovered": 2, 00:30:08.476 "num_base_bdevs_operational": 2, 00:30:08.476 "base_bdevs_list": [ 00:30:08.476 { 00:30:08.476 "name": "spare", 00:30:08.476 "uuid": "01225df0-642c-5637-9c98-b61432b24a48", 00:30:08.476 
"is_configured": true, 00:30:08.476 "data_offset": 256, 00:30:08.476 "data_size": 7936 00:30:08.476 }, 00:30:08.476 { 00:30:08.476 "name": "BaseBdev2", 00:30:08.476 "uuid": "3d6abf49-1869-531f-99f9-27bb295efbcf", 00:30:08.476 "is_configured": true, 00:30:08.476 "data_offset": 256, 00:30:08.476 "data_size": 7936 00:30:08.476 } 00:30:08.476 ] 00:30:08.476 }' 00:30:08.476 22:58:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:08.476 22:58:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:08.476 22:58:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:08.476 22:58:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:08.476 22:58:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:08.476 22:58:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:08.476 22:58:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:08.476 22:58:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:08.476 22:58:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:08.476 22:58:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:08.476 22:58:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:08.476 22:58:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:08.476 22:58:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:30:08.476 22:58:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:08.476 22:58:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:08.476 22:58:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:08.734 22:58:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:08.734 "name": "raid_bdev1", 00:30:08.734 "uuid": "cde447ce-dd50-4a8e-8e4a-bc915b20295b", 00:30:08.734 "strip_size_kb": 0, 00:30:08.734 "state": "online", 00:30:08.734 "raid_level": "raid1", 00:30:08.734 "superblock": true, 00:30:08.734 "num_base_bdevs": 2, 00:30:08.734 "num_base_bdevs_discovered": 2, 00:30:08.734 "num_base_bdevs_operational": 2, 00:30:08.734 "base_bdevs_list": [ 00:30:08.734 { 00:30:08.734 "name": "spare", 00:30:08.734 "uuid": "01225df0-642c-5637-9c98-b61432b24a48", 00:30:08.734 "is_configured": true, 00:30:08.734 "data_offset": 256, 00:30:08.734 "data_size": 7936 00:30:08.734 }, 00:30:08.734 { 00:30:08.734 "name": "BaseBdev2", 00:30:08.734 "uuid": "3d6abf49-1869-531f-99f9-27bb295efbcf", 00:30:08.734 "is_configured": true, 00:30:08.734 "data_offset": 256, 00:30:08.734 "data_size": 7936 00:30:08.734 } 00:30:08.734 ] 00:30:08.734 }' 00:30:08.734 22:58:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:08.734 22:58:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:09.301 22:58:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:09.559 [2024-07-15 22:58:54.308377] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: 
delete raid bdev: raid_bdev1 00:30:09.559 [2024-07-15 22:58:54.308409] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:09.559 [2024-07-15 22:58:54.308476] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:09.559 [2024-07-15 22:58:54.308535] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:09.559 [2024-07-15 22:58:54.308547] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1162370 name raid_bdev1, state offline 00:30:09.559 22:58:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:09.560 22:58:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:30:09.817 22:58:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:30:09.817 22:58:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:30:09.817 22:58:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:30:09.817 22:58:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:30:10.076 22:58:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:30:10.333 [2024-07-15 22:58:54.998166] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:30:10.333 [2024-07-15 22:58:54.998209] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:10.333 [2024-07-15 22:58:54.998230] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1162040 00:30:10.333 [2024-07-15 22:58:54.998242] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:10.333 [2024-07-15 22:58:54.999702] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:10.333 [2024-07-15 22:58:54.999731] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:30:10.333 [2024-07-15 22:58:54.999791] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:30:10.333 [2024-07-15 22:58:54.999818] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:10.333 [2024-07-15 22:58:54.999904] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:10.333 spare 00:30:10.333 22:58:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:10.333 22:58:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:10.333 22:58:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:10.333 22:58:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:10.333 22:58:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:10.333 22:58:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:10.333 22:58:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:10.333 22:58:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:10.334 22:58:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:10.334 22:58:55 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:10.334 22:58:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:10.334 22:58:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:10.334 [2024-07-15 22:58:55.100224] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11640d0 00:30:10.334 [2024-07-15 22:58:55.100240] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:10.334 [2024-07-15 22:58:55.100321] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1157500 00:30:10.334 [2024-07-15 22:58:55.100416] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11640d0 00:30:10.334 [2024-07-15 22:58:55.100426] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11640d0 00:30:10.334 [2024-07-15 22:58:55.100496] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:10.591 22:58:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:10.591 "name": "raid_bdev1", 00:30:10.592 "uuid": "cde447ce-dd50-4a8e-8e4a-bc915b20295b", 00:30:10.592 "strip_size_kb": 0, 00:30:10.592 "state": "online", 00:30:10.592 "raid_level": "raid1", 00:30:10.592 "superblock": true, 00:30:10.592 "num_base_bdevs": 2, 00:30:10.592 "num_base_bdevs_discovered": 2, 00:30:10.592 "num_base_bdevs_operational": 2, 00:30:10.592 "base_bdevs_list": [ 00:30:10.592 { 00:30:10.592 "name": "spare", 00:30:10.592 "uuid": "01225df0-642c-5637-9c98-b61432b24a48", 00:30:10.592 "is_configured": true, 00:30:10.592 "data_offset": 256, 00:30:10.592 "data_size": 7936 00:30:10.592 }, 00:30:10.592 { 00:30:10.592 "name": "BaseBdev2", 00:30:10.592 "uuid": 
"3d6abf49-1869-531f-99f9-27bb295efbcf", 00:30:10.592 "is_configured": true, 00:30:10.592 "data_offset": 256, 00:30:10.592 "data_size": 7936 00:30:10.592 } 00:30:10.592 ] 00:30:10.592 }' 00:30:10.592 22:58:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:10.592 22:58:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:11.157 22:58:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:11.157 22:58:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:11.157 22:58:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:11.157 22:58:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:11.157 22:58:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:11.157 22:58:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:11.158 22:58:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:11.415 22:58:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:11.416 "name": "raid_bdev1", 00:30:11.416 "uuid": "cde447ce-dd50-4a8e-8e4a-bc915b20295b", 00:30:11.416 "strip_size_kb": 0, 00:30:11.416 "state": "online", 00:30:11.416 "raid_level": "raid1", 00:30:11.416 "superblock": true, 00:30:11.416 "num_base_bdevs": 2, 00:30:11.416 "num_base_bdevs_discovered": 2, 00:30:11.416 "num_base_bdevs_operational": 2, 00:30:11.416 "base_bdevs_list": [ 00:30:11.416 { 00:30:11.416 "name": "spare", 00:30:11.416 "uuid": 
"01225df0-642c-5637-9c98-b61432b24a48", 00:30:11.416 "is_configured": true, 00:30:11.416 "data_offset": 256, 00:30:11.416 "data_size": 7936 00:30:11.416 }, 00:30:11.416 { 00:30:11.416 "name": "BaseBdev2", 00:30:11.416 "uuid": "3d6abf49-1869-531f-99f9-27bb295efbcf", 00:30:11.416 "is_configured": true, 00:30:11.416 "data_offset": 256, 00:30:11.416 "data_size": 7936 00:30:11.416 } 00:30:11.416 ] 00:30:11.416 }' 00:30:11.416 22:58:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:11.416 22:58:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:11.416 22:58:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:11.416 22:58:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:11.416 22:58:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:11.416 22:58:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:30:11.675 22:58:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:30:11.675 22:58:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:30:11.933 [2024-07-15 22:58:56.674728] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:11.933 22:58:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:11.933 22:58:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
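The check above extracts the first base bdev's name with `jq -r '.[].base_bdevs_list[0].name'` and compares it against `spare`. That filter can be reproduced standalone against a trimmed-down sample of the JSON shown in this log (the sample below is illustrative, not captured from this run):

```shell
# Abridged sample mirroring the raid_bdev1 JSON printed by bdev_raid_get_bdevs.
sample='[{"name":"raid_bdev1","base_bdevs_list":[{"name":"spare"},{"name":"BaseBdev2"}]}]'

# Same filter the test uses: iterate the array, take the first base bdev's name.
first=$(printf '%s' "$sample" | jq -r '.[].base_bdevs_list[0].name')
echo "$first"   # prints "spare"
```

The test then asserts on the extracted string with a `[[ spare == \s\p\a\r\e ]]` comparison, as the xtrace lines show.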
00:30:11.933 22:58:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:11.933 22:58:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:11.933 22:58:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:11.933 22:58:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:11.933 22:58:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:11.933 22:58:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:11.933 22:58:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:11.933 22:58:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:11.933 22:58:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:11.933 22:58:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:12.192 22:58:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:12.192 "name": "raid_bdev1", 00:30:12.192 "uuid": "cde447ce-dd50-4a8e-8e4a-bc915b20295b", 00:30:12.192 "strip_size_kb": 0, 00:30:12.192 "state": "online", 00:30:12.192 "raid_level": "raid1", 00:30:12.192 "superblock": true, 00:30:12.192 "num_base_bdevs": 2, 00:30:12.192 "num_base_bdevs_discovered": 1, 00:30:12.192 "num_base_bdevs_operational": 1, 00:30:12.192 "base_bdevs_list": [ 00:30:12.192 { 00:30:12.192 "name": null, 00:30:12.192 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:12.192 "is_configured": false, 00:30:12.192 "data_offset": 
256, 00:30:12.192 "data_size": 7936 00:30:12.192 }, 00:30:12.192 { 00:30:12.192 "name": "BaseBdev2", 00:30:12.192 "uuid": "3d6abf49-1869-531f-99f9-27bb295efbcf", 00:30:12.192 "is_configured": true, 00:30:12.192 "data_offset": 256, 00:30:12.192 "data_size": 7936 00:30:12.192 } 00:30:12.192 ] 00:30:12.192 }' 00:30:12.192 22:58:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:12.192 22:58:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:12.784 22:58:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:30:13.043 [2024-07-15 22:58:57.769636] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:13.043 [2024-07-15 22:58:57.769801] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:30:13.043 [2024-07-15 22:58:57.769819] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:30:13.043 [2024-07-15 22:58:57.769846] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:13.043 [2024-07-15 22:58:57.773337] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfc9640 00:30:13.043 [2024-07-15 22:58:57.774756] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:13.043 22:58:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:30:13.979 22:58:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:13.979 22:58:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:13.979 22:58:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:13.979 22:58:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:13.979 22:58:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:13.979 22:58:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:13.979 22:58:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:14.547 22:58:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:14.547 "name": "raid_bdev1", 00:30:14.547 "uuid": "cde447ce-dd50-4a8e-8e4a-bc915b20295b", 00:30:14.548 "strip_size_kb": 0, 00:30:14.548 "state": "online", 00:30:14.548 "raid_level": "raid1", 00:30:14.548 "superblock": true, 00:30:14.548 "num_base_bdevs": 2, 00:30:14.548 "num_base_bdevs_discovered": 2, 00:30:14.548 "num_base_bdevs_operational": 2, 00:30:14.548 "process": { 00:30:14.548 "type": 
"rebuild", 00:30:14.548 "target": "spare", 00:30:14.548 "progress": { 00:30:14.548 "blocks": 3584, 00:30:14.548 "percent": 45 00:30:14.548 } 00:30:14.548 }, 00:30:14.548 "base_bdevs_list": [ 00:30:14.548 { 00:30:14.548 "name": "spare", 00:30:14.548 "uuid": "01225df0-642c-5637-9c98-b61432b24a48", 00:30:14.548 "is_configured": true, 00:30:14.548 "data_offset": 256, 00:30:14.548 "data_size": 7936 00:30:14.548 }, 00:30:14.548 { 00:30:14.548 "name": "BaseBdev2", 00:30:14.548 "uuid": "3d6abf49-1869-531f-99f9-27bb295efbcf", 00:30:14.548 "is_configured": true, 00:30:14.548 "data_offset": 256, 00:30:14.548 "data_size": 7936 00:30:14.548 } 00:30:14.548 ] 00:30:14.548 }' 00:30:14.548 22:58:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:14.548 22:58:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:14.548 22:58:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:14.548 22:58:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:14.548 22:58:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:30:14.807 [2024-07-15 22:58:59.632906] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:14.807 [2024-07-15 22:58:59.689508] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:30:14.807 [2024-07-15 22:58:59.689552] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:14.807 [2024-07-15 22:58:59.689568] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:14.807 [2024-07-15 22:58:59.689577] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to 
remove target bdev: No such device 00:30:15.067 22:58:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:15.067 22:58:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:15.067 22:58:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:15.067 22:58:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:15.067 22:58:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:15.067 22:58:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:15.067 22:58:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:15.067 22:58:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:15.067 22:58:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:15.067 22:58:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:15.067 22:58:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:15.067 22:58:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:15.067 22:58:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:15.067 "name": "raid_bdev1", 00:30:15.067 "uuid": "cde447ce-dd50-4a8e-8e4a-bc915b20295b", 00:30:15.067 "strip_size_kb": 0, 00:30:15.067 "state": "online", 00:30:15.067 "raid_level": "raid1", 00:30:15.067 "superblock": true, 00:30:15.067 
"num_base_bdevs": 2, 00:30:15.067 "num_base_bdevs_discovered": 1, 00:30:15.067 "num_base_bdevs_operational": 1, 00:30:15.067 "base_bdevs_list": [ 00:30:15.067 { 00:30:15.067 "name": null, 00:30:15.067 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:15.068 "is_configured": false, 00:30:15.068 "data_offset": 256, 00:30:15.068 "data_size": 7936 00:30:15.068 }, 00:30:15.068 { 00:30:15.068 "name": "BaseBdev2", 00:30:15.068 "uuid": "3d6abf49-1869-531f-99f9-27bb295efbcf", 00:30:15.068 "is_configured": true, 00:30:15.068 "data_offset": 256, 00:30:15.068 "data_size": 7936 00:30:15.068 } 00:30:15.068 ] 00:30:15.068 }' 00:30:15.068 22:58:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:15.068 22:58:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:16.004 22:59:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:30:16.004 [2024-07-15 22:59:00.788721] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:30:16.004 [2024-07-15 22:59:00.788780] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:16.004 [2024-07-15 22:59:00.788803] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1161c80 00:30:16.004 [2024-07-15 22:59:00.788816] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:16.004 [2024-07-15 22:59:00.789041] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:16.004 [2024-07-15 22:59:00.789057] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:30:16.004 [2024-07-15 22:59:00.789121] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:30:16.004 [2024-07-15 22:59:00.789134] 
bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:30:16.004 [2024-07-15 22:59:00.789144] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:30:16.004 [2024-07-15 22:59:00.789163] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:16.004 [2024-07-15 22:59:00.793019] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11622d0 00:30:16.004 spare 00:30:16.004 [2024-07-15 22:59:00.794390] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:16.004 22:59:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:30:16.941 22:59:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:16.941 22:59:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:16.941 22:59:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:16.941 22:59:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:16.941 22:59:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:16.941 22:59:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:16.941 22:59:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:17.200 22:59:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:17.200 "name": "raid_bdev1", 00:30:17.200 "uuid": "cde447ce-dd50-4a8e-8e4a-bc915b20295b", 
00:30:17.200 "strip_size_kb": 0, 00:30:17.200 "state": "online", 00:30:17.200 "raid_level": "raid1", 00:30:17.200 "superblock": true, 00:30:17.200 "num_base_bdevs": 2, 00:30:17.200 "num_base_bdevs_discovered": 2, 00:30:17.200 "num_base_bdevs_operational": 2, 00:30:17.200 "process": { 00:30:17.200 "type": "rebuild", 00:30:17.200 "target": "spare", 00:30:17.200 "progress": { 00:30:17.200 "blocks": 3072, 00:30:17.200 "percent": 38 00:30:17.200 } 00:30:17.200 }, 00:30:17.200 "base_bdevs_list": [ 00:30:17.200 { 00:30:17.200 "name": "spare", 00:30:17.200 "uuid": "01225df0-642c-5637-9c98-b61432b24a48", 00:30:17.200 "is_configured": true, 00:30:17.200 "data_offset": 256, 00:30:17.200 "data_size": 7936 00:30:17.200 }, 00:30:17.200 { 00:30:17.200 "name": "BaseBdev2", 00:30:17.200 "uuid": "3d6abf49-1869-531f-99f9-27bb295efbcf", 00:30:17.200 "is_configured": true, 00:30:17.200 "data_offset": 256, 00:30:17.200 "data_size": 7936 00:30:17.200 } 00:30:17.200 ] 00:30:17.200 }' 00:30:17.200 22:59:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:17.200 22:59:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:17.200 22:59:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:17.459 22:59:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:17.459 22:59:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:30:17.459 [2024-07-15 22:59:02.362155] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:17.719 [2024-07-15 22:59:02.407148] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:30:17.719 [2024-07-15 
22:59:02.407191] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:17.719 [2024-07-15 22:59:02.407206] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:17.719 [2024-07-15 22:59:02.407215] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:30:17.719 22:59:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:17.719 22:59:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:17.719 22:59:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:17.719 22:59:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:17.719 22:59:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:17.719 22:59:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:17.719 22:59:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:17.719 22:59:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:17.719 22:59:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:17.719 22:59:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:17.719 22:59:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:17.719 22:59:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:17.977 22:59:02 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:17.977 "name": "raid_bdev1", 00:30:17.977 "uuid": "cde447ce-dd50-4a8e-8e4a-bc915b20295b", 00:30:17.977 "strip_size_kb": 0, 00:30:17.977 "state": "online", 00:30:17.977 "raid_level": "raid1", 00:30:17.977 "superblock": true, 00:30:17.977 "num_base_bdevs": 2, 00:30:17.977 "num_base_bdevs_discovered": 1, 00:30:17.978 "num_base_bdevs_operational": 1, 00:30:17.978 "base_bdevs_list": [ 00:30:17.978 { 00:30:17.978 "name": null, 00:30:17.978 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:17.978 "is_configured": false, 00:30:17.978 "data_offset": 256, 00:30:17.978 "data_size": 7936 00:30:17.978 }, 00:30:17.978 { 00:30:17.978 "name": "BaseBdev2", 00:30:17.978 "uuid": "3d6abf49-1869-531f-99f9-27bb295efbcf", 00:30:17.978 "is_configured": true, 00:30:17.978 "data_offset": 256, 00:30:17.978 "data_size": 7936 00:30:17.978 } 00:30:17.978 ] 00:30:17.978 }' 00:30:17.978 22:59:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:17.978 22:59:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:18.547 22:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:18.547 22:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:18.547 22:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:18.547 22:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:18.547 22:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:18.547 22:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:18.547 22:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:18.806 22:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:18.806 "name": "raid_bdev1", 00:30:18.806 "uuid": "cde447ce-dd50-4a8e-8e4a-bc915b20295b", 00:30:18.806 "strip_size_kb": 0, 00:30:18.806 "state": "online", 00:30:18.806 "raid_level": "raid1", 00:30:18.806 "superblock": true, 00:30:18.806 "num_base_bdevs": 2, 00:30:18.806 "num_base_bdevs_discovered": 1, 00:30:18.806 "num_base_bdevs_operational": 1, 00:30:18.806 "base_bdevs_list": [ 00:30:18.806 { 00:30:18.806 "name": null, 00:30:18.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:18.806 "is_configured": false, 00:30:18.806 "data_offset": 256, 00:30:18.806 "data_size": 7936 00:30:18.806 }, 00:30:18.806 { 00:30:18.806 "name": "BaseBdev2", 00:30:18.806 "uuid": "3d6abf49-1869-531f-99f9-27bb295efbcf", 00:30:18.806 "is_configured": true, 00:30:18.806 "data_offset": 256, 00:30:18.806 "data_size": 7936 00:30:18.806 } 00:30:18.806 ] 00:30:18.806 }' 00:30:18.806 22:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:18.806 22:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:18.806 22:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:18.806 22:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:18.806 22:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:30:19.065 22:59:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:30:19.324 [2024-07-15 22:59:03.995141] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:30:19.324 [2024-07-15 22:59:03.995192] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:19.324 [2024-07-15 22:59:03.995215] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfc9fa0 00:30:19.324 [2024-07-15 22:59:03.995228] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:19.324 [2024-07-15 22:59:03.995402] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:19.324 [2024-07-15 22:59:03.995417] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:30:19.324 [2024-07-15 22:59:03.995465] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:30:19.324 [2024-07-15 22:59:03.995477] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:30:19.324 [2024-07-15 22:59:03.995488] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:30:19.324 BaseBdev1 00:30:19.324 22:59:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:30:20.259 22:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:20.259 22:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:20.259 22:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:20.259 22:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:30:20.259 22:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:20.259 22:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:20.259 22:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:20.259 22:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:20.259 22:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:20.259 22:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:20.259 22:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:20.259 22:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:20.518 22:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:20.518 "name": "raid_bdev1", 00:30:20.518 "uuid": "cde447ce-dd50-4a8e-8e4a-bc915b20295b", 00:30:20.518 "strip_size_kb": 0, 00:30:20.518 "state": "online", 00:30:20.518 "raid_level": "raid1", 00:30:20.518 "superblock": true, 00:30:20.518 "num_base_bdevs": 2, 00:30:20.518 "num_base_bdevs_discovered": 1, 00:30:20.518 "num_base_bdevs_operational": 1, 00:30:20.518 "base_bdevs_list": [ 00:30:20.518 { 00:30:20.518 "name": null, 00:30:20.518 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:20.518 "is_configured": false, 00:30:20.518 "data_offset": 256, 00:30:20.518 "data_size": 7936 00:30:20.518 }, 00:30:20.518 { 00:30:20.518 "name": "BaseBdev2", 00:30:20.518 "uuid": "3d6abf49-1869-531f-99f9-27bb295efbcf", 00:30:20.518 "is_configured": true, 00:30:20.518 "data_offset": 256, 00:30:20.518 
"data_size": 7936 00:30:20.518 } 00:30:20.518 ] 00:30:20.518 }' 00:30:20.518 22:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:20.518 22:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:21.084 22:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:21.084 22:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:21.084 22:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:21.084 22:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:21.084 22:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:21.084 22:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:21.084 22:59:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:21.344 22:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:21.344 "name": "raid_bdev1", 00:30:21.344 "uuid": "cde447ce-dd50-4a8e-8e4a-bc915b20295b", 00:30:21.344 "strip_size_kb": 0, 00:30:21.344 "state": "online", 00:30:21.344 "raid_level": "raid1", 00:30:21.344 "superblock": true, 00:30:21.344 "num_base_bdevs": 2, 00:30:21.344 "num_base_bdevs_discovered": 1, 00:30:21.344 "num_base_bdevs_operational": 1, 00:30:21.344 "base_bdevs_list": [ 00:30:21.344 { 00:30:21.344 "name": null, 00:30:21.344 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:21.344 "is_configured": false, 00:30:21.344 "data_offset": 256, 00:30:21.344 "data_size": 7936 00:30:21.344 }, 
00:30:21.344 { 00:30:21.344 "name": "BaseBdev2", 00:30:21.344 "uuid": "3d6abf49-1869-531f-99f9-27bb295efbcf", 00:30:21.344 "is_configured": true, 00:30:21.344 "data_offset": 256, 00:30:21.344 "data_size": 7936 00:30:21.344 } 00:30:21.344 ] 00:30:21.344 }' 00:30:21.344 22:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:21.344 22:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:21.344 22:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:21.344 22:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:21.344 22:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:30:21.344 22:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:30:21.344 22:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:30:21.344 22:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:21.344 22:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:30:21.344 22:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:21.344 22:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:30:21.344 22:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:21.344 22:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:30:21.344 22:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:21.344 22:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:30:21.344 22:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:30:21.603 [2024-07-15 22:59:06.393558] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:21.603 [2024-07-15 22:59:06.393689] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:30:21.603 [2024-07-15 22:59:06.393705] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:30:21.603 request: 00:30:21.603 { 00:30:21.603 "base_bdev": "BaseBdev1", 00:30:21.603 "raid_bdev": "raid_bdev1", 00:30:21.603 "method": "bdev_raid_add_base_bdev", 00:30:21.603 "req_id": 1 00:30:21.603 } 00:30:21.603 Got JSON-RPC error response 00:30:21.603 response: 00:30:21.603 { 00:30:21.603 "code": -22, 00:30:21.603 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:30:21.603 } 00:30:21.603 22:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:30:21.603 22:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 
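The `NOT` wrapper exercised above is a negative assertion: the test expects `bdev_raid_add_base_bdev raid_bdev1 BaseBdev1` to fail with JSON-RPC error `-22` (the superblock no longer contains this bdev's uuid), so the wrapper inverts the command's exit status and records it in `es`. A minimal sketch of that pattern (illustrative; the real helper in `autotest_common.sh` also validates the command with `type -t`/`type -P`, as the xtrace lines show):

```shell
# Simplified NOT helper: succeed only when the wrapped command fails.
NOT() {
    if "$@"; then
        return 1    # command unexpectedly succeeded: the negative test fails
    fi
    return 0        # command failed as expected
}

NOT false && echo "negative test passed"   # prints "negative test passed"
```

With this inversion, the expected RPC failure logged above ("Failed to add base bdev to RAID bdev: Invalid argument") counts as a pass for test step `@776`.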
00:30:21.603 22:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:30:21.603 22:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:30:21.603 22:59:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:30:22.539 22:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:22.539 22:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:22.539 22:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:22.539 22:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:22.539 22:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:22.539 22:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:22.539 22:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:22.539 22:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:22.539 22:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:22.539 22:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:22.539 22:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:22.539 22:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:22.798 22:59:07 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:22.798 "name": "raid_bdev1", 00:30:22.798 "uuid": "cde447ce-dd50-4a8e-8e4a-bc915b20295b", 00:30:22.798 "strip_size_kb": 0, 00:30:22.798 "state": "online", 00:30:22.798 "raid_level": "raid1", 00:30:22.798 "superblock": true, 00:30:22.798 "num_base_bdevs": 2, 00:30:22.798 "num_base_bdevs_discovered": 1, 00:30:22.798 "num_base_bdevs_operational": 1, 00:30:22.798 "base_bdevs_list": [ 00:30:22.798 { 00:30:22.798 "name": null, 00:30:22.798 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:22.798 "is_configured": false, 00:30:22.798 "data_offset": 256, 00:30:22.798 "data_size": 7936 00:30:22.798 }, 00:30:22.798 { 00:30:22.798 "name": "BaseBdev2", 00:30:22.798 "uuid": "3d6abf49-1869-531f-99f9-27bb295efbcf", 00:30:22.798 "is_configured": true, 00:30:22.798 "data_offset": 256, 00:30:22.798 "data_size": 7936 00:30:22.798 } 00:30:22.798 ] 00:30:22.798 }' 00:30:22.798 22:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:22.798 22:59:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:23.735 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:23.735 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:23.735 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:23.735 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:23.735 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:23.735 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:23.735 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:23.735 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:23.735 "name": "raid_bdev1", 00:30:23.735 "uuid": "cde447ce-dd50-4a8e-8e4a-bc915b20295b", 00:30:23.735 "strip_size_kb": 0, 00:30:23.735 "state": "online", 00:30:23.735 "raid_level": "raid1", 00:30:23.735 "superblock": true, 00:30:23.735 "num_base_bdevs": 2, 00:30:23.735 "num_base_bdevs_discovered": 1, 00:30:23.735 "num_base_bdevs_operational": 1, 00:30:23.735 "base_bdevs_list": [ 00:30:23.735 { 00:30:23.735 "name": null, 00:30:23.735 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:23.735 "is_configured": false, 00:30:23.735 "data_offset": 256, 00:30:23.735 "data_size": 7936 00:30:23.735 }, 00:30:23.735 { 00:30:23.735 "name": "BaseBdev2", 00:30:23.735 "uuid": "3d6abf49-1869-531f-99f9-27bb295efbcf", 00:30:23.735 "is_configured": true, 00:30:23.735 "data_offset": 256, 00:30:23.735 "data_size": 7936 00:30:23.735 } 00:30:23.735 ] 00:30:23.735 }' 00:30:23.735 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:23.735 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:23.735 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:23.735 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:23.735 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 2862915 00:30:23.735 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 2862915 ']' 00:30:23.735 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
common/autotest_common.sh@952 -- # kill -0 2862915 00:30:23.735 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:30:23.735 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:23.735 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2862915 00:30:23.994 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:23.994 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:23.994 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2862915' 00:30:23.994 killing process with pid 2862915 00:30:23.994 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 2862915 00:30:23.994 Received shutdown signal, test time was about 60.000000 seconds 00:30:23.994 00:30:23.994 Latency(us) 00:30:23.994 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:23.994 =================================================================================================================== 00:30:23.994 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:30:23.994 [2024-07-15 22:59:08.664813] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:30:23.994 [2024-07-15 22:59:08.664903] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:23.994 [2024-07-15 22:59:08.664956] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:23.994 [2024-07-15 22:59:08.664970] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11640d0 name raid_bdev1, state offline 00:30:23.994 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved 
-- common/autotest_common.sh@972 -- # wait 2862915 00:30:23.994 [2024-07-15 22:59:08.692826] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:30:24.254 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:30:24.254 00:30:24.254 real 0m29.092s 00:30:24.254 user 0m46.322s 00:30:24.254 sys 0m3.963s 00:30:24.254 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:24.254 22:59:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:24.254 ************************************ 00:30:24.254 END TEST raid_rebuild_test_sb_md_interleaved 00:30:24.254 ************************************ 00:30:24.254 22:59:08 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:30:24.254 22:59:08 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:30:24.254 22:59:08 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:30:24.254 22:59:08 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 2862915 ']' 00:30:24.254 22:59:08 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 2862915 00:30:24.254 22:59:08 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:30:24.254 00:30:24.254 real 19m26.358s 00:30:24.254 user 33m1.220s 00:30:24.254 sys 3m31.209s 00:30:24.254 22:59:09 bdev_raid -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:24.254 22:59:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:30:24.254 ************************************ 00:30:24.254 END TEST bdev_raid 00:30:24.254 ************************************ 00:30:24.254 22:59:09 -- common/autotest_common.sh@1142 -- # return 0 00:30:24.254 22:59:09 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:30:24.254 22:59:09 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:30:24.254 22:59:09 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:24.254 22:59:09 -- 
common/autotest_common.sh@10 -- # set +x 00:30:24.254 ************************************ 00:30:24.254 START TEST bdevperf_config 00:30:24.254 ************************************ 00:30:24.254 22:59:09 bdevperf_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:30:24.513 * Looking for test storage... 00:30:24.513 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:30:24.513 22:59:09 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:30:24.513 22:59:09 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:30:24.513 22:59:09 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:30:24.513 22:59:09 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:24.513 22:59:09 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:24.513 22:59:09 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:30:24.513 22:59:09 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:30:24.513 22:59:09 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:30:24.513 22:59:09 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:24.514 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@20 -- # 
cat 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:24.514 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:24.514 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:24.514 00:30:24.514 22:59:09 bdevperf_config -- 
bdevperf/common.sh@20 -- # cat 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:24.514 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:24.514 22:59:09 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:27.142 22:59:11 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-15 22:59:09.303352] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:30:27.142 [2024-07-15 22:59:09.303419] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2867073 ] 00:30:27.142 Using job config with 4 jobs 00:30:27.142 [2024-07-15 22:59:09.448043] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:27.142 [2024-07-15 22:59:09.563575] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:27.142 cpumask for '\''job0'\'' is too big 00:30:27.142 cpumask for '\''job1'\'' is too big 00:30:27.142 cpumask for '\''job2'\'' is too big 00:30:27.142 cpumask for '\''job3'\'' is too big 00:30:27.142 Running I/O for 2 seconds... 00:30:27.142 00:30:27.142 Latency(us) 00:30:27.142 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:27.142 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:27.142 Malloc0 : 2.01 23896.63 23.34 0.00 0.00 10697.68 1880.60 16412.49 00:30:27.142 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:27.142 Malloc0 : 2.02 23874.50 23.31 0.00 0.00 10683.75 1852.10 14588.88 00:30:27.142 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:27.142 Malloc0 : 2.02 23915.56 23.36 0.00 0.00 10641.94 1852.10 12708.29 00:30:27.142 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:27.142 Malloc0 : 2.02 23893.65 23.33 0.00 0.00 10627.76 1852.10 10998.65 00:30:27.142 =================================================================================================================== 00:30:27.142 Total : 95580.34 93.34 0.00 0.00 10662.71 1852.10 16412.49' 00:30:27.142 22:59:11 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-15 22:59:09.303352] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:30:27.142 [2024-07-15 22:59:09.303419] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2867073 ] 00:30:27.142 Using job config with 4 jobs 00:30:27.142 [2024-07-15 22:59:09.448043] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:27.142 [2024-07-15 22:59:09.563575] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:27.142 cpumask for '\''job0'\'' is too big 00:30:27.142 cpumask for '\''job1'\'' is too big 00:30:27.142 cpumask for '\''job2'\'' is too big 00:30:27.142 cpumask for '\''job3'\'' is too big 00:30:27.142 Running I/O for 2 seconds... 00:30:27.142 00:30:27.142 Latency(us) 00:30:27.142 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:27.142 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:27.142 Malloc0 : 2.01 23896.63 23.34 0.00 0.00 10697.68 1880.60 16412.49 00:30:27.142 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:27.142 Malloc0 : 2.02 23874.50 23.31 0.00 0.00 10683.75 1852.10 14588.88 00:30:27.142 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:27.142 Malloc0 : 2.02 23915.56 23.36 0.00 0.00 10641.94 1852.10 12708.29 00:30:27.142 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:27.142 Malloc0 : 2.02 23893.65 23.33 0.00 0.00 10627.76 1852.10 10998.65 00:30:27.142 =================================================================================================================== 00:30:27.142 Total : 95580.34 93.34 0.00 0.00 10662.71 1852.10 16412.49' 00:30:27.142 22:59:11 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:30:27.142 22:59:11 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 22:59:09.303352] Starting 
SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:30:27.142 [2024-07-15 22:59:09.303419] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2867073 ] 00:30:27.142 Using job config with 4 jobs 00:30:27.142 [2024-07-15 22:59:09.448043] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:27.142 [2024-07-15 22:59:09.563575] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:27.142 cpumask for '\''job0'\'' is too big 00:30:27.142 cpumask for '\''job1'\'' is too big 00:30:27.142 cpumask for '\''job2'\'' is too big 00:30:27.142 cpumask for '\''job3'\'' is too big 00:30:27.142 Running I/O for 2 seconds... 00:30:27.142 00:30:27.142 Latency(us) 00:30:27.142 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:27.142 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:27.142 Malloc0 : 2.01 23896.63 23.34 0.00 0.00 10697.68 1880.60 16412.49 00:30:27.142 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:27.142 Malloc0 : 2.02 23874.50 23.31 0.00 0.00 10683.75 1852.10 14588.88 00:30:27.142 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:27.142 Malloc0 : 2.02 23915.56 23.36 0.00 0.00 10641.94 1852.10 12708.29 00:30:27.142 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:27.142 Malloc0 : 2.02 23893.65 23.33 0.00 0.00 10627.76 1852.10 10998.65 00:30:27.142 =================================================================================================================== 00:30:27.142 Total : 95580.34 93.34 0.00 0.00 10662.71 1852.10 16412.49' 00:30:27.142 22:59:11 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:30:27.142 22:59:11 bdevperf_config -- bdevperf/test_config.sh@23 -- # 
[[ 4 == \4 ]] 00:30:27.142 22:59:11 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:27.401 [2024-07-15 22:59:12.107994] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:30:27.401 [2024-07-15 22:59:12.108134] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2867427 ] 00:30:27.659 [2024-07-15 22:59:12.318793] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:27.659 [2024-07-15 22:59:12.446396] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:27.917 cpumask for 'job0' is too big 00:30:27.917 cpumask for 'job1' is too big 00:30:27.917 cpumask for 'job2' is too big 00:30:27.917 cpumask for 'job3' is too big 00:30:30.448 22:59:14 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:30:30.448 Running I/O for 2 seconds... 
00:30:30.448 00:30:30.448 Latency(us) 00:30:30.448 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:30.448 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:30.448 Malloc0 : 2.01 23900.26 23.34 0.00 0.00 10698.14 1866.35 16412.49 00:30:30.448 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:30.448 Malloc0 : 2.02 23878.17 23.32 0.00 0.00 10682.93 1852.10 14531.90 00:30:30.448 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:30.448 Malloc0 : 2.02 23919.31 23.36 0.00 0.00 10640.80 1852.10 12651.30 00:30:30.448 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:30.448 Malloc0 : 2.02 23897.45 23.34 0.00 0.00 10626.28 1837.86 10941.66 00:30:30.448 =================================================================================================================== 00:30:30.448 Total : 95595.19 93.35 0.00 0.00 10661.96 1837.86 16412.49' 00:30:30.448 22:59:14 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:30:30.448 22:59:14 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:30.448 22:59:14 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:30:30.448 22:59:14 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:30:30.448 22:59:14 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:30:30.448 22:59:14 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:30:30.448 22:59:14 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:30:30.448 22:59:14 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:30:30.448 22:59:14 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:30.448 00:30:30.448 22:59:14 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:30.449 22:59:14 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job 
job1 write Malloc0 00:30:30.449 22:59:14 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:30:30.449 22:59:14 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:30:30.449 22:59:14 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:30:30.449 22:59:14 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:30:30.449 22:59:14 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:30:30.449 22:59:14 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:30.449 00:30:30.449 22:59:14 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:30.449 22:59:14 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:30:30.449 22:59:14 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:30:30.449 22:59:14 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:30:30.449 22:59:14 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:30:30.449 22:59:14 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:30:30.449 22:59:14 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:30:30.449 22:59:14 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:30.449 00:30:30.449 22:59:14 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:30.449 22:59:14 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-15 22:59:15.019249] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:30:32.984 [2024-07-15 22:59:15.019387] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2867781 ] 00:30:32.984 Using job config with 3 jobs 00:30:32.984 [2024-07-15 22:59:15.241128] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:32.984 [2024-07-15 22:59:15.365263] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:32.984 cpumask for '\''job0'\'' is too big 00:30:32.984 cpumask for '\''job1'\'' is too big 00:30:32.984 cpumask for '\''job2'\'' is too big 00:30:32.984 Running I/O for 2 seconds... 00:30:32.984 00:30:32.984 Latency(us) 00:30:32.984 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:32.984 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:32.984 Malloc0 : 2.01 32330.24 31.57 0.00 0.00 7904.37 1823.61 11625.52 00:30:32.984 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:32.984 Malloc0 : 2.01 32300.21 31.54 0.00 0.00 7893.51 1816.49 9801.91 00:30:32.984 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:32.984 Malloc0 : 2.02 32354.46 31.60 0.00 0.00 7862.67 940.30 8149.26 00:30:32.984 =================================================================================================================== 00:30:32.984 Total : 96984.90 94.71 0.00 0.00 7886.82 940.30 11625.52' 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-15 22:59:15.019249] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:30:32.984 [2024-07-15 22:59:15.019387] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2867781 ] 00:30:32.984 Using job config with 3 jobs 00:30:32.984 [2024-07-15 22:59:15.241128] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:32.984 [2024-07-15 22:59:15.365263] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:32.984 cpumask for '\''job0'\'' is too big 00:30:32.984 cpumask for '\''job1'\'' is too big 00:30:32.984 cpumask for '\''job2'\'' is too big 00:30:32.984 Running I/O for 2 seconds... 00:30:32.984 00:30:32.984 Latency(us) 00:30:32.984 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:32.984 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:32.984 Malloc0 : 2.01 32330.24 31.57 0.00 0.00 7904.37 1823.61 11625.52 00:30:32.984 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:32.984 Malloc0 : 2.01 32300.21 31.54 0.00 0.00 7893.51 1816.49 9801.91 00:30:32.984 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:32.984 Malloc0 : 2.02 32354.46 31.60 0.00 0.00 7862.67 940.30 8149.26 00:30:32.984 =================================================================================================================== 00:30:32.984 Total : 96984.90 94.71 0.00 0.00 7886.82 940.30 11625.52' 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 22:59:15.019249] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:30:32.984 [2024-07-15 22:59:15.019387] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2867781 ] 00:30:32.984 Using job config with 3 jobs 00:30:32.984 [2024-07-15 22:59:15.241128] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:32.984 [2024-07-15 22:59:15.365263] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:32.984 cpumask for '\''job0'\'' is too big 00:30:32.984 cpumask for '\''job1'\'' is too big 00:30:32.984 cpumask for '\''job2'\'' is too big 00:30:32.984 Running I/O for 2 seconds... 00:30:32.984 00:30:32.984 Latency(us) 00:30:32.984 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:32.984 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:32.984 Malloc0 : 2.01 32330.24 31.57 0.00 0.00 7904.37 1823.61 11625.52 00:30:32.984 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:32.984 Malloc0 : 2.01 32300.21 31.54 0.00 0.00 7893.51 1816.49 9801.91 00:30:32.984 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:32.984 Malloc0 : 2.02 32354.46 31.60 0.00 0.00 7862.67 940.30 8149.26 00:30:32.984 =================================================================================================================== 00:30:32.984 Total : 96984.90 94.71 0.00 0.00 7886.82 940.30 11625.52' 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:32.984 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:32.984 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:30:32.984 22:59:17 
bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:32.984 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:30:32.984 22:59:17 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:32.984 00:30:32.985 22:59:17 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:32.985 22:59:17 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:30:32.985 22:59:17 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:30:32.985 22:59:17 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:32.985 22:59:17 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:32.985 22:59:17 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:30:32.985 22:59:17 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:30:32.985 22:59:17 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:32.985 00:30:32.985 22:59:17 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:32.985 22:59:17 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:36.276 22:59:20 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-15 22:59:17.872720] 
Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:30:36.276 [2024-07-15 22:59:17.872789] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2868147 ] 00:30:36.276 Using job config with 4 jobs 00:30:36.276 [2024-07-15 22:59:18.020283] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:36.276 [2024-07-15 22:59:18.145057] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:36.276 cpumask for '\''job0'\'' is too big 00:30:36.276 cpumask for '\''job1'\'' is too big 00:30:36.276 cpumask for '\''job2'\'' is too big 00:30:36.276 cpumask for '\''job3'\'' is too big 00:30:36.276 Running I/O for 2 seconds... 00:30:36.276 00:30:36.276 Latency(us) 00:30:36.276 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:36.276 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:36.276 Malloc0 : 2.02 11885.86 11.61 0.00 0.00 21515.83 3846.68 33280.89 00:30:36.276 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:36.276 Malloc1 : 2.03 11874.66 11.60 0.00 0.00 21515.46 4673.00 33280.89 00:30:36.276 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:36.276 Malloc0 : 2.04 11895.21 11.62 0.00 0.00 21401.77 3818.18 29405.72 00:30:36.276 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:36.276 Malloc1 : 2.05 11884.17 11.61 0.00 0.00 21402.76 4644.51 29405.72 00:30:36.276 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:36.276 Malloc0 : 2.05 11873.42 11.60 0.00 0.00 21347.12 3789.69 25530.55 00:30:36.276 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:36.276 
Malloc1 : 2.05 11862.45 11.58 0.00 0.00 21345.06 4673.00 25530.55 00:30:36.276 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:36.276 Malloc0 : 2.05 11851.75 11.57 0.00 0.00 21286.97 3789.69 23820.91 00:30:36.276 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:36.276 Malloc1 : 2.05 11840.82 11.56 0.00 0.00 21286.43 4644.51 23820.91 00:30:36.276 =================================================================================================================== 00:30:36.276 Total : 94968.34 92.74 0.00 0.00 21387.34 3789.69 33280.89' 00:30:36.276 22:59:20 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-15 22:59:17.872720] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:30:36.276 [2024-07-15 22:59:17.872789] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2868147 ] 00:30:36.276 Using job config with 4 jobs 00:30:36.276 [2024-07-15 22:59:18.020283] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:36.276 [2024-07-15 22:59:18.145057] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:36.276 cpumask for '\''job0'\'' is too big 00:30:36.276 cpumask for '\''job1'\'' is too big 00:30:36.276 cpumask for '\''job2'\'' is too big 00:30:36.277 cpumask for '\''job3'\'' is too big 00:30:36.277 Running I/O for 2 seconds... 
00:30:36.277 00:30:36.277 Latency(us) 00:30:36.277 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:36.277 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:36.277 Malloc0 : 2.02 11885.86 11.61 0.00 0.00 21515.83 3846.68 33280.89 00:30:36.277 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:36.277 Malloc1 : 2.03 11874.66 11.60 0.00 0.00 21515.46 4673.00 33280.89 00:30:36.277 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:36.277 Malloc0 : 2.04 11895.21 11.62 0.00 0.00 21401.77 3818.18 29405.72 00:30:36.277 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:36.277 Malloc1 : 2.05 11884.17 11.61 0.00 0.00 21402.76 4644.51 29405.72 00:30:36.277 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:36.277 Malloc0 : 2.05 11873.42 11.60 0.00 0.00 21347.12 3789.69 25530.55 00:30:36.277 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:36.277 Malloc1 : 2.05 11862.45 11.58 0.00 0.00 21345.06 4673.00 25530.55 00:30:36.277 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:36.277 Malloc0 : 2.05 11851.75 11.57 0.00 0.00 21286.97 3789.69 23820.91 00:30:36.277 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:36.277 Malloc1 : 2.05 11840.82 11.56 0.00 0.00 21286.43 4644.51 23820.91 00:30:36.277 =================================================================================================================== 00:30:36.277 Total : 94968.34 92.74 0.00 0.00 21387.34 3789.69 33280.89' 00:30:36.277 22:59:20 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 22:59:17.872720] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:30:36.277 [2024-07-15 22:59:17.872789] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2868147 ] 00:30:36.277 Using job config with 4 jobs 00:30:36.277 [2024-07-15 22:59:18.020283] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:36.277 [2024-07-15 22:59:18.145057] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:36.277 cpumask for '\''job0'\'' is too big 00:30:36.277 cpumask for '\''job1'\'' is too big 00:30:36.277 cpumask for '\''job2'\'' is too big 00:30:36.277 cpumask for '\''job3'\'' is too big 00:30:36.277 Running I/O for 2 seconds... 00:30:36.277 00:30:36.277 Latency(us) 00:30:36.277 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:36.277 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:36.277 Malloc0 : 2.02 11885.86 11.61 0.00 0.00 21515.83 3846.68 33280.89 00:30:36.277 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:36.277 Malloc1 : 2.03 11874.66 11.60 0.00 0.00 21515.46 4673.00 33280.89 00:30:36.277 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:36.277 Malloc0 : 2.04 11895.21 11.62 0.00 0.00 21401.77 3818.18 29405.72 00:30:36.277 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:36.277 Malloc1 : 2.05 11884.17 11.61 0.00 0.00 21402.76 4644.51 29405.72 00:30:36.277 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:36.277 Malloc0 : 2.05 11873.42 11.60 0.00 0.00 21347.12 3789.69 25530.55 00:30:36.277 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:36.277 Malloc1 : 2.05 11862.45 11.58 0.00 0.00 21345.06 4673.00 25530.55 00:30:36.277 
Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:36.277 Malloc0 : 2.05 11851.75 11.57 0.00 0.00 21286.97 3789.69 23820.91 00:30:36.277 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:36.277 Malloc1 : 2.05 11840.82 11.56 0.00 0.00 21286.43 4644.51 23820.91 00:30:36.277 =================================================================================================================== 00:30:36.277 Total : 94968.34 92.74 0.00 0.00 21387.34 3789.69 33280.89' 00:30:36.277 22:59:20 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:30:36.277 22:59:20 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:30:36.277 22:59:20 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:30:36.277 22:59:20 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:30:36.277 22:59:20 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:36.277 22:59:20 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:30:36.277 00:30:36.277 real 0m11.547s 00:30:36.277 user 0m10.054s 00:30:36.277 sys 0m1.311s 00:30:36.277 22:59:20 bdevperf_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:36.277 22:59:20 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:30:36.277 ************************************ 00:30:36.277 END TEST bdevperf_config 00:30:36.277 ************************************ 00:30:36.277 22:59:20 -- common/autotest_common.sh@1142 -- # return 0 00:30:36.277 22:59:20 -- spdk/autotest.sh@192 -- # uname -s 00:30:36.277 22:59:20 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:30:36.277 22:59:20 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:30:36.277 22:59:20 -- common/autotest_common.sh@1099 
-- # '[' 2 -le 1 ']' 00:30:36.277 22:59:20 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:36.277 22:59:20 -- common/autotest_common.sh@10 -- # set +x 00:30:36.277 ************************************ 00:30:36.277 START TEST reactor_set_interrupt 00:30:36.277 ************************************ 00:30:36.277 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:30:36.277 * Looking for test storage... 00:30:36.277 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:36.277 22:59:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:30:36.277 22:59:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:30:36.277 22:59:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:36.277 22:59:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:36.277 22:59:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
00:30:36.277 22:59:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:36.277 22:59:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:30:36.277 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:30:36.277 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:30:36.277 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:30:36.277 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:30:36.277 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:30:36.277 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:30:36.277 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:30:36.277 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:30:36.277 22:59:20 reactor_set_interrupt -- 
common/build_config.sh@8 -- # CONFIG_RBD=n 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:30:36.277 22:59:20 
reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:30:36.277 22:59:20 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:30:36.278 22:59:20 reactor_set_interrupt -- 
common/build_config.sh@48 -- # CONFIG_RDMA=y 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:30:36.278 22:59:20 
reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:30:36.278 22:59:20 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:30:36.278 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:30:36.278 22:59:20 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:30:36.278 22:59:20 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 
00:30:36.278 22:59:20 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:30:36.278 22:59:20 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:36.278 22:59:20 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:36.278 22:59:20 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:30:36.278 22:59:20 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:36.278 22:59:20 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:30:36.278 22:59:20 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:30:36.278 22:59:20 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:30:36.278 22:59:20 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:30:36.278 22:59:20 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:30:36.278 22:59:20 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:30:36.278 22:59:20 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:30:36.278 22:59:20 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:30:36.278 #define SPDK_CONFIG_H 00:30:36.278 #define SPDK_CONFIG_APPS 1 00:30:36.278 #define SPDK_CONFIG_ARCH native 00:30:36.278 #undef SPDK_CONFIG_ASAN 00:30:36.278 #undef SPDK_CONFIG_AVAHI 00:30:36.278 #undef SPDK_CONFIG_CET 00:30:36.278 #define SPDK_CONFIG_COVERAGE 1 00:30:36.278 #define SPDK_CONFIG_CROSS_PREFIX 
00:30:36.278 #define SPDK_CONFIG_CRYPTO 1 00:30:36.278 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:30:36.278 #undef SPDK_CONFIG_CUSTOMOCF 00:30:36.278 #undef SPDK_CONFIG_DAOS 00:30:36.278 #define SPDK_CONFIG_DAOS_DIR 00:30:36.278 #define SPDK_CONFIG_DEBUG 1 00:30:36.278 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:30:36.278 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:30:36.278 #define SPDK_CONFIG_DPDK_INC_DIR 00:30:36.278 #define SPDK_CONFIG_DPDK_LIB_DIR 00:30:36.278 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:30:36.278 #undef SPDK_CONFIG_DPDK_UADK 00:30:36.278 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:36.278 #define SPDK_CONFIG_EXAMPLES 1 00:30:36.278 #undef SPDK_CONFIG_FC 00:30:36.278 #define SPDK_CONFIG_FC_PATH 00:30:36.278 #define SPDK_CONFIG_FIO_PLUGIN 1 00:30:36.278 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:30:36.278 #undef SPDK_CONFIG_FUSE 00:30:36.278 #undef SPDK_CONFIG_FUZZER 00:30:36.278 #define SPDK_CONFIG_FUZZER_LIB 00:30:36.278 #undef SPDK_CONFIG_GOLANG 00:30:36.278 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:30:36.278 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:30:36.278 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:30:36.278 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:30:36.278 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:30:36.278 #undef SPDK_CONFIG_HAVE_LIBBSD 00:30:36.278 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:30:36.278 #define SPDK_CONFIG_IDXD 1 00:30:36.278 #define SPDK_CONFIG_IDXD_KERNEL 1 00:30:36.278 #define SPDK_CONFIG_IPSEC_MB 1 00:30:36.278 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:36.278 #define SPDK_CONFIG_ISAL 1 00:30:36.278 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:30:36.278 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:30:36.278 #define SPDK_CONFIG_LIBDIR 00:30:36.278 #undef SPDK_CONFIG_LTO 00:30:36.278 #define SPDK_CONFIG_MAX_LCORES 128 00:30:36.278 #define SPDK_CONFIG_NVME_CUSE 1 00:30:36.278 #undef 
SPDK_CONFIG_OCF 00:30:36.278 #define SPDK_CONFIG_OCF_PATH 00:30:36.278 #define SPDK_CONFIG_OPENSSL_PATH 00:30:36.278 #undef SPDK_CONFIG_PGO_CAPTURE 00:30:36.278 #define SPDK_CONFIG_PGO_DIR 00:30:36.278 #undef SPDK_CONFIG_PGO_USE 00:30:36.278 #define SPDK_CONFIG_PREFIX /usr/local 00:30:36.278 #undef SPDK_CONFIG_RAID5F 00:30:36.278 #undef SPDK_CONFIG_RBD 00:30:36.278 #define SPDK_CONFIG_RDMA 1 00:30:36.278 #define SPDK_CONFIG_RDMA_PROV verbs 00:30:36.278 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:30:36.278 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:30:36.278 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:30:36.278 #define SPDK_CONFIG_SHARED 1 00:30:36.278 #undef SPDK_CONFIG_SMA 00:30:36.278 #define SPDK_CONFIG_TESTS 1 00:30:36.278 #undef SPDK_CONFIG_TSAN 00:30:36.278 #define SPDK_CONFIG_UBLK 1 00:30:36.278 #define SPDK_CONFIG_UBSAN 1 00:30:36.278 #undef SPDK_CONFIG_UNIT_TESTS 00:30:36.278 #undef SPDK_CONFIG_URING 00:30:36.278 #define SPDK_CONFIG_URING_PATH 00:30:36.278 #undef SPDK_CONFIG_URING_ZNS 00:30:36.278 #undef SPDK_CONFIG_USDT 00:30:36.278 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:30:36.278 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:30:36.278 #undef SPDK_CONFIG_VFIO_USER 00:30:36.278 #define SPDK_CONFIG_VFIO_USER_DIR 00:30:36.278 #define SPDK_CONFIG_VHOST 1 00:30:36.278 #define SPDK_CONFIG_VIRTIO 1 00:30:36.278 #undef SPDK_CONFIG_VTUNE 00:30:36.278 #define SPDK_CONFIG_VTUNE_DIR 00:30:36.278 #define SPDK_CONFIG_WERROR 1 00:30:36.278 #define SPDK_CONFIG_WPDK_DIR 00:30:36.278 #undef SPDK_CONFIG_XNVME 00:30:36.278 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:30:36.278 22:59:20 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:30:36.278 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:36.278 22:59:20 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 
00:30:36.278 22:59:20 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:36.278 22:59:20 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:36.278 22:59:20 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:36.278 22:59:20 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:36.279 22:59:20 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:36.279 22:59:20 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:30:36.279 22:59:20 reactor_set_interrupt -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:30:36.279 22:59:20 
reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:30:36.279 22:59:20 reactor_set_interrupt -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 
00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 
0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- 
common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:30:36.279 22:59:20 
reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:30:36.279 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0 00:30:36.280 
22:59:20 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@167 -- # : 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:36.280 22:59:20 reactor_set_interrupt -- 
common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@238 -- # 
LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:36.280 22:59:20 reactor_set_interrupt -- 
common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind= 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind= 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@299 -- # 
TEST_MODE= 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 2868566 ]] 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 2868566 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@331 -- # local mount target_dir 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.lmC50x 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.lmC50x/tests/interrupt /tmp/spdk.lmC50x 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs 
size use avail _ mount 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T 00:30:36.280 22:59:20 reactor_set_interrupt -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:30:36.280 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:30:36.280 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:30:36.280 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:30:36.280 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:30:36.280 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:30:36.280 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=946290688 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4338139136 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=88602537984 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508515328 00:30:36.281 22:59:21 
reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=5905977344 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47249547264 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=18892316672 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901704704 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=9388032 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47253360640 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:30:36.281 22:59:21 reactor_set_interrupt -- 
common/autotest_common.sh@363 -- # uses["$mount"]=897024 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=9450844160 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450848256 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:30:36.281 * Looking for test storage... 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/ 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=88602537984 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:30:36.281 22:59:21 reactor_set_interrupt 
-- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=8120569856 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:36.281 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # set -o errtrace 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@1687 -- # true 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@1689 -- # xtrace_fd 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:30:36.281 22:59:21 reactor_set_interrupt 
-- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:30:36.281 22:59:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:30:36.281 22:59:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:36.281 22:59:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:30:36.281 22:59:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:30:36.281 22:59:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:30:36.281 22:59:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:30:36.281 22:59:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:30:36.281 22:59:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:36.281 22:59:21 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:36.281 22:59:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:30:36.281 22:59:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:36.281 22:59:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:30:36.281 22:59:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=2868710 00:30:36.281 22:59:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:36.281 22:59:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:30:36.281 22:59:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 2868710 /var/tmp/spdk.sock 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 2868710 ']' 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:36.281 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:36.281 22:59:21 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:30:36.281 [2024-07-15 22:59:21.073598] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:30:36.282 [2024-07-15 22:59:21.073667] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2868710 ] 00:30:36.544 [2024-07-15 22:59:21.204667] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:36.544 [2024-07-15 22:59:21.306531] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:36.544 [2024-07-15 22:59:21.306631] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:36.544 [2024-07-15 22:59:21.306632] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:36.544 [2024-07-15 22:59:21.377810] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:30:37.482 22:59:22 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:37.482 22:59:22 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:30:37.482 22:59:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:30:37.482 22:59:22 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:37.482 Malloc0 00:30:37.482 Malloc1 00:30:37.482 Malloc2 00:30:37.482 22:59:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:30:37.482 22:59:22 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:30:37.482 22:59:22 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:30:37.482 22:59:22 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:30:37.482 5000+0 records in 00:30:37.482 5000+0 records out 00:30:37.482 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0278491 s, 368 MB/s 00:30:37.482 22:59:22 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:30:37.741 AIO0 00:30:37.741 22:59:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 2868710 00:30:37.741 22:59:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 2868710 without_thd 00:30:37.741 22:59:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=2868710 00:30:37.741 22:59:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:30:37.741 22:59:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 
00:30:38.000 22:59:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:30:38.000 22:59:22 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:30:38.000 22:59:22 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:38.000 22:59:22 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:30:38.000 22:59:22 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:38.000 22:59:22 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:38.000 22:59:22 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:38.257 22:59:22 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:30:38.257 22:59:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:30:38.257 22:59:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:30:38.257 22:59:22 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:30:38.257 22:59:22 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:38.257 22:59:22 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:30:38.257 22:59:22 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:38.257 22:59:22 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:38.257 22:59:22 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@62 -- # 
echo '' 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:30:38.516 spdk_thread ids are 1 on reactor0. 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2868710 0 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2868710 0 idle 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2868710 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2868710 -w 256 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2868710 root 20 0 128.2g 36864 23616 S 6.7 0.0 0:00.40 reactor_0' 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2868710 root 20 0 128.2g 36864 23616 S 6.7 0.0 0:00.40 reactor_0 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:38.516 22:59:23 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # awk '{print $9}' 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=6.7 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=6 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 6 -gt 30 ]] 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2868710 1 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2868710 1 idle 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2868710 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:30:38.516 22:59:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2868710 -w 256 00:30:38.775 22:59:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2868744 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1' 
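The `top` parsing traced above (grab the reactor thread's line, strip leading whitespace with `sed`, take field 9 with `awk`, then drop the fractional part) can be reproduced stand-alone. This is a minimal sketch, not the exact `common.sh` code; the sample line is copied from this log:

```shell
# Sample `top -bHn 1` line for a reactor thread, taken from the log above
top_reactor='2868710 root 20 0 128.2g 36864 23616 S 6.7 0.0 0:00.40 reactor_0'

# Field 9 of top's batch output is %CPU; strip leading spaces first,
# exactly as the sed/awk pipeline in the trace does
cpu_rate=$(echo "$top_reactor" | sed -e 's/^\s*//g' | awk '{print $9}')

# common.sh truncates to an integer before its threshold compare (6.7 -> 6)
cpu_rate=${cpu_rate%.*}

echo "$cpu_rate"   # 6
```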
00:30:38.775 22:59:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2868744 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1 00:30:38.775 22:59:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:38.775 22:59:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:38.775 22:59:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:38.775 22:59:23 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:38.775 22:59:23 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:38.775 22:59:23 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:38.775 22:59:23 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:38.775 22:59:23 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:38.775 22:59:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:38.775 22:59:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2868710 2 00:30:38.775 22:59:23 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2868710 2 idle 00:30:38.775 22:59:23 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2868710 00:30:38.775 22:59:23 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:38.775 22:59:23 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:38.775 22:59:23 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:38.775 22:59:23 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:38.775 22:59:23 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:38.775 22:59:23 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:38.775 22:59:23 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:38.775 22:59:23 reactor_set_interrupt -- 
interrupt/common.sh@24 -- # top -bHn 1 -p 2868710 -w 256 00:30:38.775 22:59:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:39.035 22:59:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2868745 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2' 00:30:39.035 22:59:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2868745 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2 00:30:39.035 22:59:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:39.035 22:59:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:39.035 22:59:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:39.035 22:59:23 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:39.035 22:59:23 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:39.035 22:59:23 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:39.035 22:59:23 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:39.035 22:59:23 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:39.035 22:59:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:30:39.035 22:59:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 00:30:39.035 22:59:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:30:39.602 [2024-07-15 22:59:24.239724] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:30:39.602 22:59:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:30:39.602 [2024-07-15 22:59:24.503349] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:30:39.602 [2024-07-15 22:59:24.503788] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:39.861 22:59:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:30:39.861 [2024-07-15 22:59:24.747272] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:30:39.861 [2024-07-15 22:59:24.747515] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:39.861 22:59:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:39.861 22:59:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2868710 0 00:30:39.861 22:59:24 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2868710 0 busy 00:30:39.861 22:59:24 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2868710 00:30:39.861 22:59:24 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:39.861 22:59:24 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:39.861 22:59:24 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:39.861 22:59:24 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:39.861 22:59:24 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:39.861 22:59:24 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:40.120 22:59:24 reactor_set_interrupt 
-- interrupt/common.sh@24 -- # top -bHn 1 -p 2868710 -w 256 00:30:40.120 22:59:24 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:40.120 22:59:24 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2868710 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.83 reactor_0' 00:30:40.120 22:59:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2868710 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.83 reactor_0 00:30:40.120 22:59:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:40.120 22:59:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:40.120 22:59:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:40.120 22:59:24 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:40.120 22:59:24 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:40.120 22:59:24 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:40.120 22:59:24 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:40.120 22:59:24 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:40.120 22:59:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:40.120 22:59:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2868710 2 00:30:40.120 22:59:24 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2868710 2 busy 00:30:40.120 22:59:24 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2868710 00:30:40.120 22:59:24 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:40.120 22:59:24 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:40.120 22:59:24 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:40.120 22:59:24 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:40.120 22:59:24 
reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:40.120 22:59:24 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:40.120 22:59:24 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2868710 -w 256 00:30:40.120 22:59:24 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:40.378 22:59:25 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2868745 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.35 reactor_2' 00:30:40.378 22:59:25 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2868745 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.35 reactor_2 00:30:40.378 22:59:25 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:40.378 22:59:25 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:40.378 22:59:25 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:40.378 22:59:25 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:40.378 22:59:25 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:40.378 22:59:25 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:40.378 22:59:25 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:40.378 22:59:25 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:40.378 22:59:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:30:40.636 [2024-07-15 22:59:25.351268] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
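The busy/idle checks interleaved through the trace reduce to two integer comparisons with the thresholds visible in this log: a "busy" check fails when the rate is below 70% (`[[ 99 -lt 70 ]]` is the failure branch), and an "idle" check fails when it is above 30% (`[[ 6 -gt 30 ]]`). A minimal sketch of that decision, using a hypothetical helper name rather than the real `reactor_is_busy_or_idle`:

```shell
# Hypothetical helper mirroring the thresholds seen in common.sh's trace:
# "busy" requires cpu_rate >= 70, "idle" requires cpu_rate <= 30.
reactor_state_ok() {
    local cpu_rate=$1 state=$2
    if [ "$state" = busy ]; then
        [ "$cpu_rate" -ge 70 ]
    else
        [ "$cpu_rate" -le 30 ]
    fi
}

reactor_state_ok 99 busy && echo "reactor is busy"   # matches the 99.9% lines
reactor_state_ok 0 idle && echo "reactor is idle"    # matches the 0.0% lines
```

Rates between 30 and 70 satisfy neither state, which is why the real helper retries `top` up to 10 times (`(( j = 10 ))` in the trace) before giving up.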
00:30:40.636 [2024-07-15 22:59:25.351418] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:40.636 22:59:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:30:40.636 22:59:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 2868710 2 00:30:40.636 22:59:25 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2868710 2 idle 00:30:40.636 22:59:25 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2868710 00:30:40.636 22:59:25 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:40.636 22:59:25 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:40.636 22:59:25 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:40.636 22:59:25 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:40.636 22:59:25 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:40.636 22:59:25 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:40.636 22:59:25 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:40.636 22:59:25 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2868710 -w 256 00:30:40.636 22:59:25 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:40.894 22:59:25 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2868745 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.60 reactor_2' 00:30:40.894 22:59:25 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2868745 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.60 reactor_2 00:30:40.894 22:59:25 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:40.894 22:59:25 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:40.894 22:59:25 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:40.894 22:59:25 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:40.894 22:59:25 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:40.894 22:59:25 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:40.894 22:59:25 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:40.894 22:59:25 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:40.894 22:59:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:30:40.894 [2024-07-15 22:59:25.799262] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:30:40.894 [2024-07-15 22:59:25.799412] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:41.153 22:59:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:30:41.153 22:59:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:30:41.153 22:59:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:30:41.153 [2024-07-15 22:59:26.059496] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 2868710 0 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2868710 0 idle 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2868710 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2868710 -w 256 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2868710 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.70 reactor_0' 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2868710 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.70 reactor_0 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = 
\i\d\l\e ]] 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:30:41.412 22:59:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 2868710 00:30:41.412 22:59:26 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 2868710 ']' 00:30:41.412 22:59:26 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 2868710 00:30:41.412 22:59:26 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:30:41.412 22:59:26 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:41.412 22:59:26 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2868710 00:30:41.412 22:59:26 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:41.412 22:59:26 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:41.412 22:59:26 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2868710' 00:30:41.412 killing process with pid 2868710 00:30:41.412 22:59:26 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 2868710 00:30:41.412 22:59:26 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 2868710 00:30:41.979 22:59:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:30:41.979 22:59:26 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:30:41.979 22:59:26 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:30:41.979 22:59:26 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:41.979 22:59:26 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:30:41.979 22:59:26 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=2869482 00:30:41.979 22:59:26 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:41.979 22:59:26 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:30:41.979 22:59:26 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 2869482 /var/tmp/spdk.sock 00:30:41.979 22:59:26 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 2869482 ']' 00:30:41.979 22:59:26 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:41.979 22:59:26 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:41.979 22:59:26 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:41.979 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:41.979 22:59:26 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:41.979 22:59:26 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:30:41.979 [2024-07-15 22:59:26.636138] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:30:41.979 [2024-07-15 22:59:26.636211] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2869482 ] 00:30:41.979 [2024-07-15 22:59:26.765442] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:41.979 [2024-07-15 22:59:26.866969] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:41.979 [2024-07-15 22:59:26.867069] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:41.979 [2024-07-15 22:59:26.867072] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:42.238 [2024-07-15 22:59:26.938259] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:30:42.830 22:59:27 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:42.830 22:59:27 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:30:42.830 22:59:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:30:42.830 22:59:27 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:43.087 Malloc0 00:30:43.087 Malloc1 00:30:43.087 Malloc2 00:30:43.087 22:59:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:30:43.087 22:59:27 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:30:43.087 22:59:27 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:30:43.087 22:59:27 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:30:43.087 5000+0 records in 00:30:43.087 5000+0 records out 00:30:43.087 10240000 bytes (10 MB, 9.8 MiB) copied, 0.027455 s, 373 MB/s 
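The `setup_bdev_aio` step above backs an AIO bdev with a zero-filled file created by `dd` (2048-byte blocks x 5000 records = 10240000 bytes, matching the `5000+0 records` lines in the log). A stand-alone sketch using a temporary path instead of the test's `aiofile`:

```shell
# Create a zero-filled backing file the way setup_bdev_aio does;
# the path here is a temporary stand-in for the test's aiofile
aiofile=$(mktemp)
dd if=/dev/zero of="$aiofile" bs=2048 count=5000 2>/dev/null

# 2048 * 5000 = 10240000 bytes, as dd reports in the log above
size=$(wc -c < "$aiofile")
echo "$size"

rm -f "$aiofile"
```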
00:30:43.087 22:59:27 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:30:43.345 AIO0 00:30:43.345 22:59:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 2869482 00:30:43.345 22:59:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 2869482 00:30:43.345 22:59:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=2869482 00:30:43.345 22:59:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:30:43.345 22:59:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:30:43.345 22:59:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:30:43.345 22:59:28 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:30:43.345 22:59:28 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:43.345 22:59:28 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:30:43.346 22:59:28 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:43.346 22:59:28 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:43.346 22:59:28 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:43.604 22:59:28 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:30:43.604 22:59:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:30:43.604 22:59:28 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:30:43.604 22:59:28 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:30:43.604 22:59:28 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:43.604 22:59:28 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:30:43.604 22:59:28 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:43.604 22:59:28 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:43.604 22:59:28 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:43.862 22:59:28 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:30:43.862 22:59:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:30:43.862 22:59:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:30:43.862 spdk_thread ids are 1 on reactor0. 
00:30:43.862 22:59:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:43.862 22:59:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2869482 0 00:30:43.862 22:59:28 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2869482 0 idle 00:30:43.862 22:59:28 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2869482 00:30:43.862 22:59:28 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:43.862 22:59:28 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:43.863 22:59:28 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:43.863 22:59:28 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:43.863 22:59:28 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:43.863 22:59:28 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:43.863 22:59:28 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:43.863 22:59:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2869482 -w 256 00:30:43.863 22:59:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:44.121 22:59:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2869482 root 20 0 128.2g 36288 23040 S 6.7 0.0 0:00.41 reactor_0' 00:30:44.121 22:59:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2869482 root 20 0 128.2g 36288 23040 S 6.7 0.0 0:00.41 reactor_0 00:30:44.121 22:59:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:44.121 22:59:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:44.121 22:59:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=6.7 00:30:44.121 22:59:28 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=6 00:30:44.121 22:59:28 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle 
= \b\u\s\y ]] 00:30:44.121 22:59:28 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:44.121 22:59:28 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 6 -gt 30 ]] 00:30:44.121 22:59:28 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:44.121 22:59:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:44.121 22:59:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2869482 1 00:30:44.121 22:59:28 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2869482 1 idle 00:30:44.121 22:59:28 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2869482 00:30:44.121 22:59:28 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:30:44.121 22:59:28 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:44.121 22:59:28 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:44.121 22:59:28 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:44.121 22:59:28 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:44.121 22:59:28 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:44.121 22:59:28 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:44.121 22:59:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2869482 -w 256 00:30:44.121 22:59:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2869523 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_1' 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2869523 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_1 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:44.435 22:59:29 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # awk '{print $9}' 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2869482 2 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2869482 2 idle 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2869482 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2869482 -w 256 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2869524 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_2' 
00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2869524 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_2 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:30:44.435 22:59:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:30:44.694 [2024-07-15 22:59:29.491800] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:30:44.694 [2024-07-15 22:59:29.492045] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:30:44.694 [2024-07-15 22:59:29.494004] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:44.694 22:59:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:30:44.952 [2024-07-15 22:59:29.748333] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 
00:30:44.952 [2024-07-15 22:59:29.748476] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:44.952 22:59:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:44.952 22:59:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2869482 0 00:30:44.952 22:59:29 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2869482 0 busy 00:30:44.952 22:59:29 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2869482 00:30:44.952 22:59:29 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:44.952 22:59:29 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:44.952 22:59:29 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:44.952 22:59:29 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:44.952 22:59:29 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:44.952 22:59:29 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:44.952 22:59:29 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2869482 -w 256 00:30:44.952 22:59:29 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:45.211 22:59:29 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2869482 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.85 reactor_0' 00:30:45.211 22:59:29 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:45.211 22:59:29 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2869482 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.85 reactor_0 00:30:45.211 22:59:29 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:45.211 22:59:29 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:45.211 22:59:29 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:45.211 22:59:29 reactor_set_interrupt -- 
interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:45.211 22:59:29 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:45.211 22:59:29 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:45.211 22:59:29 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:45.211 22:59:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:45.211 22:59:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2869482 2 00:30:45.211 22:59:29 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2869482 2 busy 00:30:45.211 22:59:29 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2869482 00:30:45.211 22:59:29 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:45.211 22:59:29 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:45.211 22:59:29 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:45.211 22:59:29 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:45.211 22:59:29 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:45.211 22:59:29 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:45.211 22:59:29 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2869482 -w 256 00:30:45.211 22:59:29 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:45.211 22:59:30 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2869524 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.36 reactor_2' 00:30:45.471 22:59:30 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2869524 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.36 reactor_2 00:30:45.471 22:59:30 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:45.471 22:59:30 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:45.471 22:59:30 
reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:45.471 22:59:30 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:45.471 22:59:30 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:45.471 22:59:30 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:45.471 22:59:30 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:45.471 22:59:30 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:45.471 22:59:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:30:45.471 [2024-07-15 22:59:30.358102] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 00:30:45.471 [2024-07-15 22:59:30.358249] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:45.471 22:59:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:30:45.471 22:59:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 2869482 2 00:30:45.471 22:59:30 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2869482 2 idle 00:30:45.471 22:59:30 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2869482 00:30:45.471 22:59:30 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:45.795 22:59:30 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:45.795 22:59:30 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:45.795 22:59:30 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:45.795 22:59:30 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:45.795 22:59:30 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 
00:30:45.795 22:59:30 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:45.795 22:59:30 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2869482 -w 256 00:30:45.795 22:59:30 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:45.795 22:59:30 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2869524 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.60 reactor_2' 00:30:45.795 22:59:30 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2869524 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.60 reactor_2 00:30:45.795 22:59:30 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:45.795 22:59:30 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:45.795 22:59:30 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:45.795 22:59:30 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:45.795 22:59:30 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:45.795 22:59:30 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:45.795 22:59:30 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:45.795 22:59:30 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:45.795 22:59:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:30:46.054 [2024-07-15 22:59:30.787194] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:30:46.054 [2024-07-15 22:59:30.787406] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
00:30:46.054 [2024-07-15 22:59:30.787431] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:46.054 22:59:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:30:46.054 22:59:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 2869482 0 00:30:46.054 22:59:30 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2869482 0 idle 00:30:46.054 22:59:30 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2869482 00:30:46.054 22:59:30 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:46.054 22:59:30 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:46.054 22:59:30 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:46.054 22:59:30 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:46.054 22:59:30 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:46.054 22:59:30 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:46.054 22:59:30 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:46.054 22:59:30 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2869482 -w 256 00:30:46.054 22:59:30 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:46.312 22:59:30 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2869482 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:01.70 reactor_0' 00:30:46.312 22:59:30 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2869482 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:01.70 reactor_0 00:30:46.312 22:59:30 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:46.312 22:59:30 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:46.312 22:59:30 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:46.312 22:59:30 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:46.312 22:59:30 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:46.312 22:59:30 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:46.312 22:59:30 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:46.312 22:59:30 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:46.312 22:59:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:30:46.312 22:59:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:30:46.312 22:59:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:30:46.312 22:59:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 2869482 00:30:46.312 22:59:30 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 2869482 ']' 00:30:46.312 22:59:30 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 2869482 00:30:46.312 22:59:30 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:30:46.312 22:59:31 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:46.312 22:59:31 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2869482 00:30:46.312 22:59:31 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:46.312 22:59:31 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:46.312 22:59:31 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2869482' 00:30:46.312 killing process with pid 2869482 00:30:46.312 22:59:31 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 2869482 00:30:46.312 22:59:31 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 2869482 00:30:46.572 22:59:31 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:30:46.572 22:59:31 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:30:46.572 00:30:46.572 real 0m10.592s 00:30:46.572 user 0m9.980s 00:30:46.572 sys 0m2.271s 00:30:46.572 22:59:31 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:46.572 22:59:31 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:30:46.572 ************************************ 00:30:46.572 END TEST reactor_set_interrupt 00:30:46.572 ************************************ 00:30:46.572 22:59:31 -- common/autotest_common.sh@1142 -- # return 0 00:30:46.572 22:59:31 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:30:46.572 22:59:31 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:30:46.572 22:59:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:46.572 22:59:31 -- common/autotest_common.sh@10 -- # set +x 00:30:46.572 ************************************ 00:30:46.572 START TEST reap_unregistered_poller 00:30:46.572 ************************************ 00:30:46.572 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:30:46.833 * Looking for test storage... 
00:30:46.833 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:46.833 22:59:31 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:30:46.833 22:59:31 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:30:46.833 22:59:31 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:46.833 22:59:31 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:46.833 22:59:31 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:30:46.833 22:59:31 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:46.833 22:59:31 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:30:46.833 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:30:46.833 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:30:46.833 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:30:46.833 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:30:46.833 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:30:46.833 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:30:46.833 22:59:31 reap_unregistered_poller -- 
common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:30:46.833 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:30:46.833 22:59:31 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:30:46.833 22:59:31 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:30:46.833 22:59:31 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:30:46.833 22:59:31 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:30:46.833 22:59:31 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:30:46.833 22:59:31 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:30:46.833 22:59:31 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:30:46.833 22:59:31 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:30:46.833 22:59:31 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:30:46.833 22:59:31 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:30:46.833 22:59:31 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:30:46.833 22:59:31 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:30:46.834 22:59:31 
reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@36 -- # 
CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:30:46.834 
22:59:31 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:30:46.834 22:59:31 reap_unregistered_poller -- 
common/build_config.sh@75 -- # CONFIG_TESTS=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:30:46.834 22:59:31 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:30:46.834 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:30:46.834 22:59:31 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:30:46.834 22:59:31 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:30:46.834 22:59:31 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:30:46.834 22:59:31 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:46.834 22:59:31 reap_unregistered_poller -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:46.834 22:59:31 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:30:46.834 22:59:31 reap_unregistered_poller -- common/applications.sh@12 -- # 
_examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:46.834 22:59:31 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:30:46.834 22:59:31 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:30:46.834 22:59:31 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:30:46.834 22:59:31 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:30:46.834 22:59:31 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:30:46.834 22:59:31 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:30:46.834 22:59:31 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:30:46.834 22:59:31 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:30:46.834 #define SPDK_CONFIG_H 00:30:46.834 #define SPDK_CONFIG_APPS 1 00:30:46.834 #define SPDK_CONFIG_ARCH native 00:30:46.834 #undef SPDK_CONFIG_ASAN 00:30:46.834 #undef SPDK_CONFIG_AVAHI 00:30:46.834 #undef SPDK_CONFIG_CET 00:30:46.834 #define SPDK_CONFIG_COVERAGE 1 00:30:46.834 #define SPDK_CONFIG_CROSS_PREFIX 00:30:46.834 #define SPDK_CONFIG_CRYPTO 1 00:30:46.834 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:30:46.834 #undef SPDK_CONFIG_CUSTOMOCF 00:30:46.834 #undef SPDK_CONFIG_DAOS 00:30:46.834 #define SPDK_CONFIG_DAOS_DIR 00:30:46.834 #define SPDK_CONFIG_DEBUG 1 00:30:46.834 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:30:46.834 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:30:46.834 #define SPDK_CONFIG_DPDK_INC_DIR 00:30:46.834 #define SPDK_CONFIG_DPDK_LIB_DIR 00:30:46.834 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:30:46.834 #undef SPDK_CONFIG_DPDK_UADK 00:30:46.834 #define SPDK_CONFIG_ENV 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:46.834 #define SPDK_CONFIG_EXAMPLES 1 00:30:46.834 #undef SPDK_CONFIG_FC 00:30:46.834 #define SPDK_CONFIG_FC_PATH 00:30:46.834 #define SPDK_CONFIG_FIO_PLUGIN 1 00:30:46.834 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:30:46.834 #undef SPDK_CONFIG_FUSE 00:30:46.834 #undef SPDK_CONFIG_FUZZER 00:30:46.834 #define SPDK_CONFIG_FUZZER_LIB 00:30:46.834 #undef SPDK_CONFIG_GOLANG 00:30:46.834 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:30:46.834 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:30:46.834 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:30:46.834 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:30:46.834 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:30:46.834 #undef SPDK_CONFIG_HAVE_LIBBSD 00:30:46.834 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:30:46.834 #define SPDK_CONFIG_IDXD 1 00:30:46.834 #define SPDK_CONFIG_IDXD_KERNEL 1 00:30:46.834 #define SPDK_CONFIG_IPSEC_MB 1 00:30:46.834 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:46.834 #define SPDK_CONFIG_ISAL 1 00:30:46.834 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:30:46.834 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:30:46.834 #define SPDK_CONFIG_LIBDIR 00:30:46.834 #undef SPDK_CONFIG_LTO 00:30:46.835 #define SPDK_CONFIG_MAX_LCORES 128 00:30:46.835 #define SPDK_CONFIG_NVME_CUSE 1 00:30:46.835 #undef SPDK_CONFIG_OCF 00:30:46.835 #define SPDK_CONFIG_OCF_PATH 00:30:46.835 #define SPDK_CONFIG_OPENSSL_PATH 00:30:46.835 #undef SPDK_CONFIG_PGO_CAPTURE 00:30:46.835 #define SPDK_CONFIG_PGO_DIR 00:30:46.835 #undef SPDK_CONFIG_PGO_USE 00:30:46.835 #define SPDK_CONFIG_PREFIX /usr/local 00:30:46.835 #undef SPDK_CONFIG_RAID5F 00:30:46.835 #undef SPDK_CONFIG_RBD 00:30:46.835 #define SPDK_CONFIG_RDMA 1 00:30:46.835 #define SPDK_CONFIG_RDMA_PROV verbs 00:30:46.835 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:30:46.835 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:30:46.835 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:30:46.835 #define 
SPDK_CONFIG_SHARED 1 00:30:46.835 #undef SPDK_CONFIG_SMA 00:30:46.835 #define SPDK_CONFIG_TESTS 1 00:30:46.835 #undef SPDK_CONFIG_TSAN 00:30:46.835 #define SPDK_CONFIG_UBLK 1 00:30:46.835 #define SPDK_CONFIG_UBSAN 1 00:30:46.835 #undef SPDK_CONFIG_UNIT_TESTS 00:30:46.835 #undef SPDK_CONFIG_URING 00:30:46.835 #define SPDK_CONFIG_URING_PATH 00:30:46.835 #undef SPDK_CONFIG_URING_ZNS 00:30:46.835 #undef SPDK_CONFIG_USDT 00:30:46.835 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:30:46.835 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:30:46.835 #undef SPDK_CONFIG_VFIO_USER 00:30:46.835 #define SPDK_CONFIG_VFIO_USER_DIR 00:30:46.835 #define SPDK_CONFIG_VHOST 1 00:30:46.835 #define SPDK_CONFIG_VIRTIO 1 00:30:46.835 #undef SPDK_CONFIG_VTUNE 00:30:46.835 #define SPDK_CONFIG_VTUNE_DIR 00:30:46.835 #define SPDK_CONFIG_WERROR 1 00:30:46.835 #define SPDK_CONFIG_WPDK_DIR 00:30:46.835 #undef SPDK_CONFIG_XNVME 00:30:46.835 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:30:46.835 22:59:31 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:46.835 22:59:31 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:30:46.835 22:59:31 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:46.835 22:59:31 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:46.835 22:59:31 reap_unregistered_poller -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:46.835 22:59:31 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:46.835 22:59:31 reap_unregistered_poller -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:46.835 22:59:31 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:30:46.835 22:59:31 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:30:46.835 22:59:31 reap_unregistered_poller -- 
pm/common@76 -- # SUDO[0]= 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:30:46.835 22:59:31 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:30:46.835 22:59:31 
reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:30:46.835 22:59:31 
reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 00:30:46.835 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:30:46.836 22:59:31 
reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@126 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:30:46.836 22:59:31 
reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:30:46.836 22:59:31 reap_unregistered_poller 
-- common/autotest_common.sh@150 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:30:46.836 22:59:31 reap_unregistered_poller -- 
common/autotest_common.sh@171 -- # : 0 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@178 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:30:46.836 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export 
ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export 
SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:30:46.837 22:59:31 
reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKE=make 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 2870153 ]] 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 2870153 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local 
requested_size=2147483648 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.JNidjd 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.JNidjd/tests/interrupt /tmp/spdk.JNidjd 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:30:46.837 22:59:31 
reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=946290688 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4338139136 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=88602382336 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508515328 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=5906132992 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@361 -- # 
fss["$mount"]=tmpfs 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47249547264 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=18892316672 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901704704 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=9388032 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47253360640 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=897024 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:46.837 22:59:31 reap_unregistered_poller -- 
common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=9450844160 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450848256 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:30:46.837 * Looking for test storage... 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:46.837 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=88602382336 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:30:46.838 22:59:31 
reap_unregistered_poller -- common/autotest_common.sh@381 -- # new_size=8120725504 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:46.838 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # set -o errtrace 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@1687 -- # true 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@1689 -- # xtrace_fd 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 
00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:30:46.838 22:59:31 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:30:46.838 22:59:31 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:46.838 22:59:31 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:30:46.838 22:59:31 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:30:46.838 22:59:31 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:30:46.838 22:59:31 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:30:46.838 22:59:31 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:30:46.838 22:59:31 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:46.838 22:59:31 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # 
PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:46.838 22:59:31 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:30:46.838 22:59:31 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:46.838 22:59:31 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:30:46.838 22:59:31 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=2870203 00:30:46.838 22:59:31 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:46.838 22:59:31 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:30:46.838 22:59:31 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 2870203 /var/tmp/spdk.sock 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@829 -- # '[' -z 2870203 ']' 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:46.838 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:46.838 22:59:31 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:46.838 [2024-07-15 22:59:31.733061] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:30:46.838 [2024-07-15 22:59:31.733129] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2870203 ] 00:30:47.097 [2024-07-15 22:59:31.864111] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:47.097 [2024-07-15 22:59:31.967125] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:47.097 [2024-07-15 22:59:31.967224] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:47.097 [2024-07-15 22:59:31.967226] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:47.356 [2024-07-15 22:59:32.038453] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:30:47.926 22:59:32 reap_unregistered_poller -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:47.926 22:59:32 reap_unregistered_poller -- common/autotest_common.sh@862 -- # return 0 00:30:47.926 22:59:32 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:30:47.926 22:59:32 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:30:47.926 22:59:32 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:47.926 22:59:32 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:47.926 22:59:32 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:47.926 22:59:32 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:30:47.926 "name": "app_thread", 00:30:47.926 "id": 1, 00:30:47.926 "active_pollers": [], 00:30:47.926 "timed_pollers": [ 00:30:47.926 { 00:30:47.926 "name": "rpc_subsystem_poll_servers", 00:30:47.926 "id": 1, 00:30:47.926 "state": "waiting", 00:30:47.926 "run_count": 0, 00:30:47.926 "busy_count": 0, 00:30:47.926 "period_ticks": 9200000 00:30:47.926 } 00:30:47.926 ], 00:30:47.926 "paused_pollers": [] 00:30:47.926 }' 00:30:47.926 22:59:32 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:30:47.926 22:59:32 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:30:47.926 22:59:32 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:30:47.926 22:59:32 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:30:47.926 22:59:32 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:30:47.926 22:59:32 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:30:48.185 
22:59:32 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:30:48.185 22:59:32 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:30:48.185 22:59:32 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:30:48.185 5000+0 records in 00:30:48.185 5000+0 records out 00:30:48.185 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0199775 s, 513 MB/s 00:30:48.185 22:59:32 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:30:48.444 AIO0 00:30:48.444 22:59:33 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:48.703 22:59:33 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:30:48.703 22:59:33 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:30:48.703 22:59:33 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:48.703 22:59:33 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:48.703 22:59:33 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:30:48.703 22:59:33 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:48.703 22:59:33 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:30:48.703 "name": "app_thread", 00:30:48.703 "id": 1, 00:30:48.703 "active_pollers": [], 00:30:48.703 "timed_pollers": [ 00:30:48.703 { 00:30:48.703 "name": "rpc_subsystem_poll_servers", 00:30:48.703 "id": 1, 00:30:48.703 "state": "waiting", 00:30:48.703 "run_count": 0, 00:30:48.703 "busy_count": 0, 
00:30:48.703 "period_ticks": 9200000 00:30:48.703 } 00:30:48.703 ], 00:30:48.703 "paused_pollers": [] 00:30:48.703 }' 00:30:48.703 22:59:33 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:30:48.703 22:59:33 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:30:48.703 22:59:33 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:30:48.963 22:59:33 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:30:48.963 22:59:33 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:30:48.963 22:59:33 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:30:48.963 22:59:33 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:30:48.963 22:59:33 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 2870203 00:30:48.963 22:59:33 reap_unregistered_poller -- common/autotest_common.sh@948 -- # '[' -z 2870203 ']' 00:30:48.963 22:59:33 reap_unregistered_poller -- common/autotest_common.sh@952 -- # kill -0 2870203 00:30:48.963 22:59:33 reap_unregistered_poller -- common/autotest_common.sh@953 -- # uname 00:30:48.963 22:59:33 reap_unregistered_poller -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:48.963 22:59:33 reap_unregistered_poller -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2870203 00:30:48.963 22:59:33 reap_unregistered_poller -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:48.963 22:59:33 reap_unregistered_poller -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:48.963 22:59:33 reap_unregistered_poller -- common/autotest_common.sh@966 -- # 
echo 'killing process with pid 2870203' 00:30:48.963 killing process with pid 2870203 00:30:48.963 22:59:33 reap_unregistered_poller -- common/autotest_common.sh@967 -- # kill 2870203 00:30:48.963 22:59:33 reap_unregistered_poller -- common/autotest_common.sh@972 -- # wait 2870203 00:30:49.222 22:59:33 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:30:49.222 22:59:33 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:30:49.222 00:30:49.222 real 0m2.527s 00:30:49.222 user 0m1.592s 00:30:49.222 sys 0m0.697s 00:30:49.222 22:59:33 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:49.222 22:59:33 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:49.222 ************************************ 00:30:49.222 END TEST reap_unregistered_poller 00:30:49.222 ************************************ 00:30:49.222 22:59:33 -- common/autotest_common.sh@1142 -- # return 0 00:30:49.222 22:59:33 -- spdk/autotest.sh@198 -- # uname -s 00:30:49.222 22:59:33 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:30:49.222 22:59:33 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:30:49.222 22:59:33 -- spdk/autotest.sh@205 -- # [[ 1 -eq 0 ]] 00:30:49.222 22:59:33 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:30:49.222 22:59:33 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:30:49.222 22:59:33 -- spdk/autotest.sh@260 -- # timing_exit lib 00:30:49.222 22:59:33 -- common/autotest_common.sh@728 -- # xtrace_disable 00:30:49.222 22:59:33 -- common/autotest_common.sh@10 -- # set +x 00:30:49.222 22:59:34 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:30:49.222 22:59:34 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:30:49.222 22:59:34 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:30:49.222 22:59:34 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:30:49.222 22:59:34 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:30:49.222 
22:59:34 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:30:49.222 22:59:34 -- spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:30:49.222 22:59:34 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:30:49.222 22:59:34 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:30:49.222 22:59:34 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:30:49.222 22:59:34 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:30:49.222 22:59:34 -- spdk/autotest.sh@347 -- # '[' 1 -eq 1 ']' 00:30:49.222 22:59:34 -- spdk/autotest.sh@348 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:30:49.222 22:59:34 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:30:49.222 22:59:34 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:49.222 22:59:34 -- common/autotest_common.sh@10 -- # set +x 00:30:49.222 ************************************ 00:30:49.222 START TEST compress_compdev 00:30:49.222 ************************************ 00:30:49.222 22:59:34 compress_compdev -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:30:49.481 * Looking for test storage... 
00:30:49.481 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:30:49.481 22:59:34 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@45 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:49.481 22:59:34 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:30:49.481 22:59:34 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:49.481 22:59:34 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:49.481 22:59:34 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:49.481 22:59:34 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:49.481 22:59:34 compress_compdev -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:49.481 22:59:34 compress_compdev -- 
paths/export.sh@5 -- # export PATH 00:30:49.481 22:59:34 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:30:49.481 22:59:34 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:30:49.481 22:59:34 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:49.481 22:59:34 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:30:49.481 22:59:34 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:30:49.481 22:59:34 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:30:49.481 22:59:34 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:30:49.481 22:59:34 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=2870641 00:30:49.481 22:59:34 compress_compdev -- 
compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:49.481 22:59:34 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 2870641 00:30:49.481 22:59:34 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2870641 ']' 00:30:49.481 22:59:34 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:30:49.481 22:59:34 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:49.481 22:59:34 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:49.481 22:59:34 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:49.481 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:49.481 22:59:34 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:49.481 22:59:34 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:49.481 [2024-07-15 22:59:34.290608] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:30:49.481 [2024-07-15 22:59:34.290683] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2870641 ] 00:30:49.741 [2024-07-15 22:59:34.425649] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:49.741 [2024-07-15 22:59:34.543653] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:49.741 [2024-07-15 22:59:34.543656] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:50.679 [2024-07-15 22:59:35.504334] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:50.679 22:59:35 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:50.679 22:59:35 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:30:50.679 22:59:35 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:30:50.679 22:59:35 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:50.679 22:59:35 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:51.617 [2024-07-15 22:59:36.168518] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1e9d3c0 PMD being used: compress_qat 00:30:51.617 22:59:36 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:51.617 22:59:36 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:51.617 22:59:36 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:51.617 22:59:36 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:51.617 22:59:36 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:51.617 22:59:36 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:30:51.617 22:59:36 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:51.617 22:59:36 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:51.876 [ 00:30:51.876 { 00:30:51.876 "name": "Nvme0n1", 00:30:51.876 "aliases": [ 00:30:51.876 "01000000-0000-0000-5cd2-e43197705251" 00:30:51.876 ], 00:30:51.876 "product_name": "NVMe disk", 00:30:51.876 "block_size": 512, 00:30:51.876 "num_blocks": 15002931888, 00:30:51.876 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:51.876 "assigned_rate_limits": { 00:30:51.876 "rw_ios_per_sec": 0, 00:30:51.876 "rw_mbytes_per_sec": 0, 00:30:51.876 "r_mbytes_per_sec": 0, 00:30:51.876 "w_mbytes_per_sec": 0 00:30:51.876 }, 00:30:51.876 "claimed": false, 00:30:51.876 "zoned": false, 00:30:51.876 "supported_io_types": { 00:30:51.876 "read": true, 00:30:51.876 "write": true, 00:30:51.876 "unmap": true, 00:30:51.876 "flush": true, 00:30:51.876 "reset": true, 00:30:51.876 "nvme_admin": true, 00:30:51.876 "nvme_io": true, 00:30:51.876 "nvme_io_md": false, 00:30:51.876 "write_zeroes": true, 00:30:51.876 "zcopy": false, 00:30:51.876 "get_zone_info": false, 00:30:51.876 "zone_management": false, 00:30:51.876 "zone_append": false, 00:30:51.876 "compare": false, 00:30:51.876 "compare_and_write": false, 00:30:51.876 "abort": true, 00:30:51.876 "seek_hole": false, 00:30:51.876 "seek_data": false, 00:30:51.876 "copy": false, 00:30:51.876 "nvme_iov_md": false 00:30:51.876 }, 00:30:51.876 "driver_specific": { 00:30:51.876 "nvme": [ 00:30:51.876 { 00:30:51.876 "pci_address": "0000:5e:00.0", 00:30:51.876 "trid": { 00:30:51.876 "trtype": "PCIe", 00:30:51.876 "traddr": "0000:5e:00.0" 00:30:51.876 }, 00:30:51.876 "ctrlr_data": { 00:30:51.876 "cntlid": 0, 00:30:51.876 "vendor_id": "0x8086", 00:30:51.876 "model_number": "INTEL SSDPF2KX076TZO", 00:30:51.876 
"serial_number": "PHAC0301002G7P6CGN", 00:30:51.876 "firmware_revision": "JCV10200", 00:30:51.876 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:51.876 "oacs": { 00:30:51.876 "security": 1, 00:30:51.876 "format": 1, 00:30:51.876 "firmware": 1, 00:30:51.876 "ns_manage": 1 00:30:51.876 }, 00:30:51.876 "multi_ctrlr": false, 00:30:51.876 "ana_reporting": false 00:30:51.876 }, 00:30:51.876 "vs": { 00:30:51.876 "nvme_version": "1.3" 00:30:51.876 }, 00:30:51.876 "ns_data": { 00:30:51.876 "id": 1, 00:30:51.876 "can_share": false 00:30:51.876 }, 00:30:51.876 "security": { 00:30:51.877 "opal": true 00:30:51.877 } 00:30:51.877 } 00:30:51.877 ], 00:30:51.877 "mp_policy": "active_passive" 00:30:51.877 } 00:30:51.877 } 00:30:51.877 ] 00:30:51.877 22:59:36 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:51.877 22:59:36 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:52.445 [2024-07-15 22:59:37.204386] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1d020d0 PMD being used: compress_qat 00:30:54.979 50372f88-8961-4692-b823-02f16ce8e4fd 00:30:54.979 22:59:39 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:54.979 d1c84428-e8e4-4625-821a-607c4f8e17a5 00:30:54.979 22:59:39 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:54.979 22:59:39 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:54.979 22:59:39 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:54.979 22:59:39 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:54.979 22:59:39 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:54.979 22:59:39 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:30:54.979 22:59:39 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:55.238 22:59:39 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:55.238 [ 00:30:55.238 { 00:30:55.238 "name": "d1c84428-e8e4-4625-821a-607c4f8e17a5", 00:30:55.238 "aliases": [ 00:30:55.238 "lvs0/lv0" 00:30:55.238 ], 00:30:55.238 "product_name": "Logical Volume", 00:30:55.238 "block_size": 512, 00:30:55.238 "num_blocks": 204800, 00:30:55.238 "uuid": "d1c84428-e8e4-4625-821a-607c4f8e17a5", 00:30:55.238 "assigned_rate_limits": { 00:30:55.238 "rw_ios_per_sec": 0, 00:30:55.238 "rw_mbytes_per_sec": 0, 00:30:55.238 "r_mbytes_per_sec": 0, 00:30:55.238 "w_mbytes_per_sec": 0 00:30:55.238 }, 00:30:55.238 "claimed": false, 00:30:55.238 "zoned": false, 00:30:55.238 "supported_io_types": { 00:30:55.238 "read": true, 00:30:55.238 "write": true, 00:30:55.238 "unmap": true, 00:30:55.238 "flush": false, 00:30:55.238 "reset": true, 00:30:55.238 "nvme_admin": false, 00:30:55.238 "nvme_io": false, 00:30:55.238 "nvme_io_md": false, 00:30:55.238 "write_zeroes": true, 00:30:55.238 "zcopy": false, 00:30:55.238 "get_zone_info": false, 00:30:55.238 "zone_management": false, 00:30:55.238 "zone_append": false, 00:30:55.238 "compare": false, 00:30:55.238 "compare_and_write": false, 00:30:55.238 "abort": false, 00:30:55.238 "seek_hole": true, 00:30:55.238 "seek_data": true, 00:30:55.238 "copy": false, 00:30:55.238 "nvme_iov_md": false 00:30:55.238 }, 00:30:55.238 "driver_specific": { 00:30:55.238 "lvol": { 00:30:55.238 "lvol_store_uuid": "50372f88-8961-4692-b823-02f16ce8e4fd", 00:30:55.238 "base_bdev": "Nvme0n1", 00:30:55.238 "thin_provision": true, 00:30:55.238 "num_allocated_clusters": 0, 00:30:55.238 "snapshot": false, 00:30:55.238 "clone": false, 00:30:55.238 "esnap_clone": false 00:30:55.238 } 00:30:55.238 } 
00:30:55.238 } 00:30:55.238 ] 00:30:55.497 22:59:40 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:55.497 22:59:40 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:30:55.497 22:59:40 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:30:55.497 [2024-07-15 22:59:40.389217] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:55.497 COMP_lvs0/lv0 00:30:55.757 22:59:40 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:55.757 22:59:40 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:55.757 22:59:40 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:55.757 22:59:40 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:55.757 22:59:40 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:55.757 22:59:40 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:55.757 22:59:40 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:56.015 22:59:40 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:56.015 [ 00:30:56.015 { 00:30:56.015 "name": "COMP_lvs0/lv0", 00:30:56.015 "aliases": [ 00:30:56.015 "e7c03035-cb18-511c-820a-015c59112d78" 00:30:56.015 ], 00:30:56.015 "product_name": "compress", 00:30:56.015 "block_size": 512, 00:30:56.015 "num_blocks": 200704, 00:30:56.015 "uuid": "e7c03035-cb18-511c-820a-015c59112d78", 00:30:56.015 "assigned_rate_limits": { 00:30:56.016 "rw_ios_per_sec": 0, 00:30:56.016 "rw_mbytes_per_sec": 0, 00:30:56.016 "r_mbytes_per_sec": 0, 00:30:56.016 "w_mbytes_per_sec": 0 00:30:56.016 
}, 00:30:56.016 "claimed": false, 00:30:56.016 "zoned": false, 00:30:56.016 "supported_io_types": { 00:30:56.016 "read": true, 00:30:56.016 "write": true, 00:30:56.016 "unmap": false, 00:30:56.016 "flush": false, 00:30:56.016 "reset": false, 00:30:56.016 "nvme_admin": false, 00:30:56.016 "nvme_io": false, 00:30:56.016 "nvme_io_md": false, 00:30:56.016 "write_zeroes": true, 00:30:56.016 "zcopy": false, 00:30:56.016 "get_zone_info": false, 00:30:56.016 "zone_management": false, 00:30:56.016 "zone_append": false, 00:30:56.016 "compare": false, 00:30:56.016 "compare_and_write": false, 00:30:56.016 "abort": false, 00:30:56.016 "seek_hole": false, 00:30:56.016 "seek_data": false, 00:30:56.016 "copy": false, 00:30:56.016 "nvme_iov_md": false 00:30:56.016 }, 00:30:56.016 "driver_specific": { 00:30:56.016 "compress": { 00:30:56.016 "name": "COMP_lvs0/lv0", 00:30:56.016 "base_bdev_name": "d1c84428-e8e4-4625-821a-607c4f8e17a5" 00:30:56.016 } 00:30:56.016 } 00:30:56.016 } 00:30:56.016 ] 00:30:56.016 22:59:40 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:56.016 22:59:40 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:56.274 [2024-07-15 22:59:41.027497] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f7c5c1b15c0 PMD being used: compress_qat 00:30:56.274 [2024-07-15 22:59:41.030699] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1e9a6e0 PMD being used: compress_qat 00:30:56.274 Running I/O for 3 seconds... 
00:30:59.587 00:30:59.587 Latency(us) 00:30:59.587 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:59.587 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:59.587 Verification LBA range: start 0x0 length 0x3100 00:30:59.587 COMP_lvs0/lv0 : 3.01 1674.85 6.54 0.00 0.00 19015.77 1716.76 17438.27 00:30:59.587 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:59.587 Verification LBA range: start 0x3100 length 0x3100 00:30:59.587 COMP_lvs0/lv0 : 3.01 1775.78 6.94 0.00 0.00 17904.83 1317.84 14531.90 00:30:59.587 =================================================================================================================== 00:30:59.587 Total : 3450.62 13.48 0.00 0.00 18444.04 1317.84 17438.27 00:30:59.587 0 00:30:59.587 22:59:44 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:30:59.587 22:59:44 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:59.587 22:59:44 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:59.846 22:59:44 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:59.846 22:59:44 compress_compdev -- compress/compress.sh@78 -- # killprocess 2870641 00:30:59.846 22:59:44 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2870641 ']' 00:30:59.846 22:59:44 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2870641 00:30:59.846 22:59:44 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:30:59.847 22:59:44 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:59.847 22:59:44 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2870641 00:30:59.847 22:59:44 compress_compdev -- common/autotest_common.sh@954 -- # 
process_name=reactor_1 00:30:59.847 22:59:44 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:59.847 22:59:44 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2870641' 00:30:59.847 killing process with pid 2870641 00:30:59.847 22:59:44 compress_compdev -- common/autotest_common.sh@967 -- # kill 2870641 00:30:59.847 Received shutdown signal, test time was about 3.000000 seconds 00:30:59.847 00:30:59.847 Latency(us) 00:30:59.847 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:59.847 =================================================================================================================== 00:30:59.847 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:59.847 22:59:44 compress_compdev -- common/autotest_common.sh@972 -- # wait 2870641 00:31:03.137 22:59:47 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:31:03.137 22:59:47 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:31:03.137 22:59:47 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=2872407 00:31:03.137 22:59:47 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:03.137 22:59:47 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:31:03.137 22:59:47 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 2872407 00:31:03.137 22:59:47 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2872407 ']' 00:31:03.137 22:59:47 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:03.137 22:59:47 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:03.137 22:59:47 compress_compdev -- 
common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:03.137 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:03.137 22:59:47 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:03.137 22:59:47 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:31:03.137 [2024-07-15 22:59:47.734130] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:31:03.137 [2024-07-15 22:59:47.734204] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2872407 ] 00:31:03.137 [2024-07-15 22:59:47.868912] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:03.137 [2024-07-15 22:59:47.986246] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:03.137 [2024-07-15 22:59:47.986253] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:04.074 [2024-07-15 22:59:48.951739] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:31:04.332 22:59:49 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:04.332 22:59:49 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:31:04.332 22:59:49 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:31:04.332 22:59:49 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:04.332 22:59:49 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:04.899 [2024-07-15 22:59:49.632395] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1cb43c0 PMD being used: compress_qat 00:31:04.899 22:59:49 
compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:04.899 22:59:49 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:04.899 22:59:49 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:04.899 22:59:49 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:04.899 22:59:49 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:04.899 22:59:49 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:04.899 22:59:49 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:05.158 22:59:49 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:05.416 [ 00:31:05.416 { 00:31:05.416 "name": "Nvme0n1", 00:31:05.416 "aliases": [ 00:31:05.416 "01000000-0000-0000-5cd2-e43197705251" 00:31:05.416 ], 00:31:05.416 "product_name": "NVMe disk", 00:31:05.416 "block_size": 512, 00:31:05.416 "num_blocks": 15002931888, 00:31:05.416 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:05.416 "assigned_rate_limits": { 00:31:05.416 "rw_ios_per_sec": 0, 00:31:05.416 "rw_mbytes_per_sec": 0, 00:31:05.416 "r_mbytes_per_sec": 0, 00:31:05.416 "w_mbytes_per_sec": 0 00:31:05.416 }, 00:31:05.416 "claimed": false, 00:31:05.416 "zoned": false, 00:31:05.416 "supported_io_types": { 00:31:05.416 "read": true, 00:31:05.416 "write": true, 00:31:05.416 "unmap": true, 00:31:05.416 "flush": true, 00:31:05.416 "reset": true, 00:31:05.416 "nvme_admin": true, 00:31:05.416 "nvme_io": true, 00:31:05.416 "nvme_io_md": false, 00:31:05.416 "write_zeroes": true, 00:31:05.416 "zcopy": false, 00:31:05.416 "get_zone_info": false, 00:31:05.416 "zone_management": false, 00:31:05.416 "zone_append": false, 00:31:05.416 "compare": false, 00:31:05.416 "compare_and_write": false, 00:31:05.416 
"abort": true, 00:31:05.416 "seek_hole": false, 00:31:05.416 "seek_data": false, 00:31:05.416 "copy": false, 00:31:05.416 "nvme_iov_md": false 00:31:05.416 }, 00:31:05.416 "driver_specific": { 00:31:05.416 "nvme": [ 00:31:05.416 { 00:31:05.416 "pci_address": "0000:5e:00.0", 00:31:05.416 "trid": { 00:31:05.416 "trtype": "PCIe", 00:31:05.416 "traddr": "0000:5e:00.0" 00:31:05.416 }, 00:31:05.416 "ctrlr_data": { 00:31:05.416 "cntlid": 0, 00:31:05.416 "vendor_id": "0x8086", 00:31:05.416 "model_number": "INTEL SSDPF2KX076TZO", 00:31:05.416 "serial_number": "PHAC0301002G7P6CGN", 00:31:05.416 "firmware_revision": "JCV10200", 00:31:05.416 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:05.416 "oacs": { 00:31:05.416 "security": 1, 00:31:05.416 "format": 1, 00:31:05.416 "firmware": 1, 00:31:05.416 "ns_manage": 1 00:31:05.416 }, 00:31:05.416 "multi_ctrlr": false, 00:31:05.416 "ana_reporting": false 00:31:05.416 }, 00:31:05.416 "vs": { 00:31:05.416 "nvme_version": "1.3" 00:31:05.416 }, 00:31:05.416 "ns_data": { 00:31:05.416 "id": 1, 00:31:05.416 "can_share": false 00:31:05.416 }, 00:31:05.416 "security": { 00:31:05.416 "opal": true 00:31:05.416 } 00:31:05.416 } 00:31:05.416 ], 00:31:05.416 "mp_policy": "active_passive" 00:31:05.416 } 00:31:05.416 } 00:31:05.416 ] 00:31:05.416 22:59:50 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:05.416 22:59:50 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:05.673 [2024-07-15 22:59:50.410947] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b190d0 PMD being used: compress_qat 00:31:08.254 b1748714-4ca7-4fc7-a290-21308b663227 00:31:08.254 22:59:52 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:08.254 17facace-4521-4b66-8814-5e3dada0f671 00:31:08.254 22:59:52 
compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:08.254 22:59:52 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:08.254 22:59:52 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:08.254 22:59:52 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:08.254 22:59:52 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:08.254 22:59:52 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:08.254 22:59:52 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:08.254 22:59:53 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:08.513 [ 00:31:08.513 { 00:31:08.514 "name": "17facace-4521-4b66-8814-5e3dada0f671", 00:31:08.514 "aliases": [ 00:31:08.514 "lvs0/lv0" 00:31:08.514 ], 00:31:08.514 "product_name": "Logical Volume", 00:31:08.514 "block_size": 512, 00:31:08.514 "num_blocks": 204800, 00:31:08.514 "uuid": "17facace-4521-4b66-8814-5e3dada0f671", 00:31:08.514 "assigned_rate_limits": { 00:31:08.514 "rw_ios_per_sec": 0, 00:31:08.514 "rw_mbytes_per_sec": 0, 00:31:08.514 "r_mbytes_per_sec": 0, 00:31:08.514 "w_mbytes_per_sec": 0 00:31:08.514 }, 00:31:08.514 "claimed": false, 00:31:08.514 "zoned": false, 00:31:08.514 "supported_io_types": { 00:31:08.514 "read": true, 00:31:08.514 "write": true, 00:31:08.514 "unmap": true, 00:31:08.514 "flush": false, 00:31:08.514 "reset": true, 00:31:08.514 "nvme_admin": false, 00:31:08.514 "nvme_io": false, 00:31:08.514 "nvme_io_md": false, 00:31:08.514 "write_zeroes": true, 00:31:08.514 "zcopy": false, 00:31:08.514 "get_zone_info": false, 00:31:08.514 "zone_management": false, 00:31:08.514 "zone_append": false, 00:31:08.514 "compare": false, 00:31:08.514 "compare_and_write": false, 00:31:08.514 
"abort": false, 00:31:08.514 "seek_hole": true, 00:31:08.514 "seek_data": true, 00:31:08.514 "copy": false, 00:31:08.514 "nvme_iov_md": false 00:31:08.514 }, 00:31:08.514 "driver_specific": { 00:31:08.514 "lvol": { 00:31:08.514 "lvol_store_uuid": "b1748714-4ca7-4fc7-a290-21308b663227", 00:31:08.514 "base_bdev": "Nvme0n1", 00:31:08.514 "thin_provision": true, 00:31:08.514 "num_allocated_clusters": 0, 00:31:08.514 "snapshot": false, 00:31:08.514 "clone": false, 00:31:08.514 "esnap_clone": false 00:31:08.514 } 00:31:08.514 } 00:31:08.514 } 00:31:08.514 ] 00:31:08.514 22:59:53 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:08.514 22:59:53 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:31:08.514 22:59:53 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:31:08.772 [2024-07-15 22:59:53.652005] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:08.772 COMP_lvs0/lv0 00:31:09.030 22:59:53 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:09.030 22:59:53 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:09.030 22:59:53 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:09.030 22:59:53 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:09.030 22:59:53 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:09.030 22:59:53 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:09.030 22:59:53 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:09.030 22:59:53 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b 
COMP_lvs0/lv0 -t 2000 00:31:09.289 [ 00:31:09.289 { 00:31:09.289 "name": "COMP_lvs0/lv0", 00:31:09.289 "aliases": [ 00:31:09.289 "9c7180d5-a933-5c3e-aaa0-661b1a41994d" 00:31:09.289 ], 00:31:09.289 "product_name": "compress", 00:31:09.289 "block_size": 512, 00:31:09.289 "num_blocks": 200704, 00:31:09.289 "uuid": "9c7180d5-a933-5c3e-aaa0-661b1a41994d", 00:31:09.289 "assigned_rate_limits": { 00:31:09.289 "rw_ios_per_sec": 0, 00:31:09.289 "rw_mbytes_per_sec": 0, 00:31:09.289 "r_mbytes_per_sec": 0, 00:31:09.289 "w_mbytes_per_sec": 0 00:31:09.289 }, 00:31:09.289 "claimed": false, 00:31:09.289 "zoned": false, 00:31:09.289 "supported_io_types": { 00:31:09.289 "read": true, 00:31:09.289 "write": true, 00:31:09.289 "unmap": false, 00:31:09.289 "flush": false, 00:31:09.289 "reset": false, 00:31:09.289 "nvme_admin": false, 00:31:09.289 "nvme_io": false, 00:31:09.289 "nvme_io_md": false, 00:31:09.289 "write_zeroes": true, 00:31:09.289 "zcopy": false, 00:31:09.289 "get_zone_info": false, 00:31:09.289 "zone_management": false, 00:31:09.289 "zone_append": false, 00:31:09.289 "compare": false, 00:31:09.289 "compare_and_write": false, 00:31:09.289 "abort": false, 00:31:09.289 "seek_hole": false, 00:31:09.289 "seek_data": false, 00:31:09.289 "copy": false, 00:31:09.289 "nvme_iov_md": false 00:31:09.289 }, 00:31:09.289 "driver_specific": { 00:31:09.289 "compress": { 00:31:09.289 "name": "COMP_lvs0/lv0", 00:31:09.289 "base_bdev_name": "17facace-4521-4b66-8814-5e3dada0f671" 00:31:09.289 } 00:31:09.289 } 00:31:09.289 } 00:31:09.289 ] 00:31:09.289 22:59:54 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:09.289 22:59:54 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:09.289 [2024-07-15 22:59:54.182065] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fd23c1b15c0 PMD being used: compress_qat 00:31:09.289 [2024-07-15 22:59:54.185260] 
accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1cb1700 PMD being used: compress_qat 00:31:09.289 Running I/O for 3 seconds... 00:31:12.611 00:31:12.611 Latency(us) 00:31:12.611 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:12.611 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:12.611 Verification LBA range: start 0x0 length 0x3100 00:31:12.611 COMP_lvs0/lv0 : 3.01 1686.69 6.59 0.00 0.00 18884.79 2194.03 17894.18 00:31:12.611 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:12.611 Verification LBA range: start 0x3100 length 0x3100 00:31:12.611 COMP_lvs0/lv0 : 3.01 1788.64 6.99 0.00 0.00 17776.41 1303.60 14588.88 00:31:12.611 =================================================================================================================== 00:31:12.611 Total : 3475.34 13.58 0.00 0.00 18314.34 1303.60 17894.18 00:31:12.611 0 00:31:12.611 22:59:57 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:31:12.611 22:59:57 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:12.611 22:59:57 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:12.870 22:59:57 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:12.870 22:59:57 compress_compdev -- compress/compress.sh@78 -- # killprocess 2872407 00:31:12.870 22:59:57 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2872407 ']' 00:31:12.871 22:59:57 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2872407 00:31:12.871 22:59:57 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:31:12.871 22:59:57 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:12.871 22:59:57 compress_compdev -- 
common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2872407 00:31:12.871 22:59:57 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:31:12.871 22:59:57 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:12.871 22:59:57 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2872407' 00:31:12.871 killing process with pid 2872407 00:31:12.871 22:59:57 compress_compdev -- common/autotest_common.sh@967 -- # kill 2872407 00:31:12.871 Received shutdown signal, test time was about 3.000000 seconds 00:31:12.871 00:31:12.871 Latency(us) 00:31:12.871 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:12.871 =================================================================================================================== 00:31:12.871 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:12.871 22:59:57 compress_compdev -- common/autotest_common.sh@972 -- # wait 2872407 00:31:16.158 23:00:00 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:31:16.158 23:00:00 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:31:16.158 23:00:00 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=2874052 00:31:16.158 23:00:00 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:16.158 23:00:00 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:31:16.158 23:00:00 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 2874052 00:31:16.158 23:00:00 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2874052 ']' 00:31:16.158 23:00:00 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 
00:31:16.158 23:00:00 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:16.158 23:00:00 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:16.158 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:16.158 23:00:00 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:16.158 23:00:00 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:31:16.158 [2024-07-15 23:00:00.654066] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:31:16.159 [2024-07-15 23:00:00.654132] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2874052 ] 00:31:16.159 [2024-07-15 23:00:00.774932] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:16.159 [2024-07-15 23:00:00.891756] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:16.159 [2024-07-15 23:00:00.891761] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:17.102 [2024-07-15 23:00:01.846348] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:31:17.102 23:00:01 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:17.102 23:00:01 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:31:17.102 23:00:01 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:31:17.102 23:00:01 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:17.102 23:00:01 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:17.668 [2024-07-15 
23:00:02.520600] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x27103c0 PMD being used: compress_qat 00:31:17.668 23:00:02 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:17.668 23:00:02 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:17.668 23:00:02 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:17.668 23:00:02 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:17.668 23:00:02 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:17.668 23:00:02 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:17.668 23:00:02 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:17.925 23:00:02 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:18.183 [ 00:31:18.183 { 00:31:18.183 "name": "Nvme0n1", 00:31:18.183 "aliases": [ 00:31:18.183 "01000000-0000-0000-5cd2-e43197705251" 00:31:18.183 ], 00:31:18.183 "product_name": "NVMe disk", 00:31:18.183 "block_size": 512, 00:31:18.183 "num_blocks": 15002931888, 00:31:18.183 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:18.183 "assigned_rate_limits": { 00:31:18.183 "rw_ios_per_sec": 0, 00:31:18.183 "rw_mbytes_per_sec": 0, 00:31:18.183 "r_mbytes_per_sec": 0, 00:31:18.183 "w_mbytes_per_sec": 0 00:31:18.183 }, 00:31:18.183 "claimed": false, 00:31:18.183 "zoned": false, 00:31:18.183 "supported_io_types": { 00:31:18.183 "read": true, 00:31:18.183 "write": true, 00:31:18.183 "unmap": true, 00:31:18.183 "flush": true, 00:31:18.183 "reset": true, 00:31:18.183 "nvme_admin": true, 00:31:18.183 "nvme_io": true, 00:31:18.184 "nvme_io_md": false, 00:31:18.184 "write_zeroes": true, 00:31:18.184 "zcopy": false, 00:31:18.184 "get_zone_info": false, 00:31:18.184 
"zone_management": false, 00:31:18.184 "zone_append": false, 00:31:18.184 "compare": false, 00:31:18.184 "compare_and_write": false, 00:31:18.184 "abort": true, 00:31:18.184 "seek_hole": false, 00:31:18.184 "seek_data": false, 00:31:18.184 "copy": false, 00:31:18.184 "nvme_iov_md": false 00:31:18.184 }, 00:31:18.184 "driver_specific": { 00:31:18.184 "nvme": [ 00:31:18.184 { 00:31:18.184 "pci_address": "0000:5e:00.0", 00:31:18.184 "trid": { 00:31:18.184 "trtype": "PCIe", 00:31:18.184 "traddr": "0000:5e:00.0" 00:31:18.184 }, 00:31:18.184 "ctrlr_data": { 00:31:18.184 "cntlid": 0, 00:31:18.184 "vendor_id": "0x8086", 00:31:18.184 "model_number": "INTEL SSDPF2KX076TZO", 00:31:18.184 "serial_number": "PHAC0301002G7P6CGN", 00:31:18.184 "firmware_revision": "JCV10200", 00:31:18.184 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:18.184 "oacs": { 00:31:18.184 "security": 1, 00:31:18.184 "format": 1, 00:31:18.184 "firmware": 1, 00:31:18.184 "ns_manage": 1 00:31:18.184 }, 00:31:18.184 "multi_ctrlr": false, 00:31:18.184 "ana_reporting": false 00:31:18.184 }, 00:31:18.184 "vs": { 00:31:18.184 "nvme_version": "1.3" 00:31:18.184 }, 00:31:18.184 "ns_data": { 00:31:18.184 "id": 1, 00:31:18.184 "can_share": false 00:31:18.184 }, 00:31:18.184 "security": { 00:31:18.184 "opal": true 00:31:18.184 } 00:31:18.184 } 00:31:18.184 ], 00:31:18.184 "mp_policy": "active_passive" 00:31:18.184 } 00:31:18.184 } 00:31:18.184 ] 00:31:18.184 23:00:03 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:18.184 23:00:03 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:18.442 [2024-07-15 23:00:03.199238] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x25750d0 PMD being used: compress_qat 00:31:20.966 09203c74-0ac0-4d26-b308-1dc68a380ff1 00:31:20.966 23:00:05 compress_compdev -- compress/compress.sh@38 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:20.966 fff1d1ab-8cf5-4667-a1ff-b65416423a48 00:31:20.966 23:00:05 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:20.966 23:00:05 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:20.966 23:00:05 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:20.966 23:00:05 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:20.966 23:00:05 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:20.967 23:00:05 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:20.967 23:00:05 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:21.223 23:00:05 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:21.510 [ 00:31:21.510 { 00:31:21.510 "name": "fff1d1ab-8cf5-4667-a1ff-b65416423a48", 00:31:21.510 "aliases": [ 00:31:21.510 "lvs0/lv0" 00:31:21.510 ], 00:31:21.510 "product_name": "Logical Volume", 00:31:21.510 "block_size": 512, 00:31:21.510 "num_blocks": 204800, 00:31:21.510 "uuid": "fff1d1ab-8cf5-4667-a1ff-b65416423a48", 00:31:21.510 "assigned_rate_limits": { 00:31:21.510 "rw_ios_per_sec": 0, 00:31:21.510 "rw_mbytes_per_sec": 0, 00:31:21.510 "r_mbytes_per_sec": 0, 00:31:21.510 "w_mbytes_per_sec": 0 00:31:21.510 }, 00:31:21.510 "claimed": false, 00:31:21.510 "zoned": false, 00:31:21.510 "supported_io_types": { 00:31:21.510 "read": true, 00:31:21.510 "write": true, 00:31:21.510 "unmap": true, 00:31:21.510 "flush": false, 00:31:21.510 "reset": true, 00:31:21.510 "nvme_admin": false, 00:31:21.510 "nvme_io": false, 00:31:21.510 "nvme_io_md": false, 00:31:21.510 "write_zeroes": true, 00:31:21.510 "zcopy": false, 00:31:21.510 
"get_zone_info": false, 00:31:21.511 "zone_management": false, 00:31:21.511 "zone_append": false, 00:31:21.511 "compare": false, 00:31:21.511 "compare_and_write": false, 00:31:21.511 "abort": false, 00:31:21.511 "seek_hole": true, 00:31:21.511 "seek_data": true, 00:31:21.511 "copy": false, 00:31:21.511 "nvme_iov_md": false 00:31:21.511 }, 00:31:21.511 "driver_specific": { 00:31:21.511 "lvol": { 00:31:21.511 "lvol_store_uuid": "09203c74-0ac0-4d26-b308-1dc68a380ff1", 00:31:21.511 "base_bdev": "Nvme0n1", 00:31:21.511 "thin_provision": true, 00:31:21.511 "num_allocated_clusters": 0, 00:31:21.511 "snapshot": false, 00:31:21.511 "clone": false, 00:31:21.511 "esnap_clone": false 00:31:21.511 } 00:31:21.511 } 00:31:21.511 } 00:31:21.511 ] 00:31:21.511 23:00:06 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:21.511 23:00:06 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:31:21.511 23:00:06 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:31:21.768 [2024-07-15 23:00:06.444177] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:21.768 COMP_lvs0/lv0 00:31:21.768 23:00:06 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:21.768 23:00:06 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:21.768 23:00:06 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:21.768 23:00:06 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:21.768 23:00:06 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:21.768 23:00:06 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:21.768 23:00:06 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_wait_for_examine 00:31:22.026 23:00:06 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:22.026 [ 00:31:22.026 { 00:31:22.026 "name": "COMP_lvs0/lv0", 00:31:22.026 "aliases": [ 00:31:22.026 "7fe8c311-955f-51f9-9f1b-38f0b2f1e85f" 00:31:22.026 ], 00:31:22.026 "product_name": "compress", 00:31:22.026 "block_size": 4096, 00:31:22.026 "num_blocks": 25088, 00:31:22.026 "uuid": "7fe8c311-955f-51f9-9f1b-38f0b2f1e85f", 00:31:22.026 "assigned_rate_limits": { 00:31:22.026 "rw_ios_per_sec": 0, 00:31:22.026 "rw_mbytes_per_sec": 0, 00:31:22.026 "r_mbytes_per_sec": 0, 00:31:22.026 "w_mbytes_per_sec": 0 00:31:22.026 }, 00:31:22.026 "claimed": false, 00:31:22.026 "zoned": false, 00:31:22.026 "supported_io_types": { 00:31:22.026 "read": true, 00:31:22.026 "write": true, 00:31:22.026 "unmap": false, 00:31:22.026 "flush": false, 00:31:22.026 "reset": false, 00:31:22.026 "nvme_admin": false, 00:31:22.026 "nvme_io": false, 00:31:22.026 "nvme_io_md": false, 00:31:22.026 "write_zeroes": true, 00:31:22.026 "zcopy": false, 00:31:22.026 "get_zone_info": false, 00:31:22.026 "zone_management": false, 00:31:22.026 "zone_append": false, 00:31:22.026 "compare": false, 00:31:22.026 "compare_and_write": false, 00:31:22.026 "abort": false, 00:31:22.026 "seek_hole": false, 00:31:22.026 "seek_data": false, 00:31:22.026 "copy": false, 00:31:22.026 "nvme_iov_md": false 00:31:22.026 }, 00:31:22.026 "driver_specific": { 00:31:22.026 "compress": { 00:31:22.026 "name": "COMP_lvs0/lv0", 00:31:22.026 "base_bdev_name": "fff1d1ab-8cf5-4667-a1ff-b65416423a48" 00:31:22.026 } 00:31:22.026 } 00:31:22.026 } 00:31:22.026 ] 00:31:22.026 23:00:06 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:22.026 23:00:06 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:22.285 [2024-07-15 
23:00:06.990921] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fd2201b15c0 PMD being used: compress_qat 00:31:22.285 [2024-07-15 23:00:06.994302] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x270d700 PMD being used: compress_qat 00:31:22.285 Running I/O for 3 seconds... 00:31:25.571 00:31:25.571 Latency(us) 00:31:25.571 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:25.571 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:25.571 Verification LBA range: start 0x0 length 0x3100 00:31:25.571 COMP_lvs0/lv0 : 3.01 1671.72 6.53 0.00 0.00 19040.50 2265.27 18008.15 00:31:25.571 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:25.571 Verification LBA range: start 0x3100 length 0x3100 00:31:25.571 COMP_lvs0/lv0 : 3.01 1780.64 6.96 0.00 0.00 17858.55 1203.87 14816.83 00:31:25.571 =================================================================================================================== 00:31:25.571 Total : 3452.37 13.49 0.00 0.00 18431.11 1203.87 18008.15 00:31:25.571 0 00:31:25.571 23:00:10 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:31:25.571 23:00:10 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:25.571 23:00:10 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:25.828 23:00:10 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:25.828 23:00:10 compress_compdev -- compress/compress.sh@78 -- # killprocess 2874052 00:31:25.828 23:00:10 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2874052 ']' 00:31:25.828 23:00:10 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2874052 00:31:25.828 23:00:10 compress_compdev -- common/autotest_common.sh@953 -- # uname 
00:31:25.828 23:00:10 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:25.828 23:00:10 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2874052 00:31:25.828 23:00:10 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:31:25.828 23:00:10 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:25.828 23:00:10 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2874052' 00:31:25.828 killing process with pid 2874052 00:31:25.828 23:00:10 compress_compdev -- common/autotest_common.sh@967 -- # kill 2874052 00:31:25.828 Received shutdown signal, test time was about 3.000000 seconds 00:31:25.828 00:31:25.828 Latency(us) 00:31:25.828 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:25.828 =================================================================================================================== 00:31:25.828 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:25.828 23:00:10 compress_compdev -- common/autotest_common.sh@972 -- # wait 2874052 00:31:29.121 23:00:13 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:31:29.121 23:00:13 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:31:29.121 23:00:13 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=2876127 00:31:29.121 23:00:13 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:29.121 23:00:13 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:31:29.121 23:00:13 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 2876127 00:31:29.121 23:00:13 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2876127 ']' 00:31:29.121 23:00:13 
compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:29.121 23:00:13 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:29.121 23:00:13 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:29.121 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:29.121 23:00:13 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:29.121 23:00:13 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:31:29.121 [2024-07-15 23:00:13.672170] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:31:29.121 [2024-07-15 23:00:13.672227] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2876127 ] 00:31:29.121 [2024-07-15 23:00:13.783828] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:29.121 [2024-07-15 23:00:13.890863] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:29.121 [2024-07-15 23:00:13.890977] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:29.121 [2024-07-15 23:00:13.890979] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:30.055 [2024-07-15 23:00:14.640981] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:31:30.055 23:00:14 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:30.055 23:00:14 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:31:30.055 23:00:14 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:31:30.055 23:00:14 compress_compdev -- compress/compress.sh@34 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:30.055 23:00:14 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:30.647 [2024-07-15 23:00:15.295752] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x153ff20 PMD being used: compress_qat 00:31:30.647 23:00:15 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:30.647 23:00:15 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:30.647 23:00:15 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:30.647 23:00:15 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:30.647 23:00:15 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:30.647 23:00:15 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:30.647 23:00:15 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:30.906 23:00:15 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:30.906 [ 00:31:30.906 { 00:31:30.906 "name": "Nvme0n1", 00:31:30.906 "aliases": [ 00:31:30.906 "01000000-0000-0000-5cd2-e43197705251" 00:31:30.906 ], 00:31:30.906 "product_name": "NVMe disk", 00:31:30.906 "block_size": 512, 00:31:30.906 "num_blocks": 15002931888, 00:31:30.906 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:30.906 "assigned_rate_limits": { 00:31:30.906 "rw_ios_per_sec": 0, 00:31:30.906 "rw_mbytes_per_sec": 0, 00:31:30.906 "r_mbytes_per_sec": 0, 00:31:30.906 "w_mbytes_per_sec": 0 00:31:30.906 }, 00:31:30.906 "claimed": false, 00:31:30.906 "zoned": false, 00:31:30.906 "supported_io_types": { 00:31:30.906 "read": true, 00:31:30.906 "write": true, 00:31:30.906 "unmap": true, 00:31:30.906 "flush": true, 
00:31:30.906 "reset": true, 00:31:30.906 "nvme_admin": true, 00:31:30.906 "nvme_io": true, 00:31:30.906 "nvme_io_md": false, 00:31:30.906 "write_zeroes": true, 00:31:30.906 "zcopy": false, 00:31:30.906 "get_zone_info": false, 00:31:30.906 "zone_management": false, 00:31:30.906 "zone_append": false, 00:31:30.906 "compare": false, 00:31:30.906 "compare_and_write": false, 00:31:30.906 "abort": true, 00:31:30.906 "seek_hole": false, 00:31:30.906 "seek_data": false, 00:31:30.906 "copy": false, 00:31:30.906 "nvme_iov_md": false 00:31:30.906 }, 00:31:30.906 "driver_specific": { 00:31:30.906 "nvme": [ 00:31:30.906 { 00:31:30.906 "pci_address": "0000:5e:00.0", 00:31:30.906 "trid": { 00:31:30.906 "trtype": "PCIe", 00:31:30.906 "traddr": "0000:5e:00.0" 00:31:30.906 }, 00:31:30.906 "ctrlr_data": { 00:31:30.906 "cntlid": 0, 00:31:30.906 "vendor_id": "0x8086", 00:31:30.906 "model_number": "INTEL SSDPF2KX076TZO", 00:31:30.906 "serial_number": "PHAC0301002G7P6CGN", 00:31:30.906 "firmware_revision": "JCV10200", 00:31:30.906 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:30.906 "oacs": { 00:31:30.906 "security": 1, 00:31:30.906 "format": 1, 00:31:30.906 "firmware": 1, 00:31:30.906 "ns_manage": 1 00:31:30.906 }, 00:31:30.906 "multi_ctrlr": false, 00:31:30.906 "ana_reporting": false 00:31:30.906 }, 00:31:30.906 "vs": { 00:31:30.906 "nvme_version": "1.3" 00:31:30.906 }, 00:31:30.906 "ns_data": { 00:31:30.906 "id": 1, 00:31:30.906 "can_share": false 00:31:30.906 }, 00:31:30.906 "security": { 00:31:30.906 "opal": true 00:31:30.906 } 00:31:30.906 } 00:31:30.906 ], 00:31:30.906 "mp_policy": "active_passive" 00:31:30.906 } 00:31:30.906 } 00:31:30.906 ] 00:31:30.906 23:00:15 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:30.906 23:00:15 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:31.164 [2024-07-15 23:00:15.945274] 
accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x138e390 PMD being used: compress_qat 00:31:33.710 9ae0c53f-8f45-4291-9175-cea6af177529 00:31:33.710 23:00:18 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:33.710 1c6858c7-deb2-450e-b805-4661299e2ca7 00:31:33.710 23:00:18 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:33.710 23:00:18 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:33.710 23:00:18 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:33.710 23:00:18 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:33.710 23:00:18 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:33.710 23:00:18 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:33.710 23:00:18 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:33.710 23:00:18 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:33.968 [ 00:31:33.968 { 00:31:33.968 "name": "1c6858c7-deb2-450e-b805-4661299e2ca7", 00:31:33.968 "aliases": [ 00:31:33.968 "lvs0/lv0" 00:31:33.968 ], 00:31:33.968 "product_name": "Logical Volume", 00:31:33.968 "block_size": 512, 00:31:33.968 "num_blocks": 204800, 00:31:33.968 "uuid": "1c6858c7-deb2-450e-b805-4661299e2ca7", 00:31:33.968 "assigned_rate_limits": { 00:31:33.968 "rw_ios_per_sec": 0, 00:31:33.968 "rw_mbytes_per_sec": 0, 00:31:33.968 "r_mbytes_per_sec": 0, 00:31:33.968 "w_mbytes_per_sec": 0 00:31:33.968 }, 00:31:33.968 "claimed": false, 00:31:33.968 "zoned": false, 00:31:33.968 "supported_io_types": { 00:31:33.968 "read": true, 00:31:33.968 "write": true, 00:31:33.968 "unmap": true, 00:31:33.968 "flush": 
false, 00:31:33.968 "reset": true, 00:31:33.968 "nvme_admin": false, 00:31:33.968 "nvme_io": false, 00:31:33.968 "nvme_io_md": false, 00:31:33.968 "write_zeroes": true, 00:31:33.968 "zcopy": false, 00:31:33.968 "get_zone_info": false, 00:31:33.968 "zone_management": false, 00:31:33.968 "zone_append": false, 00:31:33.968 "compare": false, 00:31:33.968 "compare_and_write": false, 00:31:33.968 "abort": false, 00:31:33.968 "seek_hole": true, 00:31:33.968 "seek_data": true, 00:31:33.968 "copy": false, 00:31:33.968 "nvme_iov_md": false 00:31:33.968 }, 00:31:33.968 "driver_specific": { 00:31:33.968 "lvol": { 00:31:33.968 "lvol_store_uuid": "9ae0c53f-8f45-4291-9175-cea6af177529", 00:31:33.968 "base_bdev": "Nvme0n1", 00:31:33.968 "thin_provision": true, 00:31:33.968 "num_allocated_clusters": 0, 00:31:33.968 "snapshot": false, 00:31:33.968 "clone": false, 00:31:33.968 "esnap_clone": false 00:31:33.968 } 00:31:33.968 } 00:31:33.968 } 00:31:33.968 ] 00:31:33.968 23:00:18 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:33.968 23:00:18 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:33.968 23:00:18 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:34.226 [2024-07-15 23:00:18.969111] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:34.226 COMP_lvs0/lv0 00:31:34.226 23:00:18 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:34.226 23:00:18 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:34.226 23:00:18 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:34.226 23:00:18 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:34.226 23:00:18 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:34.226 23:00:19 compress_compdev -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:34.226 23:00:19 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:34.484 23:00:19 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:34.484 [ 00:31:34.484 { 00:31:34.484 "name": "COMP_lvs0/lv0", 00:31:34.484 "aliases": [ 00:31:34.484 "51346082-ea56-5fe9-9b53-4433f98f8366" 00:31:34.484 ], 00:31:34.484 "product_name": "compress", 00:31:34.484 "block_size": 512, 00:31:34.484 "num_blocks": 200704, 00:31:34.484 "uuid": "51346082-ea56-5fe9-9b53-4433f98f8366", 00:31:34.484 "assigned_rate_limits": { 00:31:34.484 "rw_ios_per_sec": 0, 00:31:34.484 "rw_mbytes_per_sec": 0, 00:31:34.484 "r_mbytes_per_sec": 0, 00:31:34.484 "w_mbytes_per_sec": 0 00:31:34.484 }, 00:31:34.484 "claimed": false, 00:31:34.484 "zoned": false, 00:31:34.484 "supported_io_types": { 00:31:34.484 "read": true, 00:31:34.484 "write": true, 00:31:34.484 "unmap": false, 00:31:34.484 "flush": false, 00:31:34.484 "reset": false, 00:31:34.484 "nvme_admin": false, 00:31:34.484 "nvme_io": false, 00:31:34.484 "nvme_io_md": false, 00:31:34.484 "write_zeroes": true, 00:31:34.484 "zcopy": false, 00:31:34.484 "get_zone_info": false, 00:31:34.484 "zone_management": false, 00:31:34.484 "zone_append": false, 00:31:34.484 "compare": false, 00:31:34.484 "compare_and_write": false, 00:31:34.484 "abort": false, 00:31:34.484 "seek_hole": false, 00:31:34.484 "seek_data": false, 00:31:34.484 "copy": false, 00:31:34.484 "nvme_iov_md": false 00:31:34.484 }, 00:31:34.484 "driver_specific": { 00:31:34.484 "compress": { 00:31:34.484 "name": "COMP_lvs0/lv0", 00:31:34.484 "base_bdev_name": "1c6858c7-deb2-450e-b805-4661299e2ca7" 00:31:34.484 } 00:31:34.484 } 00:31:34.484 } 00:31:34.484 ] 00:31:34.484 23:00:19 compress_compdev -- common/autotest_common.sh@905 -- # return 0 
00:31:34.484 23:00:19 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:31:34.743 [2024-07-15 23:00:19.577913] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fd86c1b1350 PMD being used: compress_qat 00:31:34.743 I/O targets: 00:31:34.743 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:31:34.743 00:31:34.743 00:31:34.743 CUnit - A unit testing framework for C - Version 2.1-3 00:31:34.743 http://cunit.sourceforge.net/ 00:31:34.743 00:31:34.743 00:31:34.743 Suite: bdevio tests on: COMP_lvs0/lv0 00:31:34.743 Test: blockdev write read block ...passed 00:31:34.743 Test: blockdev write zeroes read block ...passed 00:31:34.743 Test: blockdev write zeroes read no split ...passed 00:31:34.743 Test: blockdev write zeroes read split ...passed 00:31:35.003 Test: blockdev write zeroes read split partial ...passed 00:31:35.003 Test: blockdev reset ...[2024-07-15 23:00:19.682801] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:31:35.003 passed 00:31:35.003 Test: blockdev write read 8 blocks ...passed 00:31:35.003 Test: blockdev write read size > 128k ...passed 00:31:35.003 Test: blockdev write read invalid size ...passed 00:31:35.003 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:35.003 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:35.003 Test: blockdev write read max offset ...passed 00:31:35.003 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:35.003 Test: blockdev writev readv 8 blocks ...passed 00:31:35.003 Test: blockdev writev readv 30 x 1block ...passed 00:31:35.003 Test: blockdev writev readv block ...passed 00:31:35.003 Test: blockdev writev readv size > 128k ...passed 00:31:35.003 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:35.003 Test: blockdev comparev and writev ...passed 00:31:35.003 Test: blockdev nvme 
passthru rw ...passed 00:31:35.003 Test: blockdev nvme passthru vendor specific ...passed 00:31:35.003 Test: blockdev nvme admin passthru ...passed 00:31:35.003 Test: blockdev copy ...passed 00:31:35.003 00:31:35.003 Run Summary: Type Total Ran Passed Failed Inactive 00:31:35.003 suites 1 1 n/a 0 0 00:31:35.003 tests 23 23 23 0 0 00:31:35.003 asserts 130 130 130 0 n/a 00:31:35.003 00:31:35.003 Elapsed time = 0.239 seconds 00:31:35.003 0 00:31:35.003 23:00:19 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:31:35.003 23:00:19 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:35.003 23:00:19 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:35.261 23:00:20 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:31:35.261 23:00:20 compress_compdev -- compress/compress.sh@62 -- # killprocess 2876127 00:31:35.261 23:00:20 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2876127 ']' 00:31:35.261 23:00:20 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2876127 00:31:35.261 23:00:20 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:31:35.261 23:00:20 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:35.261 23:00:20 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2876127 00:31:35.261 23:00:20 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:35.261 23:00:20 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:35.261 23:00:20 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2876127' 00:31:35.261 killing process with pid 2876127 00:31:35.261 23:00:20 compress_compdev -- common/autotest_common.sh@967 -- # kill 2876127 00:31:35.261 
23:00:20 compress_compdev -- common/autotest_common.sh@972 -- # wait 2876127 00:31:38.539 23:00:23 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:31:38.539 23:00:23 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:31:38.539 00:31:38.539 real 0m49.107s 00:31:38.539 user 1m51.911s 00:31:38.539 sys 0m6.230s 00:31:38.539 23:00:23 compress_compdev -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:38.539 23:00:23 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:31:38.539 ************************************ 00:31:38.539 END TEST compress_compdev 00:31:38.539 ************************************ 00:31:38.539 23:00:23 -- common/autotest_common.sh@1142 -- # return 0 00:31:38.539 23:00:23 -- spdk/autotest.sh@349 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:31:38.539 23:00:23 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:38.539 23:00:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:38.539 23:00:23 -- common/autotest_common.sh@10 -- # set +x 00:31:38.539 ************************************ 00:31:38.539 START TEST compress_isal 00:31:38.539 ************************************ 00:31:38.539 23:00:23 compress_isal -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:31:38.539 * Looking for test storage... 
00:31:38.539 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:31:38.539 23:00:23 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:31:38.539 23:00:23 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:31:38.539 23:00:23 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:31:38.539 23:00:23 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:31:38.539 23:00:23 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:31:38.539 23:00:23 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:31:38.539 23:00:23 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:31:38.539 23:00:23 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:31:38.539 23:00:23 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:31:38.539 23:00:23 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:31:38.539 23:00:23 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:31:38.539 23:00:23 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:31:38.539 23:00:23 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:31:38.539 23:00:23 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:31:38.539 23:00:23 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:31:38.539 23:00:23 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:31:38.539 23:00:23 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:31:38.539 23:00:23 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:31:38.539 23:00:23 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:31:38.539 23:00:23 compress_isal -- 
scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:31:38.539 23:00:23 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:31:38.539 23:00:23 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:31:38.539 23:00:23 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:38.540 23:00:23 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:38.540 23:00:23 compress_isal -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:38.540 23:00:23 compress_isal -- paths/export.sh@5 -- # export PATH 00:31:38.540 23:00:23 compress_isal -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:38.540 23:00:23 compress_isal -- nvmf/common.sh@47 -- # : 0 00:31:38.540 23:00:23 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:31:38.540 23:00:23 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:31:38.540 23:00:23 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:31:38.540 23:00:23 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:31:38.540 23:00:23 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:31:38.540 23:00:23 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:31:38.540 23:00:23 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:31:38.540 23:00:23 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:31:38.540 23:00:23 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:38.540 23:00:23 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:31:38.540 23:00:23 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:31:38.540 23:00:23 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:31:38.540 23:00:23 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:38.540 23:00:23 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=2877438 00:31:38.540 23:00:23 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:38.540 23:00:23 compress_isal -- 
compress/compress.sh@73 -- # waitforlisten 2877438 00:31:38.540 23:00:23 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2877438 ']' 00:31:38.540 23:00:23 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:31:38.540 23:00:23 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:38.540 23:00:23 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:38.540 23:00:23 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:38.540 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:38.540 23:00:23 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:38.540 23:00:23 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:38.797 [2024-07-15 23:00:23.469020] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:31:38.797 [2024-07-15 23:00:23.469086] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2877438 ] 00:31:38.797 [2024-07-15 23:00:23.589866] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:39.054 [2024-07-15 23:00:23.719845] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:39.054 [2024-07-15 23:00:23.719852] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:39.619 23:00:24 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:39.619 23:00:24 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:39.619 23:00:24 compress_isal -- compress/compress.sh@74 -- # create_vols 00:31:39.619 23:00:24 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:39.619 23:00:24 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:40.184 23:00:24 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:40.184 23:00:24 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:40.184 23:00:24 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:40.184 23:00:24 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:40.184 23:00:24 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:40.184 23:00:24 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:40.184 23:00:24 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:40.441 23:00:25 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:41.006 [ 00:31:41.006 { 00:31:41.006 "name": "Nvme0n1", 00:31:41.006 "aliases": [ 00:31:41.006 "01000000-0000-0000-5cd2-e43197705251" 00:31:41.006 ], 00:31:41.006 "product_name": "NVMe disk", 00:31:41.006 "block_size": 512, 00:31:41.006 "num_blocks": 15002931888, 00:31:41.006 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:41.006 "assigned_rate_limits": { 00:31:41.006 "rw_ios_per_sec": 0, 00:31:41.006 "rw_mbytes_per_sec": 0, 00:31:41.006 "r_mbytes_per_sec": 0, 00:31:41.006 "w_mbytes_per_sec": 0 00:31:41.006 }, 00:31:41.006 "claimed": false, 00:31:41.006 "zoned": false, 00:31:41.006 "supported_io_types": { 00:31:41.006 "read": true, 00:31:41.006 "write": true, 00:31:41.006 "unmap": true, 00:31:41.006 "flush": true, 00:31:41.006 "reset": true, 00:31:41.006 "nvme_admin": true, 00:31:41.006 "nvme_io": true, 00:31:41.006 "nvme_io_md": false, 00:31:41.006 "write_zeroes": true, 00:31:41.006 "zcopy": false, 00:31:41.006 "get_zone_info": false, 00:31:41.006 "zone_management": false, 00:31:41.006 "zone_append": false, 00:31:41.006 "compare": false, 00:31:41.006 "compare_and_write": false, 00:31:41.006 "abort": true, 00:31:41.006 "seek_hole": false, 00:31:41.006 "seek_data": false, 00:31:41.006 "copy": false, 00:31:41.006 "nvme_iov_md": false 00:31:41.006 }, 00:31:41.006 "driver_specific": { 00:31:41.006 "nvme": [ 00:31:41.006 { 00:31:41.006 "pci_address": "0000:5e:00.0", 00:31:41.006 "trid": { 00:31:41.006 "trtype": "PCIe", 00:31:41.006 "traddr": "0000:5e:00.0" 00:31:41.006 }, 00:31:41.006 "ctrlr_data": { 00:31:41.006 "cntlid": 0, 00:31:41.006 "vendor_id": "0x8086", 00:31:41.006 "model_number": "INTEL SSDPF2KX076TZO", 00:31:41.006 "serial_number": "PHAC0301002G7P6CGN", 00:31:41.006 "firmware_revision": "JCV10200", 00:31:41.006 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:41.006 "oacs": { 00:31:41.006 "security": 1, 00:31:41.006 "format": 1, 00:31:41.006 "firmware": 1, 00:31:41.006 "ns_manage": 1 00:31:41.006 }, 
00:31:41.006 "multi_ctrlr": false, 00:31:41.007 "ana_reporting": false 00:31:41.007 }, 00:31:41.007 "vs": { 00:31:41.007 "nvme_version": "1.3" 00:31:41.007 }, 00:31:41.007 "ns_data": { 00:31:41.007 "id": 1, 00:31:41.007 "can_share": false 00:31:41.007 }, 00:31:41.007 "security": { 00:31:41.007 "opal": true 00:31:41.007 } 00:31:41.007 } 00:31:41.007 ], 00:31:41.007 "mp_policy": "active_passive" 00:31:41.007 } 00:31:41.007 } 00:31:41.007 ] 00:31:41.007 23:00:25 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:41.007 23:00:25 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:43.539 70189f78-d8dc-47c2-b69e-5203bfa30b19 00:31:43.539 23:00:28 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:43.801 68110ee3-81f9-4a02-ad5d-93f7bbfc4d22 00:31:43.801 23:00:28 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:43.801 23:00:28 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:43.801 23:00:28 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:43.801 23:00:28 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:43.801 23:00:28 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:43.801 23:00:28 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:43.801 23:00:28 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:44.059 23:00:28 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:44.317 [ 00:31:44.317 { 00:31:44.317 "name": "68110ee3-81f9-4a02-ad5d-93f7bbfc4d22", 00:31:44.317 "aliases": [ 00:31:44.317 "lvs0/lv0" 
00:31:44.317 ], 00:31:44.317 "product_name": "Logical Volume", 00:31:44.317 "block_size": 512, 00:31:44.317 "num_blocks": 204800, 00:31:44.317 "uuid": "68110ee3-81f9-4a02-ad5d-93f7bbfc4d22", 00:31:44.317 "assigned_rate_limits": { 00:31:44.317 "rw_ios_per_sec": 0, 00:31:44.317 "rw_mbytes_per_sec": 0, 00:31:44.317 "r_mbytes_per_sec": 0, 00:31:44.317 "w_mbytes_per_sec": 0 00:31:44.317 }, 00:31:44.317 "claimed": false, 00:31:44.317 "zoned": false, 00:31:44.317 "supported_io_types": { 00:31:44.317 "read": true, 00:31:44.317 "write": true, 00:31:44.317 "unmap": true, 00:31:44.317 "flush": false, 00:31:44.317 "reset": true, 00:31:44.317 "nvme_admin": false, 00:31:44.317 "nvme_io": false, 00:31:44.317 "nvme_io_md": false, 00:31:44.317 "write_zeroes": true, 00:31:44.317 "zcopy": false, 00:31:44.317 "get_zone_info": false, 00:31:44.317 "zone_management": false, 00:31:44.317 "zone_append": false, 00:31:44.317 "compare": false, 00:31:44.317 "compare_and_write": false, 00:31:44.317 "abort": false, 00:31:44.317 "seek_hole": true, 00:31:44.317 "seek_data": true, 00:31:44.317 "copy": false, 00:31:44.317 "nvme_iov_md": false 00:31:44.317 }, 00:31:44.317 "driver_specific": { 00:31:44.317 "lvol": { 00:31:44.317 "lvol_store_uuid": "70189f78-d8dc-47c2-b69e-5203bfa30b19", 00:31:44.317 "base_bdev": "Nvme0n1", 00:31:44.317 "thin_provision": true, 00:31:44.317 "num_allocated_clusters": 0, 00:31:44.317 "snapshot": false, 00:31:44.317 "clone": false, 00:31:44.317 "esnap_clone": false 00:31:44.317 } 00:31:44.317 } 00:31:44.317 } 00:31:44.317 ] 00:31:44.317 23:00:29 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:44.317 23:00:29 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:44.317 23:00:29 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:44.317 [2024-07-15 23:00:29.212232] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered 
io_device and virtual bdev for: COMP_lvs0/lv0 00:31:44.317 COMP_lvs0/lv0 00:31:44.576 23:00:29 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:44.576 23:00:29 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:44.576 23:00:29 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:44.576 23:00:29 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:44.576 23:00:29 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:44.576 23:00:29 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:44.576 23:00:29 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:44.834 23:00:29 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:44.834 [ 00:31:44.834 { 00:31:44.834 "name": "COMP_lvs0/lv0", 00:31:44.834 "aliases": [ 00:31:44.834 "690752cc-a1b9-5136-8818-bdb2940d87b5" 00:31:44.834 ], 00:31:44.834 "product_name": "compress", 00:31:44.834 "block_size": 512, 00:31:44.834 "num_blocks": 200704, 00:31:44.834 "uuid": "690752cc-a1b9-5136-8818-bdb2940d87b5", 00:31:44.834 "assigned_rate_limits": { 00:31:44.834 "rw_ios_per_sec": 0, 00:31:44.834 "rw_mbytes_per_sec": 0, 00:31:44.834 "r_mbytes_per_sec": 0, 00:31:44.834 "w_mbytes_per_sec": 0 00:31:44.834 }, 00:31:44.834 "claimed": false, 00:31:44.834 "zoned": false, 00:31:44.834 "supported_io_types": { 00:31:44.834 "read": true, 00:31:44.834 "write": true, 00:31:44.834 "unmap": false, 00:31:44.834 "flush": false, 00:31:44.834 "reset": false, 00:31:44.834 "nvme_admin": false, 00:31:44.834 "nvme_io": false, 00:31:44.834 "nvme_io_md": false, 00:31:44.834 "write_zeroes": true, 00:31:44.834 "zcopy": false, 00:31:44.834 "get_zone_info": false, 00:31:44.834 "zone_management": false, 00:31:44.834 "zone_append": 
false, 00:31:44.834 "compare": false, 00:31:44.834 "compare_and_write": false, 00:31:44.834 "abort": false, 00:31:44.834 "seek_hole": false, 00:31:44.834 "seek_data": false, 00:31:44.834 "copy": false, 00:31:44.834 "nvme_iov_md": false 00:31:44.834 }, 00:31:44.834 "driver_specific": { 00:31:44.834 "compress": { 00:31:44.834 "name": "COMP_lvs0/lv0", 00:31:44.834 "base_bdev_name": "68110ee3-81f9-4a02-ad5d-93f7bbfc4d22" 00:31:44.834 } 00:31:44.834 } 00:31:44.834 } 00:31:44.834 ] 00:31:45.093 23:00:29 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:45.093 23:00:29 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:45.093 Running I/O for 3 seconds... 00:31:48.381 00:31:48.381 Latency(us) 00:31:48.381 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:48.381 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:48.381 Verification LBA range: start 0x0 length 0x3100 00:31:48.381 COMP_lvs0/lv0 : 3.02 1272.07 4.97 0.00 0.00 25026.80 2350.75 21427.42 00:31:48.381 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:48.381 Verification LBA range: start 0x3100 length 0x3100 00:31:48.381 COMP_lvs0/lv0 : 3.01 1274.68 4.98 0.00 0.00 24961.32 1467.44 20401.64 00:31:48.381 =================================================================================================================== 00:31:48.381 Total : 2546.74 9.95 0.00 0.00 24994.04 1467.44 21427.42 00:31:48.381 0 00:31:48.381 23:00:32 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:31:48.381 23:00:32 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:48.381 23:00:33 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 
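The run above follows a fixed RPC sequence from compress.sh: load the NVMe subsystem config, build lvstore `lvs0` on `Nvme0n1`, carve a thin-provisioned lvol, wrap it as a compress bdev backed by `/tmp/pmem`, drive it with bdevperf's `perform_tests`, then tear down in reverse order. A dry-run sketch of that sequence (commands are collected and printed, not executed, so nothing here talks to a live SPDK target; the rpc.py path and the 100 MiB lvol size are taken from the log above):

```shell
#!/bin/sh
# Dry-run sketch of the create_vols/destroy_vols RPC sequence seen in this log.
# Commands are appended to a string and printed rather than run, so the sketch
# works without an SPDK target listening on /var/tmp/spdk.sock.
RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py"
SEQ=""
rpc() { SEQ="${SEQ}${RPC} $*
"; }

rpc load_subsystem_config                          # attach Nvme0n1 (config from gen_nvme.sh)
rpc bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
rpc bdev_lvol_create -t -l lvs0 lv0 100            # 100 MiB thin-provisioned lvol
rpc bdev_compress_create -b lvs0/lv0 -p /tmp/pmem  # exposes COMP_lvs0/lv0
# ... bdevperf.py perform_tests runs the 3 s verify workload here, then teardown:
rpc bdev_compress_delete COMP_lvs0/lv0
rpc bdev_lvol_delete_lvstore -l lvs0

printf '%s' "$SEQ"
```

The later iterations in the log repeat this same sequence; judging from the `create_vols 512` and `run_bdevperf 32 4096 3 4096` calls, they differ only in passing `-l 512` (and then `-l 4096`) to `bdev_compress_create`.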
00:31:48.639 23:00:33 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:48.639 23:00:33 compress_isal -- compress/compress.sh@78 -- # killprocess 2877438 00:31:48.639 23:00:33 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2877438 ']' 00:31:48.639 23:00:33 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2877438 00:31:48.639 23:00:33 compress_isal -- common/autotest_common.sh@953 -- # uname 00:31:48.639 23:00:33 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:48.639 23:00:33 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2877438 00:31:48.639 23:00:33 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:31:48.639 23:00:33 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:48.639 23:00:33 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2877438' 00:31:48.639 killing process with pid 2877438 00:31:48.639 23:00:33 compress_isal -- common/autotest_common.sh@967 -- # kill 2877438 00:31:48.639 Received shutdown signal, test time was about 3.000000 seconds 00:31:48.639 00:31:48.639 Latency(us) 00:31:48.639 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:48.639 =================================================================================================================== 00:31:48.639 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:48.639 23:00:33 compress_isal -- common/autotest_common.sh@972 -- # wait 2877438 00:31:51.923 23:00:36 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:31:51.923 23:00:36 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:51.923 23:00:36 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=2879198 00:31:51.923 23:00:36 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:51.923 
23:00:36 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:31:51.923 23:00:36 compress_isal -- compress/compress.sh@73 -- # waitforlisten 2879198 00:31:51.923 23:00:36 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2879198 ']' 00:31:51.923 23:00:36 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:51.923 23:00:36 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:51.923 23:00:36 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:51.923 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:51.923 23:00:36 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:51.923 23:00:36 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:51.923 [2024-07-15 23:00:36.587990] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:31:51.923 [2024-07-15 23:00:36.588067] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2879198 ] 00:31:51.923 [2024-07-15 23:00:36.724034] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:52.181 [2024-07-15 23:00:36.841968] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:52.181 [2024-07-15 23:00:36.841975] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:52.754 23:00:37 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:52.754 23:00:37 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:52.754 23:00:37 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:31:52.754 23:00:37 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:52.754 23:00:37 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:53.322 23:00:38 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:53.322 23:00:38 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:53.322 23:00:38 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:53.322 23:00:38 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:53.322 23:00:38 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:53.322 23:00:38 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:53.322 23:00:38 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:53.581 23:00:38 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:53.889 [ 00:31:53.889 { 00:31:53.889 "name": "Nvme0n1", 00:31:53.889 "aliases": [ 00:31:53.889 "01000000-0000-0000-5cd2-e43197705251" 00:31:53.889 ], 00:31:53.889 "product_name": "NVMe disk", 00:31:53.889 "block_size": 512, 00:31:53.889 "num_blocks": 15002931888, 00:31:53.889 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:53.889 "assigned_rate_limits": { 00:31:53.889 "rw_ios_per_sec": 0, 00:31:53.889 "rw_mbytes_per_sec": 0, 00:31:53.889 "r_mbytes_per_sec": 0, 00:31:53.889 "w_mbytes_per_sec": 0 00:31:53.889 }, 00:31:53.889 "claimed": false, 00:31:53.889 "zoned": false, 00:31:53.889 "supported_io_types": { 00:31:53.889 "read": true, 00:31:53.889 "write": true, 00:31:53.889 "unmap": true, 00:31:53.889 "flush": true, 00:31:53.889 "reset": true, 00:31:53.889 "nvme_admin": true, 00:31:53.889 "nvme_io": true, 00:31:53.889 "nvme_io_md": false, 00:31:53.889 "write_zeroes": true, 00:31:53.889 "zcopy": false, 00:31:53.889 "get_zone_info": false, 00:31:53.889 "zone_management": false, 00:31:53.889 "zone_append": false, 00:31:53.889 "compare": false, 00:31:53.889 "compare_and_write": false, 00:31:53.889 "abort": true, 00:31:53.889 "seek_hole": false, 00:31:53.889 "seek_data": false, 00:31:53.889 "copy": false, 00:31:53.889 "nvme_iov_md": false 00:31:53.889 }, 00:31:53.889 "driver_specific": { 00:31:53.889 "nvme": [ 00:31:53.889 { 00:31:53.889 "pci_address": "0000:5e:00.0", 00:31:53.889 "trid": { 00:31:53.889 "trtype": "PCIe", 00:31:53.889 "traddr": "0000:5e:00.0" 00:31:53.889 }, 00:31:53.889 "ctrlr_data": { 00:31:53.889 "cntlid": 0, 00:31:53.889 "vendor_id": "0x8086", 00:31:53.889 "model_number": "INTEL SSDPF2KX076TZO", 00:31:53.889 "serial_number": "PHAC0301002G7P6CGN", 00:31:53.889 "firmware_revision": "JCV10200", 00:31:53.889 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:53.889 "oacs": { 00:31:53.889 "security": 1, 00:31:53.889 "format": 1, 00:31:53.889 "firmware": 1, 00:31:53.889 "ns_manage": 1 00:31:53.889 }, 
00:31:53.889 "multi_ctrlr": false, 00:31:53.889 "ana_reporting": false 00:31:53.889 }, 00:31:53.889 "vs": { 00:31:53.889 "nvme_version": "1.3" 00:31:53.889 }, 00:31:53.889 "ns_data": { 00:31:53.889 "id": 1, 00:31:53.889 "can_share": false 00:31:53.889 }, 00:31:53.889 "security": { 00:31:53.889 "opal": true 00:31:53.889 } 00:31:53.889 } 00:31:53.889 ], 00:31:53.889 "mp_policy": "active_passive" 00:31:53.889 } 00:31:53.889 } 00:31:53.889 ] 00:31:53.889 23:00:38 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:53.889 23:00:38 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:56.427 9ce29930-d5aa-4efd-8ff5-ebbedfd1298f 00:31:56.427 23:00:41 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:56.685 7a38bc31-3f8d-43b0-a7c7-95541c603df6 00:31:56.685 23:00:41 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:56.685 23:00:41 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:56.685 23:00:41 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:56.685 23:00:41 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:56.685 23:00:41 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:56.685 23:00:41 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:56.685 23:00:41 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:56.943 23:00:41 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:57.200 [ 00:31:57.200 { 00:31:57.200 "name": "7a38bc31-3f8d-43b0-a7c7-95541c603df6", 00:31:57.200 "aliases": [ 00:31:57.200 "lvs0/lv0" 
00:31:57.200 ], 00:31:57.200 "product_name": "Logical Volume", 00:31:57.200 "block_size": 512, 00:31:57.200 "num_blocks": 204800, 00:31:57.200 "uuid": "7a38bc31-3f8d-43b0-a7c7-95541c603df6", 00:31:57.200 "assigned_rate_limits": { 00:31:57.200 "rw_ios_per_sec": 0, 00:31:57.200 "rw_mbytes_per_sec": 0, 00:31:57.200 "r_mbytes_per_sec": 0, 00:31:57.200 "w_mbytes_per_sec": 0 00:31:57.200 }, 00:31:57.200 "claimed": false, 00:31:57.200 "zoned": false, 00:31:57.200 "supported_io_types": { 00:31:57.200 "read": true, 00:31:57.200 "write": true, 00:31:57.200 "unmap": true, 00:31:57.200 "flush": false, 00:31:57.200 "reset": true, 00:31:57.200 "nvme_admin": false, 00:31:57.200 "nvme_io": false, 00:31:57.200 "nvme_io_md": false, 00:31:57.200 "write_zeroes": true, 00:31:57.200 "zcopy": false, 00:31:57.200 "get_zone_info": false, 00:31:57.200 "zone_management": false, 00:31:57.200 "zone_append": false, 00:31:57.200 "compare": false, 00:31:57.200 "compare_and_write": false, 00:31:57.200 "abort": false, 00:31:57.200 "seek_hole": true, 00:31:57.200 "seek_data": true, 00:31:57.200 "copy": false, 00:31:57.200 "nvme_iov_md": false 00:31:57.200 }, 00:31:57.200 "driver_specific": { 00:31:57.200 "lvol": { 00:31:57.200 "lvol_store_uuid": "9ce29930-d5aa-4efd-8ff5-ebbedfd1298f", 00:31:57.200 "base_bdev": "Nvme0n1", 00:31:57.200 "thin_provision": true, 00:31:57.200 "num_allocated_clusters": 0, 00:31:57.200 "snapshot": false, 00:31:57.200 "clone": false, 00:31:57.200 "esnap_clone": false 00:31:57.200 } 00:31:57.200 } 00:31:57.200 } 00:31:57.200 ] 00:31:57.200 23:00:41 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:57.200 23:00:41 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:31:57.200 23:00:41 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:31:57.457 [2024-07-15 23:00:42.124413] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered 
io_device and virtual bdev for: COMP_lvs0/lv0 00:31:57.457 COMP_lvs0/lv0 00:31:57.457 23:00:42 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:57.457 23:00:42 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:57.457 23:00:42 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:57.457 23:00:42 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:57.457 23:00:42 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:57.457 23:00:42 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:57.457 23:00:42 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:57.716 23:00:42 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:57.974 [ 00:31:57.974 { 00:31:57.974 "name": "COMP_lvs0/lv0", 00:31:57.974 "aliases": [ 00:31:57.974 "f7cb9483-36dd-5f15-8039-598831597ce0" 00:31:57.974 ], 00:31:57.974 "product_name": "compress", 00:31:57.974 "block_size": 512, 00:31:57.974 "num_blocks": 200704, 00:31:57.974 "uuid": "f7cb9483-36dd-5f15-8039-598831597ce0", 00:31:57.974 "assigned_rate_limits": { 00:31:57.974 "rw_ios_per_sec": 0, 00:31:57.974 "rw_mbytes_per_sec": 0, 00:31:57.974 "r_mbytes_per_sec": 0, 00:31:57.974 "w_mbytes_per_sec": 0 00:31:57.974 }, 00:31:57.974 "claimed": false, 00:31:57.974 "zoned": false, 00:31:57.974 "supported_io_types": { 00:31:57.974 "read": true, 00:31:57.974 "write": true, 00:31:57.974 "unmap": false, 00:31:57.974 "flush": false, 00:31:57.974 "reset": false, 00:31:57.974 "nvme_admin": false, 00:31:57.974 "nvme_io": false, 00:31:57.974 "nvme_io_md": false, 00:31:57.974 "write_zeroes": true, 00:31:57.974 "zcopy": false, 00:31:57.974 "get_zone_info": false, 00:31:57.974 "zone_management": false, 00:31:57.974 "zone_append": 
false, 00:31:57.974 "compare": false, 00:31:57.974 "compare_and_write": false, 00:31:57.974 "abort": false, 00:31:57.974 "seek_hole": false, 00:31:57.974 "seek_data": false, 00:31:57.974 "copy": false, 00:31:57.974 "nvme_iov_md": false 00:31:57.974 }, 00:31:57.974 "driver_specific": { 00:31:57.974 "compress": { 00:31:57.974 "name": "COMP_lvs0/lv0", 00:31:57.974 "base_bdev_name": "7a38bc31-3f8d-43b0-a7c7-95541c603df6" 00:31:57.974 } 00:31:57.974 } 00:31:57.974 } 00:31:57.974 ] 00:31:57.974 23:00:42 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:57.974 23:00:42 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:57.974 Running I/O for 3 seconds... 00:32:01.254 00:32:01.254 Latency(us) 00:32:01.254 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:01.254 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:32:01.254 Verification LBA range: start 0x0 length 0x3100 00:32:01.254 COMP_lvs0/lv0 : 3.01 2110.16 8.24 0.00 0.00 15077.70 1097.02 12879.25 00:32:01.254 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:32:01.254 Verification LBA range: start 0x3100 length 0x3100 00:32:01.254 COMP_lvs0/lv0 : 3.01 2106.24 8.23 0.00 0.00 15060.21 1389.08 13392.14 00:32:01.254 =================================================================================================================== 00:32:01.254 Total : 4216.41 16.47 0.00 0.00 15068.96 1097.02 13392.14 00:32:01.254 0 00:32:01.254 23:00:45 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:32:01.254 23:00:45 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:32:01.254 23:00:46 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 
00:32:01.512 23:00:46 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:32:01.512 23:00:46 compress_isal -- compress/compress.sh@78 -- # killprocess 2879198 00:32:01.512 23:00:46 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2879198 ']' 00:32:01.512 23:00:46 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2879198 00:32:01.512 23:00:46 compress_isal -- common/autotest_common.sh@953 -- # uname 00:32:01.512 23:00:46 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:01.512 23:00:46 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2879198 00:32:01.770 23:00:46 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:32:01.770 23:00:46 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:32:01.770 23:00:46 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2879198' 00:32:01.770 killing process with pid 2879198 00:32:01.771 23:00:46 compress_isal -- common/autotest_common.sh@967 -- # kill 2879198 00:32:01.771 Received shutdown signal, test time was about 3.000000 seconds 00:32:01.771 00:32:01.771 Latency(us) 00:32:01.771 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:01.771 =================================================================================================================== 00:32:01.771 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:01.771 23:00:46 compress_isal -- common/autotest_common.sh@972 -- # wait 2879198 00:32:05.056 23:00:49 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:32:05.056 23:00:49 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:32:05.056 23:00:49 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=2880802 00:32:05.056 23:00:49 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 
00:32:05.056 23:00:49 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:32:05.056 23:00:49 compress_isal -- compress/compress.sh@73 -- # waitforlisten 2880802 00:32:05.056 23:00:49 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2880802 ']' 00:32:05.056 23:00:49 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:05.056 23:00:49 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:05.056 23:00:49 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:05.056 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:05.056 23:00:49 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:05.056 23:00:49 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:32:05.056 [2024-07-15 23:00:49.539492] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:32:05.056 [2024-07-15 23:00:49.539566] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2880802 ] 00:32:05.056 [2024-07-15 23:00:49.678705] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:05.056 [2024-07-15 23:00:49.802942] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:05.056 [2024-07-15 23:00:49.802947] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:05.621 23:00:50 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:05.621 23:00:50 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:32:05.621 23:00:50 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:32:05.621 23:00:50 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:32:05.621 23:00:50 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:32:06.554 23:00:51 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:32:06.554 23:00:51 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:32:06.554 23:00:51 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:06.554 23:00:51 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:06.554 23:00:51 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:06.554 23:00:51 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:06.554 23:00:51 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:06.554 23:00:51 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:32:06.811 [ 00:32:06.811 { 00:32:06.811 "name": "Nvme0n1", 00:32:06.811 "aliases": [ 00:32:06.811 "01000000-0000-0000-5cd2-e43197705251" 00:32:06.811 ], 00:32:06.811 "product_name": "NVMe disk", 00:32:06.811 "block_size": 512, 00:32:06.811 "num_blocks": 15002931888, 00:32:06.811 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:32:06.811 "assigned_rate_limits": { 00:32:06.811 "rw_ios_per_sec": 0, 00:32:06.811 "rw_mbytes_per_sec": 0, 00:32:06.811 "r_mbytes_per_sec": 0, 00:32:06.811 "w_mbytes_per_sec": 0 00:32:06.811 }, 00:32:06.811 "claimed": false, 00:32:06.811 "zoned": false, 00:32:06.811 "supported_io_types": { 00:32:06.811 "read": true, 00:32:06.811 "write": true, 00:32:06.811 "unmap": true, 00:32:06.811 "flush": true, 00:32:06.811 "reset": true, 00:32:06.811 "nvme_admin": true, 00:32:06.811 "nvme_io": true, 00:32:06.811 "nvme_io_md": false, 00:32:06.811 "write_zeroes": true, 00:32:06.811 "zcopy": false, 00:32:06.811 "get_zone_info": false, 00:32:06.812 "zone_management": false, 00:32:06.812 "zone_append": false, 00:32:06.812 "compare": false, 00:32:06.812 "compare_and_write": false, 00:32:06.812 "abort": true, 00:32:06.812 "seek_hole": false, 00:32:06.812 "seek_data": false, 00:32:06.812 "copy": false, 00:32:06.812 "nvme_iov_md": false 00:32:06.812 }, 00:32:06.812 "driver_specific": { 00:32:06.812 "nvme": [ 00:32:06.812 { 00:32:06.812 "pci_address": "0000:5e:00.0", 00:32:06.812 "trid": { 00:32:06.812 "trtype": "PCIe", 00:32:06.812 "traddr": "0000:5e:00.0" 00:32:06.812 }, 00:32:06.812 "ctrlr_data": { 00:32:06.812 "cntlid": 0, 00:32:06.812 "vendor_id": "0x8086", 00:32:06.812 "model_number": "INTEL SSDPF2KX076TZO", 00:32:06.812 "serial_number": "PHAC0301002G7P6CGN", 00:32:06.812 "firmware_revision": "JCV10200", 00:32:06.812 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:32:06.812 "oacs": { 00:32:06.812 "security": 1, 00:32:06.812 "format": 1, 00:32:06.812 "firmware": 1, 00:32:06.812 "ns_manage": 1 00:32:06.812 }, 
00:32:06.812 "multi_ctrlr": false, 00:32:06.812 "ana_reporting": false 00:32:06.812 }, 00:32:06.812 "vs": { 00:32:06.812 "nvme_version": "1.3" 00:32:06.812 }, 00:32:06.812 "ns_data": { 00:32:06.812 "id": 1, 00:32:06.812 "can_share": false 00:32:06.812 }, 00:32:06.812 "security": { 00:32:06.812 "opal": true 00:32:06.812 } 00:32:06.812 } 00:32:06.812 ], 00:32:06.812 "mp_policy": "active_passive" 00:32:06.812 } 00:32:06.812 } 00:32:06.812 ] 00:32:06.812 23:00:51 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:06.812 23:00:51 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:32:09.343 f1910ac6-abcb-4286-ba81-d0a86dcf55e2 00:32:09.343 23:00:54 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:32:09.603 9db29302-ae68-4528-a8a1-5e7ab80bfe95 00:32:09.603 23:00:54 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:32:09.603 23:00:54 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:32:09.603 23:00:54 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:09.603 23:00:54 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:09.603 23:00:54 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:09.603 23:00:54 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:09.603 23:00:54 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:09.861 23:00:54 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:32:10.120 [ 00:32:10.120 { 00:32:10.120 "name": "9db29302-ae68-4528-a8a1-5e7ab80bfe95", 00:32:10.120 "aliases": [ 00:32:10.120 "lvs0/lv0" 
00:32:10.120 ], 00:32:10.120 "product_name": "Logical Volume", 00:32:10.120 "block_size": 512, 00:32:10.120 "num_blocks": 204800, 00:32:10.120 "uuid": "9db29302-ae68-4528-a8a1-5e7ab80bfe95", 00:32:10.120 "assigned_rate_limits": { 00:32:10.120 "rw_ios_per_sec": 0, 00:32:10.120 "rw_mbytes_per_sec": 0, 00:32:10.120 "r_mbytes_per_sec": 0, 00:32:10.120 "w_mbytes_per_sec": 0 00:32:10.120 }, 00:32:10.120 "claimed": false, 00:32:10.120 "zoned": false, 00:32:10.120 "supported_io_types": { 00:32:10.120 "read": true, 00:32:10.120 "write": true, 00:32:10.120 "unmap": true, 00:32:10.120 "flush": false, 00:32:10.120 "reset": true, 00:32:10.120 "nvme_admin": false, 00:32:10.120 "nvme_io": false, 00:32:10.120 "nvme_io_md": false, 00:32:10.120 "write_zeroes": true, 00:32:10.120 "zcopy": false, 00:32:10.120 "get_zone_info": false, 00:32:10.120 "zone_management": false, 00:32:10.120 "zone_append": false, 00:32:10.120 "compare": false, 00:32:10.120 "compare_and_write": false, 00:32:10.120 "abort": false, 00:32:10.120 "seek_hole": true, 00:32:10.120 "seek_data": true, 00:32:10.120 "copy": false, 00:32:10.120 "nvme_iov_md": false 00:32:10.120 }, 00:32:10.120 "driver_specific": { 00:32:10.120 "lvol": { 00:32:10.120 "lvol_store_uuid": "f1910ac6-abcb-4286-ba81-d0a86dcf55e2", 00:32:10.120 "base_bdev": "Nvme0n1", 00:32:10.120 "thin_provision": true, 00:32:10.120 "num_allocated_clusters": 0, 00:32:10.120 "snapshot": false, 00:32:10.120 "clone": false, 00:32:10.120 "esnap_clone": false 00:32:10.120 } 00:32:10.120 } 00:32:10.120 } 00:32:10.120 ] 00:32:10.120 23:00:54 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:10.120 23:00:54 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:32:10.121 23:00:54 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:32:10.379 [2024-07-15 23:00:55.059054] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: 
registered io_device and virtual bdev for: COMP_lvs0/lv0 00:32:10.379 COMP_lvs0/lv0 00:32:10.380 23:00:55 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:32:10.380 23:00:55 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:32:10.380 23:00:55 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:10.380 23:00:55 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:10.380 23:00:55 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:10.380 23:00:55 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:10.380 23:00:55 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:10.638 23:00:55 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:32:10.898 [ 00:32:10.898 { 00:32:10.898 "name": "COMP_lvs0/lv0", 00:32:10.898 "aliases": [ 00:32:10.898 "0802177f-e1eb-5560-ae5b-2827f13e640e" 00:32:10.898 ], 00:32:10.898 "product_name": "compress", 00:32:10.898 "block_size": 4096, 00:32:10.898 "num_blocks": 25088, 00:32:10.898 "uuid": "0802177f-e1eb-5560-ae5b-2827f13e640e", 00:32:10.898 "assigned_rate_limits": { 00:32:10.898 "rw_ios_per_sec": 0, 00:32:10.898 "rw_mbytes_per_sec": 0, 00:32:10.898 "r_mbytes_per_sec": 0, 00:32:10.898 "w_mbytes_per_sec": 0 00:32:10.898 }, 00:32:10.898 "claimed": false, 00:32:10.898 "zoned": false, 00:32:10.898 "supported_io_types": { 00:32:10.898 "read": true, 00:32:10.898 "write": true, 00:32:10.898 "unmap": false, 00:32:10.898 "flush": false, 00:32:10.898 "reset": false, 00:32:10.898 "nvme_admin": false, 00:32:10.898 "nvme_io": false, 00:32:10.898 "nvme_io_md": false, 00:32:10.898 "write_zeroes": true, 00:32:10.898 "zcopy": false, 00:32:10.898 "get_zone_info": false, 00:32:10.898 "zone_management": false, 00:32:10.898 
"zone_append": false, 00:32:10.898 "compare": false, 00:32:10.898 "compare_and_write": false, 00:32:10.898 "abort": false, 00:32:10.898 "seek_hole": false, 00:32:10.898 "seek_data": false, 00:32:10.898 "copy": false, 00:32:10.898 "nvme_iov_md": false 00:32:10.898 }, 00:32:10.898 "driver_specific": { 00:32:10.898 "compress": { 00:32:10.898 "name": "COMP_lvs0/lv0", 00:32:10.898 "base_bdev_name": "9db29302-ae68-4528-a8a1-5e7ab80bfe95" 00:32:10.898 } 00:32:10.898 } 00:32:10.898 } 00:32:10.898 ] 00:32:10.898 23:00:55 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:10.898 23:00:55 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:32:10.898 Running I/O for 3 seconds... 00:32:14.181 00:32:14.181 Latency(us) 00:32:14.181 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:14.181 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:32:14.181 Verification LBA range: start 0x0 length 0x3100 00:32:14.181 COMP_lvs0/lv0 : 3.01 2148.80 8.39 0.00 0.00 14810.68 1089.89 13335.15 00:32:14.181 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:32:14.181 Verification LBA range: start 0x3100 length 0x3100 00:32:14.181 COMP_lvs0/lv0 : 3.01 2146.20 8.38 0.00 0.00 14783.00 1396.20 12822.26 00:32:14.181 =================================================================================================================== 00:32:14.181 Total : 4295.00 16.78 0.00 0.00 14796.85 1089.89 13335.15 00:32:14.181 0 00:32:14.181 23:00:58 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:32:14.181 23:00:58 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:32:14.181 23:00:59 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_lvol_delete_lvstore -l lvs0 00:32:14.439 23:00:59 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:32:14.439 23:00:59 compress_isal -- compress/compress.sh@78 -- # killprocess 2880802 00:32:14.439 23:00:59 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2880802 ']' 00:32:14.439 23:00:59 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2880802 00:32:14.439 23:00:59 compress_isal -- common/autotest_common.sh@953 -- # uname 00:32:14.439 23:00:59 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:14.439 23:00:59 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2880802 00:32:14.439 23:00:59 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:32:14.439 23:00:59 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:32:14.439 23:00:59 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2880802' 00:32:14.439 killing process with pid 2880802 00:32:14.439 23:00:59 compress_isal -- common/autotest_common.sh@967 -- # kill 2880802 00:32:14.439 Received shutdown signal, test time was about 3.000000 seconds 00:32:14.439 00:32:14.439 Latency(us) 00:32:14.439 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:14.439 =================================================================================================================== 00:32:14.439 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:14.439 23:00:59 compress_isal -- common/autotest_common.sh@972 -- # wait 2880802 00:32:17.724 23:01:02 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:32:17.724 23:01:02 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:32:17.724 23:01:02 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=2882443 00:32:17.724 23:01:02 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 
00:32:17.724 23:01:02 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:32:17.724 23:01:02 compress_isal -- compress/compress.sh@57 -- # waitforlisten 2882443 00:32:17.724 23:01:02 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2882443 ']' 00:32:17.724 23:01:02 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:17.724 23:01:02 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:17.724 23:01:02 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:17.724 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:17.724 23:01:02 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:17.724 23:01:02 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:32:17.725 [2024-07-15 23:01:02.412877] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:32:17.725 [2024-07-15 23:01:02.412956] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2882443 ] 00:32:17.725 [2024-07-15 23:01:02.527247] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:32:17.984 [2024-07-15 23:01:02.645902] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:17.984 [2024-07-15 23:01:02.648970] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:17.984 [2024-07-15 23:01:02.648972] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:17.984 23:01:02 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:18.243 23:01:02 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:32:18.243 23:01:02 compress_isal -- compress/compress.sh@58 -- # create_vols 00:32:18.243 23:01:02 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:32:18.243 23:01:02 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:32:18.856 23:01:03 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:32:18.856 23:01:03 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:32:18.856 23:01:03 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:18.856 23:01:03 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:18.856 23:01:03 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:18.856 23:01:03 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:18.856 23:01:03 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:19.114 23:01:03 compress_isal -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:32:19.114 [ 00:32:19.114 { 00:32:19.114 "name": "Nvme0n1", 00:32:19.114 "aliases": [ 00:32:19.114 "01000000-0000-0000-5cd2-e43197705251" 00:32:19.114 ], 00:32:19.114 "product_name": "NVMe disk", 00:32:19.114 "block_size": 512, 00:32:19.114 "num_blocks": 15002931888, 00:32:19.114 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:32:19.114 "assigned_rate_limits": { 00:32:19.114 "rw_ios_per_sec": 0, 00:32:19.114 "rw_mbytes_per_sec": 0, 00:32:19.114 "r_mbytes_per_sec": 0, 00:32:19.114 "w_mbytes_per_sec": 0 00:32:19.114 }, 00:32:19.114 "claimed": false, 00:32:19.114 "zoned": false, 00:32:19.114 "supported_io_types": { 00:32:19.114 "read": true, 00:32:19.114 "write": true, 00:32:19.114 "unmap": true, 00:32:19.114 "flush": true, 00:32:19.114 "reset": true, 00:32:19.114 "nvme_admin": true, 00:32:19.114 "nvme_io": true, 00:32:19.114 "nvme_io_md": false, 00:32:19.114 "write_zeroes": true, 00:32:19.114 "zcopy": false, 00:32:19.114 "get_zone_info": false, 00:32:19.114 "zone_management": false, 00:32:19.114 "zone_append": false, 00:32:19.114 "compare": false, 00:32:19.114 "compare_and_write": false, 00:32:19.114 "abort": true, 00:32:19.114 "seek_hole": false, 00:32:19.114 "seek_data": false, 00:32:19.114 "copy": false, 00:32:19.114 "nvme_iov_md": false 00:32:19.114 }, 00:32:19.114 "driver_specific": { 00:32:19.114 "nvme": [ 00:32:19.114 { 00:32:19.114 "pci_address": "0000:5e:00.0", 00:32:19.114 "trid": { 00:32:19.114 "trtype": "PCIe", 00:32:19.114 "traddr": "0000:5e:00.0" 00:32:19.114 }, 00:32:19.114 "ctrlr_data": { 00:32:19.114 "cntlid": 0, 00:32:19.114 "vendor_id": "0x8086", 00:32:19.114 "model_number": "INTEL SSDPF2KX076TZO", 00:32:19.114 "serial_number": "PHAC0301002G7P6CGN", 00:32:19.114 "firmware_revision": "JCV10200", 00:32:19.114 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:32:19.114 "oacs": { 00:32:19.114 "security": 1, 
00:32:19.114 "format": 1, 00:32:19.114 "firmware": 1, 00:32:19.114 "ns_manage": 1 00:32:19.114 }, 00:32:19.114 "multi_ctrlr": false, 00:32:19.114 "ana_reporting": false 00:32:19.114 }, 00:32:19.114 "vs": { 00:32:19.114 "nvme_version": "1.3" 00:32:19.114 }, 00:32:19.114 "ns_data": { 00:32:19.114 "id": 1, 00:32:19.114 "can_share": false 00:32:19.114 }, 00:32:19.114 "security": { 00:32:19.114 "opal": true 00:32:19.114 } 00:32:19.114 } 00:32:19.114 ], 00:32:19.114 "mp_policy": "active_passive" 00:32:19.114 } 00:32:19.114 } 00:32:19.114 ] 00:32:19.371 23:01:04 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:19.371 23:01:04 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:32:21.898 d8aacd29-ea94-4ce4-8ad6-df857b9c35e7 00:32:21.898 23:01:06 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:32:21.898 b39ad4c2-728f-4883-a70d-83641e97c0d3 00:32:21.898 23:01:06 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:32:21.898 23:01:06 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:32:21.898 23:01:06 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:21.898 23:01:06 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:21.898 23:01:06 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:21.898 23:01:06 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:21.898 23:01:06 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:22.154 23:01:06 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:32:22.412 [ 00:32:22.412 { 00:32:22.412 
"name": "b39ad4c2-728f-4883-a70d-83641e97c0d3", 00:32:22.412 "aliases": [ 00:32:22.412 "lvs0/lv0" 00:32:22.412 ], 00:32:22.412 "product_name": "Logical Volume", 00:32:22.412 "block_size": 512, 00:32:22.412 "num_blocks": 204800, 00:32:22.412 "uuid": "b39ad4c2-728f-4883-a70d-83641e97c0d3", 00:32:22.412 "assigned_rate_limits": { 00:32:22.412 "rw_ios_per_sec": 0, 00:32:22.412 "rw_mbytes_per_sec": 0, 00:32:22.412 "r_mbytes_per_sec": 0, 00:32:22.412 "w_mbytes_per_sec": 0 00:32:22.412 }, 00:32:22.412 "claimed": false, 00:32:22.412 "zoned": false, 00:32:22.412 "supported_io_types": { 00:32:22.412 "read": true, 00:32:22.412 "write": true, 00:32:22.412 "unmap": true, 00:32:22.412 "flush": false, 00:32:22.412 "reset": true, 00:32:22.412 "nvme_admin": false, 00:32:22.412 "nvme_io": false, 00:32:22.412 "nvme_io_md": false, 00:32:22.412 "write_zeroes": true, 00:32:22.412 "zcopy": false, 00:32:22.412 "get_zone_info": false, 00:32:22.412 "zone_management": false, 00:32:22.412 "zone_append": false, 00:32:22.412 "compare": false, 00:32:22.412 "compare_and_write": false, 00:32:22.412 "abort": false, 00:32:22.412 "seek_hole": true, 00:32:22.412 "seek_data": true, 00:32:22.412 "copy": false, 00:32:22.412 "nvme_iov_md": false 00:32:22.412 }, 00:32:22.412 "driver_specific": { 00:32:22.412 "lvol": { 00:32:22.412 "lvol_store_uuid": "d8aacd29-ea94-4ce4-8ad6-df857b9c35e7", 00:32:22.412 "base_bdev": "Nvme0n1", 00:32:22.412 "thin_provision": true, 00:32:22.412 "num_allocated_clusters": 0, 00:32:22.412 "snapshot": false, 00:32:22.412 "clone": false, 00:32:22.412 "esnap_clone": false 00:32:22.412 } 00:32:22.412 } 00:32:22.412 } 00:32:22.412 ] 00:32:22.412 23:01:07 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:22.412 23:01:07 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:32:22.412 23:01:07 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:32:22.670 
[2024-07-15 23:01:07.469761] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:32:22.670 COMP_lvs0/lv0 00:32:22.670 23:01:07 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:32:22.670 23:01:07 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:32:22.670 23:01:07 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:22.670 23:01:07 compress_isal -- common/autotest_common.sh@899 -- # local i 00:32:22.670 23:01:07 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:22.670 23:01:07 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:22.670 23:01:07 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:22.928 23:01:07 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:32:23.186 [ 00:32:23.186 { 00:32:23.186 "name": "COMP_lvs0/lv0", 00:32:23.186 "aliases": [ 00:32:23.186 "145c2c47-bbf9-5d95-9664-c85096bd0454" 00:32:23.186 ], 00:32:23.186 "product_name": "compress", 00:32:23.186 "block_size": 512, 00:32:23.186 "num_blocks": 200704, 00:32:23.186 "uuid": "145c2c47-bbf9-5d95-9664-c85096bd0454", 00:32:23.186 "assigned_rate_limits": { 00:32:23.186 "rw_ios_per_sec": 0, 00:32:23.186 "rw_mbytes_per_sec": 0, 00:32:23.186 "r_mbytes_per_sec": 0, 00:32:23.186 "w_mbytes_per_sec": 0 00:32:23.186 }, 00:32:23.186 "claimed": false, 00:32:23.186 "zoned": false, 00:32:23.186 "supported_io_types": { 00:32:23.186 "read": true, 00:32:23.186 "write": true, 00:32:23.186 "unmap": false, 00:32:23.187 "flush": false, 00:32:23.187 "reset": false, 00:32:23.187 "nvme_admin": false, 00:32:23.187 "nvme_io": false, 00:32:23.187 "nvme_io_md": false, 00:32:23.187 "write_zeroes": true, 00:32:23.187 "zcopy": false, 00:32:23.187 
"get_zone_info": false, 00:32:23.187 "zone_management": false, 00:32:23.187 "zone_append": false, 00:32:23.187 "compare": false, 00:32:23.187 "compare_and_write": false, 00:32:23.187 "abort": false, 00:32:23.187 "seek_hole": false, 00:32:23.187 "seek_data": false, 00:32:23.187 "copy": false, 00:32:23.187 "nvme_iov_md": false 00:32:23.187 }, 00:32:23.187 "driver_specific": { 00:32:23.187 "compress": { 00:32:23.187 "name": "COMP_lvs0/lv0", 00:32:23.187 "base_bdev_name": "b39ad4c2-728f-4883-a70d-83641e97c0d3" 00:32:23.187 } 00:32:23.187 } 00:32:23.187 } 00:32:23.187 ] 00:32:23.187 23:01:08 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:32:23.187 23:01:08 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:32:23.445 I/O targets: 00:32:23.445 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:32:23.445 00:32:23.445 00:32:23.445 CUnit - A unit testing framework for C - Version 2.1-3 00:32:23.445 http://cunit.sourceforge.net/ 00:32:23.445 00:32:23.445 00:32:23.445 Suite: bdevio tests on: COMP_lvs0/lv0 00:32:23.445 Test: blockdev write read block ...passed 00:32:23.445 Test: blockdev write zeroes read block ...passed 00:32:23.445 Test: blockdev write zeroes read no split ...passed 00:32:23.445 Test: blockdev write zeroes read split ...passed 00:32:23.445 Test: blockdev write zeroes read split partial ...passed 00:32:23.445 Test: blockdev reset ...[2024-07-15 23:01:08.249578] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:32:23.445 passed 00:32:23.445 Test: blockdev write read 8 blocks ...passed 00:32:23.445 Test: blockdev write read size > 128k ...passed 00:32:23.445 Test: blockdev write read invalid size ...passed 00:32:23.445 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:23.445 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:23.445 Test: blockdev write read max offset 
...passed 00:32:23.445 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:23.445 Test: blockdev writev readv 8 blocks ...passed 00:32:23.445 Test: blockdev writev readv 30 x 1block ...passed 00:32:23.445 Test: blockdev writev readv block ...passed 00:32:23.445 Test: blockdev writev readv size > 128k ...passed 00:32:23.445 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:23.445 Test: blockdev comparev and writev ...passed 00:32:23.445 Test: blockdev nvme passthru rw ...passed 00:32:23.445 Test: blockdev nvme passthru vendor specific ...passed 00:32:23.445 Test: blockdev nvme admin passthru ...passed 00:32:23.445 Test: blockdev copy ...passed 00:32:23.445 00:32:23.445 Run Summary: Type Total Ran Passed Failed Inactive 00:32:23.445 suites 1 1 n/a 0 0 00:32:23.445 tests 23 23 23 0 0 00:32:23.445 asserts 130 130 130 0 n/a 00:32:23.445 00:32:23.445 Elapsed time = 0.290 seconds 00:32:23.445 0 00:32:23.445 23:01:08 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:32:23.445 23:01:08 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:32:23.707 23:01:08 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:32:23.967 23:01:08 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:32:23.967 23:01:08 compress_isal -- compress/compress.sh@62 -- # killprocess 2882443 00:32:23.967 23:01:08 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2882443 ']' 00:32:23.967 23:01:08 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2882443 00:32:23.967 23:01:08 compress_isal -- common/autotest_common.sh@953 -- # uname 00:32:23.967 23:01:08 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:23.967 23:01:08 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 
2882443 00:32:23.967 23:01:08 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:23.967 23:01:08 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:23.967 23:01:08 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2882443' 00:32:23.967 killing process with pid 2882443 00:32:23.967 23:01:08 compress_isal -- common/autotest_common.sh@967 -- # kill 2882443 00:32:23.967 23:01:08 compress_isal -- common/autotest_common.sh@972 -- # wait 2882443 00:32:27.246 23:01:11 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:32:27.246 23:01:11 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:32:27.246 00:32:27.246 real 0m48.582s 00:32:27.246 user 1m53.424s 00:32:27.246 sys 0m4.556s 00:32:27.246 23:01:11 compress_isal -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:27.246 23:01:11 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:32:27.246 ************************************ 00:32:27.246 END TEST compress_isal 00:32:27.246 ************************************ 00:32:27.246 23:01:11 -- common/autotest_common.sh@1142 -- # return 0 00:32:27.246 23:01:11 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:32:27.246 23:01:11 -- spdk/autotest.sh@356 -- # '[' 1 -eq 1 ']' 00:32:27.246 23:01:11 -- spdk/autotest.sh@357 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:32:27.246 23:01:11 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:27.246 23:01:11 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:27.246 23:01:11 -- common/autotest_common.sh@10 -- # set +x 00:32:27.246 ************************************ 00:32:27.246 START TEST blockdev_crypto_aesni 00:32:27.246 ************************************ 00:32:27.246 23:01:11 blockdev_crypto_aesni -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 
crypto_aesni 00:32:27.246 * Looking for test storage... 00:32:27.246 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # uname -s 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # test_type=crypto_aesni 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # crypto_device= 00:32:27.246 23:01:12 blockdev_crypto_aesni -- 
bdev/blockdev.sh@684 -- # dek= 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # env_ctx= 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == bdev ]] 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == crypto_* ]] 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2883734 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:32:27.246 23:01:12 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 2883734 00:32:27.246 23:01:12 blockdev_crypto_aesni -- common/autotest_common.sh@829 -- # '[' -z 2883734 ']' 00:32:27.246 23:01:12 blockdev_crypto_aesni -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:27.246 23:01:12 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:27.246 23:01:12 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:27.246 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:32:27.246 23:01:12 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:27.246 23:01:12 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:27.246 [2024-07-15 23:01:12.133709] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:32:27.246 [2024-07-15 23:01:12.133786] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2883734 ] 00:32:27.507 [2024-07-15 23:01:12.256696] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:27.507 [2024-07-15 23:01:12.357247] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:28.445 23:01:13 blockdev_crypto_aesni -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:28.445 23:01:13 blockdev_crypto_aesni -- common/autotest_common.sh@862 -- # return 0 00:32:28.445 23:01:13 blockdev_crypto_aesni -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:32:28.445 23:01:13 blockdev_crypto_aesni -- bdev/blockdev.sh@705 -- # setup_crypto_aesni_conf 00:32:28.445 23:01:13 blockdev_crypto_aesni -- bdev/blockdev.sh@146 -- # rpc_cmd 00:32:28.445 23:01:13 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:28.445 23:01:13 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:28.445 [2024-07-15 23:01:13.071511] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:28.445 [2024-07-15 23:01:13.079544] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:28.446 [2024-07-15 23:01:13.087560] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:28.446 [2024-07-15 23:01:13.151273] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: 
*NOTICE*: Found crypto devices: 97 00:32:30.984 true 00:32:30.984 true 00:32:30.984 true 00:32:30.984 true 00:32:30.984 Malloc0 00:32:30.984 Malloc1 00:32:30.984 Malloc2 00:32:30.984 Malloc3 00:32:30.984 [2024-07-15 23:01:15.539428] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:30.984 crypto_ram 00:32:30.984 [2024-07-15 23:01:15.547444] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:30.984 crypto_ram2 00:32:30.984 [2024-07-15 23:01:15.555466] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:30.984 crypto_ram3 00:32:30.984 [2024-07-15 23:01:15.563488] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:30.984 crypto_ram4 00:32:30.984 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:30.984 23:01:15 blockdev_crypto_aesni -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:32:30.984 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:30.984 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:30.984 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:30.984 23:01:15 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # cat 00:32:30.984 23:01:15 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:32:30.984 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:30.984 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:30.984 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:30.984 23:01:15 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:32:30.984 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:30.984 23:01:15 
blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:30.984 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:30.984 23:01:15 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:32:30.984 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:30.984 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:30.984 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:30.984 23:01:15 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:32:30.984 23:01:15 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:32:30.984 23:01:15 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:32:30.984 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:30.984 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:30.984 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:30.984 23:01:15 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:32:30.985 23:01:15 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "b33bc2c9-3aed-54ab-813f-eebfea49634e"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b33bc2c9-3aed-54ab-813f-eebfea49634e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' 
"zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "3db66c88-7b48-5a5c-b149-c7962ef5c284"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "3db66c88-7b48-5a5c-b149-c7962ef5c284",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "f7ba4b29-efff-558f-8a08-c582d752cf19"' ' ],' ' "product_name": "crypto",' ' 
"block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "f7ba4b29-efff-558f-8a08-c582d752cf19",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' 23:01:15 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # jq -r .name 00:32:30.985 ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "ad7a0f55-e450-5d92-86a7-9611fbf28bf8"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ad7a0f55-e450-5d92-86a7-9611fbf28bf8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' 
"abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:32:30.985 23:01:15 blockdev_crypto_aesni -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:32:30.985 23:01:15 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:32:30.985 23:01:15 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:32:30.985 23:01:15 blockdev_crypto_aesni -- bdev/blockdev.sh@754 -- # killprocess 2883734 00:32:30.985 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@948 -- # '[' -z 2883734 ']' 00:32:30.985 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # kill -0 2883734 00:32:30.985 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # uname 00:32:30.985 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:30.985 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2883734 00:32:31.242 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:31.242 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:31.242 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2883734' 00:32:31.242 killing process with pid 2883734 00:32:31.242 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@967 -- # kill 2883734 00:32:31.242 23:01:15 blockdev_crypto_aesni -- common/autotest_common.sh@972 -- # wait 2883734 00:32:31.808 
23:01:16 blockdev_crypto_aesni -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:32:31.808 23:01:16 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:32:31.808 23:01:16 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:32:31.808 23:01:16 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:31.808 23:01:16 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:31.808 ************************************ 00:32:31.808 START TEST bdev_hello_world 00:32:31.808 ************************************ 00:32:31.808 23:01:16 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:32:31.808 [2024-07-15 23:01:16.590358] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:32:31.808 [2024-07-15 23:01:16.590421] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2884406 ] 00:32:32.066 [2024-07-15 23:01:16.719300] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:32.066 [2024-07-15 23:01:16.824181] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:32.066 [2024-07-15 23:01:16.845411] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:32.066 [2024-07-15 23:01:16.853439] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:32.066 [2024-07-15 23:01:16.861464] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:32.066 [2024-07-15 23:01:16.973086] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:34.598 [2024-07-15 23:01:19.213323] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:34.598 [2024-07-15 23:01:19.213394] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:34.598 [2024-07-15 23:01:19.213410] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:34.598 [2024-07-15 23:01:19.221341] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:34.598 [2024-07-15 23:01:19.221363] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:34.598 [2024-07-15 23:01:19.221380] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:34.598 [2024-07-15 23:01:19.229362] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found 
key "test_dek_aesni_cbc_3" 00:32:34.598 [2024-07-15 23:01:19.229382] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:34.598 [2024-07-15 23:01:19.229394] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:34.598 [2024-07-15 23:01:19.237381] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:34.598 [2024-07-15 23:01:19.237401] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:34.598 [2024-07-15 23:01:19.237412] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:34.598 [2024-07-15 23:01:19.315153] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:32:34.598 [2024-07-15 23:01:19.315197] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:32:34.598 [2024-07-15 23:01:19.315217] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:32:34.598 [2024-07-15 23:01:19.316488] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:32:34.598 [2024-07-15 23:01:19.316561] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:32:34.598 [2024-07-15 23:01:19.316578] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:32:34.599 [2024-07-15 23:01:19.316624] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:32:34.599 00:32:34.599 [2024-07-15 23:01:19.316645] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:32:34.857 00:32:34.857 real 0m3.170s 00:32:34.857 user 0m2.745s 00:32:34.857 sys 0m0.383s 00:32:34.857 23:01:19 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:34.857 23:01:19 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:32:34.857 ************************************ 00:32:34.857 END TEST bdev_hello_world 00:32:34.857 ************************************ 00:32:34.857 23:01:19 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:32:34.857 23:01:19 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:32:34.857 23:01:19 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:34.857 23:01:19 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:34.857 23:01:19 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:35.115 ************************************ 00:32:35.115 START TEST bdev_bounds 00:32:35.115 ************************************ 00:32:35.115 23:01:19 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:32:35.115 23:01:19 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2884780 00:32:35.115 23:01:19 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:32:35.115 23:01:19 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:35.115 23:01:19 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2884780' 00:32:35.115 Process bdevio pid: 2884780 00:32:35.115 23:01:19 
blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 2884780 00:32:35.115 23:01:19 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2884780 ']' 00:32:35.115 23:01:19 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:35.115 23:01:19 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:35.115 23:01:19 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:35.115 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:35.115 23:01:19 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:35.115 23:01:19 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:35.115 [2024-07-15 23:01:19.848562] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:32:35.115 [2024-07-15 23:01:19.848627] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2884780 ] 00:32:35.115 [2024-07-15 23:01:19.975516] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:32:35.372 [2024-07-15 23:01:20.087265] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:35.372 [2024-07-15 23:01:20.087366] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:35.372 [2024-07-15 23:01:20.087367] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:35.372 [2024-07-15 23:01:20.108796] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:35.372 [2024-07-15 23:01:20.116822] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:35.372 [2024-07-15 23:01:20.124840] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:35.372 [2024-07-15 23:01:20.234206] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:37.901 [2024-07-15 23:01:22.443765] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:37.901 [2024-07-15 23:01:22.443845] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:37.901 [2024-07-15 23:01:22.443860] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:37.901 [2024-07-15 23:01:22.451783] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:37.901 [2024-07-15 23:01:22.451804] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:37.901 [2024-07-15 
23:01:22.451816] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:37.901 [2024-07-15 23:01:22.459805] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:37.901 [2024-07-15 23:01:22.459827] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:37.901 [2024-07-15 23:01:22.459838] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:37.901 [2024-07-15 23:01:22.467832] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:37.901 [2024-07-15 23:01:22.467851] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:37.901 [2024-07-15 23:01:22.467862] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:37.902 23:01:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:37.902 23:01:22 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:32:37.902 23:01:22 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:32:37.902 I/O targets: 00:32:37.902 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:32:37.902 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:32:37.902 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:32:37.902 crypto_ram4: 8192 blocks of 4096 bytes (32 MiB) 00:32:37.902 00:32:37.902 00:32:37.902 CUnit - A unit testing framework for C - Version 2.1-3 00:32:37.902 http://cunit.sourceforge.net/ 00:32:37.902 00:32:37.902 00:32:37.902 Suite: bdevio tests on: crypto_ram4 00:32:37.902 Test: blockdev write read block ...passed 00:32:37.902 Test: blockdev write zeroes read block ...passed 00:32:37.902 Test: blockdev write zeroes read no split ...passed 00:32:37.902 Test: blockdev 
write zeroes read split ...passed 00:32:37.902 Test: blockdev write zeroes read split partial ...passed 00:32:37.902 Test: blockdev reset ...passed 00:32:37.902 Test: blockdev write read 8 blocks ...passed 00:32:37.902 Test: blockdev write read size > 128k ...passed 00:32:37.902 Test: blockdev write read invalid size ...passed 00:32:37.902 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:37.902 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:37.902 Test: blockdev write read max offset ...passed 00:32:37.902 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:37.902 Test: blockdev writev readv 8 blocks ...passed 00:32:37.902 Test: blockdev writev readv 30 x 1block ...passed 00:32:37.902 Test: blockdev writev readv block ...passed 00:32:37.902 Test: blockdev writev readv size > 128k ...passed 00:32:37.902 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:37.902 Test: blockdev comparev and writev ...passed 00:32:37.902 Test: blockdev nvme passthru rw ...passed 00:32:37.902 Test: blockdev nvme passthru vendor specific ...passed 00:32:37.902 Test: blockdev nvme admin passthru ...passed 00:32:37.902 Test: blockdev copy ...passed 00:32:37.902 Suite: bdevio tests on: crypto_ram3 00:32:37.902 Test: blockdev write read block ...passed 00:32:37.902 Test: blockdev write zeroes read block ...passed 00:32:37.902 Test: blockdev write zeroes read no split ...passed 00:32:37.902 Test: blockdev write zeroes read split ...passed 00:32:38.161 Test: blockdev write zeroes read split partial ...passed 00:32:38.161 Test: blockdev reset ...passed 00:32:38.161 Test: blockdev write read 8 blocks ...passed 00:32:38.161 Test: blockdev write read size > 128k ...passed 00:32:38.161 Test: blockdev write read invalid size ...passed 00:32:38.161 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:38.161 Test: blockdev write read offset + nbytes > size of blockdev 
...passed 00:32:38.161 Test: blockdev write read max offset ...passed 00:32:38.161 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:38.161 Test: blockdev writev readv 8 blocks ...passed 00:32:38.161 Test: blockdev writev readv 30 x 1block ...passed 00:32:38.161 Test: blockdev writev readv block ...passed 00:32:38.161 Test: blockdev writev readv size > 128k ...passed 00:32:38.161 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:38.161 Test: blockdev comparev and writev ...passed 00:32:38.161 Test: blockdev nvme passthru rw ...passed 00:32:38.161 Test: blockdev nvme passthru vendor specific ...passed 00:32:38.161 Test: blockdev nvme admin passthru ...passed 00:32:38.161 Test: blockdev copy ...passed 00:32:38.161 Suite: bdevio tests on: crypto_ram2 00:32:38.161 Test: blockdev write read block ...passed 00:32:38.161 Test: blockdev write zeroes read block ...passed 00:32:38.161 Test: blockdev write zeroes read no split ...passed 00:32:38.161 Test: blockdev write zeroes read split ...passed 00:32:38.419 Test: blockdev write zeroes read split partial ...passed 00:32:38.419 Test: blockdev reset ...passed 00:32:38.419 Test: blockdev write read 8 blocks ...passed 00:32:38.419 Test: blockdev write read size > 128k ...passed 00:32:38.419 Test: blockdev write read invalid size ...passed 00:32:38.419 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:38.419 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:38.419 Test: blockdev write read max offset ...passed 00:32:38.419 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:38.419 Test: blockdev writev readv 8 blocks ...passed 00:32:38.419 Test: blockdev writev readv 30 x 1block ...passed 00:32:38.419 Test: blockdev writev readv block ...passed 00:32:38.419 Test: blockdev writev readv size > 128k ...passed 00:32:38.419 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:38.419 Test: 
blockdev comparev and writev ...passed 00:32:38.419 Test: blockdev nvme passthru rw ...passed 00:32:38.419 Test: blockdev nvme passthru vendor specific ...passed 00:32:38.419 Test: blockdev nvme admin passthru ...passed 00:32:38.419 Test: blockdev copy ...passed 00:32:38.419 Suite: bdevio tests on: crypto_ram 00:32:38.419 Test: blockdev write read block ...passed 00:32:38.419 Test: blockdev write zeroes read block ...passed 00:32:38.419 Test: blockdev write zeroes read no split ...passed 00:32:38.677 Test: blockdev write zeroes read split ...passed 00:32:38.677 Test: blockdev write zeroes read split partial ...passed 00:32:38.677 Test: blockdev reset ...passed 00:32:38.677 Test: blockdev write read 8 blocks ...passed 00:32:38.677 Test: blockdev write read size > 128k ...passed 00:32:38.677 Test: blockdev write read invalid size ...passed 00:32:38.677 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:38.677 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:38.677 Test: blockdev write read max offset ...passed 00:32:38.677 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:38.677 Test: blockdev writev readv 8 blocks ...passed 00:32:38.677 Test: blockdev writev readv 30 x 1block ...passed 00:32:38.677 Test: blockdev writev readv block ...passed 00:32:38.677 Test: blockdev writev readv size > 128k ...passed 00:32:38.677 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:38.677 Test: blockdev comparev and writev ...passed 00:32:38.677 Test: blockdev nvme passthru rw ...passed 00:32:38.677 Test: blockdev nvme passthru vendor specific ...passed 00:32:38.677 Test: blockdev nvme admin passthru ...passed 00:32:38.677 Test: blockdev copy ...passed 00:32:38.677 00:32:38.677 Run Summary: Type Total Ran Passed Failed Inactive 00:32:38.677 suites 4 4 n/a 0 0 00:32:38.677 tests 92 92 92 0 0 00:32:38.677 asserts 520 520 520 0 n/a 00:32:38.677 00:32:38.677 Elapsed time = 1.641 
seconds 00:32:38.677 0 00:32:38.677 23:01:23 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2884780 00:32:38.677 23:01:23 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2884780 ']' 00:32:38.677 23:01:23 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2884780 00:32:38.677 23:01:23 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:32:38.677 23:01:23 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:38.677 23:01:23 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2884780 00:32:38.677 23:01:23 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:38.677 23:01:23 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:38.677 23:01:23 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2884780' 00:32:38.677 killing process with pid 2884780 00:32:38.677 23:01:23 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2884780 00:32:38.677 23:01:23 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@972 -- # wait 2884780 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:32:39.243 00:32:39.243 real 0m4.224s 00:32:39.243 user 0m11.270s 00:32:39.243 sys 0m0.564s 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:39.243 ************************************ 00:32:39.243 END TEST bdev_bounds 00:32:39.243 ************************************ 00:32:39.243 23:01:24 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:32:39.243 23:01:24 
blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:32:39.243 23:01:24 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:32:39.243 23:01:24 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:39.243 23:01:24 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:39.243 ************************************ 00:32:39.243 START TEST bdev_nbd 00:32:39.243 ************************************ 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' 
'/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2885342 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 2885342 /var/tmp/spdk-nbd.sock 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2885342 ']' 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:32:39.243 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:39.243 23:01:24 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:39.502 [2024-07-15 23:01:24.165105] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:32:39.502 [2024-07-15 23:01:24.165170] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:32:39.502 [2024-07-15 23:01:24.292512] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:39.502 [2024-07-15 23:01:24.396523] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:39.761 [2024-07-15 23:01:24.417791] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:39.761 [2024-07-15 23:01:24.425812] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:39.761 [2024-07-15 23:01:24.433831] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:39.761 [2024-07-15 23:01:24.535564] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:42.368 [2024-07-15 23:01:26.760825] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:42.368 [2024-07-15 23:01:26.760900] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:42.368 [2024-07-15 23:01:26.760915] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:42.368 [2024-07-15 23:01:26.768830] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: 
Found key "test_dek_aesni_cbc_2" 00:32:42.368 [2024-07-15 23:01:26.768850] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:42.368 [2024-07-15 23:01:26.768862] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:42.368 [2024-07-15 23:01:26.776850] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:42.368 [2024-07-15 23:01:26.776869] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:42.368 [2024-07-15 23:01:26.776880] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:42.368 [2024-07-15 23:01:26.784871] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:42.368 [2024-07-15 23:01:26.784894] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:42.368 [2024-07-15 23:01:26.784905] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:42.368 23:01:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:42.368 23:01:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:32:42.368 23:01:26 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:32:42.368 23:01:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:42.368 23:01:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:42.368 23:01:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:32:42.368 23:01:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:32:42.368 23:01:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:42.368 23:01:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:42.368 23:01:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:32:42.368 23:01:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:32:42.368 23:01:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:32:42.368 23:01:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:32:42.368 23:01:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:42.368 23:01:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:32:42.368 23:01:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:32:42.368 23:01:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:32:42.368 23:01:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:32:42.368 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:32:42.368 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:42.368 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:42.368 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:42.368 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:32:42.368 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:42.368 23:01:27 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:42.368 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:42.368 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:42.368 1+0 records in 00:32:42.368 1+0 records out 00:32:42.368 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000283435 s, 14.5 MB/s 00:32:42.368 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:42.368 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:42.368 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:42.368 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:42.368 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:42.368 23:01:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:42.368 23:01:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:42.368 23:01:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:32:42.627 23:01:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:32:42.627 23:01:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:32:42.627 23:01:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:32:42.627 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:32:42.627 23:01:27 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:42.627 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:42.627 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:42.627 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:32:42.627 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:42.627 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:42.627 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:42.627 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:42.627 1+0 records in 00:32:42.627 1+0 records out 00:32:42.627 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000337934 s, 12.1 MB/s 00:32:42.627 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:42.627 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:42.627 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:42.627 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:42.627 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:42.627 23:01:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:42.627 23:01:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:42.627 23:01:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:32:42.886 23:01:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:32:42.886 23:01:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:32:42.886 23:01:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:32:42.886 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:32:42.886 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:42.886 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:42.886 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:42.886 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:32:42.886 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:42.886 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:42.886 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:42.886 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:42.886 1+0 records in 00:32:42.886 1+0 records out 00:32:42.886 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000323141 s, 12.7 MB/s 00:32:42.886 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:42.886 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:42.886 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:42.886 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:42.886 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:42.886 23:01:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:42.886 23:01:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:42.886 23:01:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:32:43.145 23:01:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:32:43.146 23:01:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:32:43.146 23:01:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:32:43.146 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:32:43.146 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:43.146 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:43.146 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:43.146 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:32:43.146 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:43.146 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:43.146 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:43.146 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 
count=1 iflag=direct 00:32:43.146 1+0 records in 00:32:43.146 1+0 records out 00:32:43.146 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000334257 s, 12.3 MB/s 00:32:43.146 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:43.146 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:43.146 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:43.146 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:43.146 23:01:27 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:43.146 23:01:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:43.146 23:01:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:43.146 23:01:27 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:43.404 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:32:43.404 { 00:32:43.404 "nbd_device": "/dev/nbd0", 00:32:43.404 "bdev_name": "crypto_ram" 00:32:43.404 }, 00:32:43.404 { 00:32:43.404 "nbd_device": "/dev/nbd1", 00:32:43.404 "bdev_name": "crypto_ram2" 00:32:43.404 }, 00:32:43.404 { 00:32:43.404 "nbd_device": "/dev/nbd2", 00:32:43.404 "bdev_name": "crypto_ram3" 00:32:43.404 }, 00:32:43.404 { 00:32:43.404 "nbd_device": "/dev/nbd3", 00:32:43.404 "bdev_name": "crypto_ram4" 00:32:43.404 } 00:32:43.404 ]' 00:32:43.404 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:32:43.404 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:32:43.404 { 
00:32:43.404 "nbd_device": "/dev/nbd0", 00:32:43.404 "bdev_name": "crypto_ram" 00:32:43.404 }, 00:32:43.404 { 00:32:43.404 "nbd_device": "/dev/nbd1", 00:32:43.404 "bdev_name": "crypto_ram2" 00:32:43.404 }, 00:32:43.404 { 00:32:43.404 "nbd_device": "/dev/nbd2", 00:32:43.404 "bdev_name": "crypto_ram3" 00:32:43.404 }, 00:32:43.404 { 00:32:43.404 "nbd_device": "/dev/nbd3", 00:32:43.404 "bdev_name": "crypto_ram4" 00:32:43.404 } 00:32:43.404 ]' 00:32:43.404 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:32:43.404 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:32:43.404 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:43.404 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:32:43.404 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:43.404 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:43.404 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:43.404 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:43.678 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:43.678 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:43.678 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:43.678 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:43.678 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:43.678 23:01:28 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:43.678 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:43.678 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:43.678 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:43.678 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:43.936 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:43.936 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:43.936 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:43.936 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:43.936 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:43.936 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:43.936 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:43.936 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:43.936 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:43.936 23:01:28 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:32:44.194 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:32:44.194 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:32:44.194 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 
00:32:44.194 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:44.194 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:44.194 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:32:44.194 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:44.194 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:44.194 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:44.194 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:32:44.452 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:32:44.452 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:32:44.452 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:32:44.452 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:44.452 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:44.452 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:32:44.452 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:44.452 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:44.452 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:44.452 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:44.452 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_get_disks 00:32:44.711 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:44.711 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:44.711 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:44.711 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:44.969 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:44.969 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:44.969 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:44.969 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:44.969 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:44.969 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:32:44.969 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:32:44.969 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:32:44.969 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:44.969 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:44.969 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:44.969 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:32:44.969 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:44.969 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 
-- # local nbd_list 00:32:44.969 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:44.969 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:44.969 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:44.969 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:32:44.969 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:44.969 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:32:44.969 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:32:44.969 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:32:44.969 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:44.969 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:32:44.969 /dev/nbd0 00:32:45.229 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:32:45.229 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:32:45.229 23:01:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:32:45.229 23:01:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:45.229 23:01:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:45.229 23:01:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:45.229 
23:01:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:32:45.229 23:01:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:45.229 23:01:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:45.229 23:01:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:45.229 23:01:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:45.229 1+0 records in 00:32:45.229 1+0 records out 00:32:45.229 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000227412 s, 18.0 MB/s 00:32:45.229 23:01:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:45.229 23:01:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:45.229 23:01:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:45.229 23:01:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:45.229 23:01:29 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:45.229 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:45.229 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:45.229 23:01:29 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:32:45.488 /dev/nbd1 00:32:45.488 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:32:45.488 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- 
# waitfornbd nbd1 00:32:45.488 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:32:45.488 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:45.488 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:45.488 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:45.488 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:32:45.488 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:45.488 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:45.488 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:45.488 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:45.488 1+0 records in 00:32:45.488 1+0 records out 00:32:45.488 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000337437 s, 12.1 MB/s 00:32:45.488 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:45.488 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:45.488 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:45.488 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:45.488 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:45.488 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:45.488 23:01:30 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:45.488 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:32:45.747 /dev/nbd10 00:32:45.747 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:32:45.747 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:32:45.747 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:32:45.747 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:45.747 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:45.747 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:45.747 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:32:45.747 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:45.747 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:45.747 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:45.747 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:45.747 1+0 records in 00:32:45.747 1+0 records out 00:32:45.747 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000354703 s, 11.5 MB/s 00:32:45.747 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:45.747 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:45.747 23:01:30 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:45.747 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:45.747 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:45.747 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:45.747 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:45.747 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:32:46.006 /dev/nbd11 00:32:46.006 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:32:46.006 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:32:46.006 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:32:46.007 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:46.007 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:46.007 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:46.007 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:32:46.007 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:46.007 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:46.007 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:46.007 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 
00:32:46.007 1+0 records in 00:32:46.007 1+0 records out 00:32:46.007 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000362371 s, 11.3 MB/s 00:32:46.007 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:46.007 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:46.007 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:46.007 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:46.007 23:01:30 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:46.007 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:46.007 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:46.007 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:46.007 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:46.007 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:46.007 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:32:46.007 { 00:32:46.007 "nbd_device": "/dev/nbd0", 00:32:46.007 "bdev_name": "crypto_ram" 00:32:46.007 }, 00:32:46.007 { 00:32:46.007 "nbd_device": "/dev/nbd1", 00:32:46.007 "bdev_name": "crypto_ram2" 00:32:46.007 }, 00:32:46.007 { 00:32:46.007 "nbd_device": "/dev/nbd10", 00:32:46.007 "bdev_name": "crypto_ram3" 00:32:46.007 }, 00:32:46.007 { 00:32:46.007 "nbd_device": "/dev/nbd11", 00:32:46.007 "bdev_name": "crypto_ram4" 00:32:46.007 } 00:32:46.007 ]' 00:32:46.007 23:01:30 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:32:46.007 { 00:32:46.007 "nbd_device": "/dev/nbd0", 00:32:46.007 "bdev_name": "crypto_ram" 00:32:46.007 }, 00:32:46.007 { 00:32:46.007 "nbd_device": "/dev/nbd1", 00:32:46.007 "bdev_name": "crypto_ram2" 00:32:46.007 }, 00:32:46.007 { 00:32:46.007 "nbd_device": "/dev/nbd10", 00:32:46.007 "bdev_name": "crypto_ram3" 00:32:46.007 }, 00:32:46.007 { 00:32:46.007 "nbd_device": "/dev/nbd11", 00:32:46.007 "bdev_name": "crypto_ram4" 00:32:46.007 } 00:32:46.007 ]' 00:32:46.007 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:46.265 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:32:46.265 /dev/nbd1 00:32:46.265 /dev/nbd10 00:32:46.265 /dev/nbd11' 00:32:46.265 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:32:46.265 /dev/nbd1 00:32:46.265 /dev/nbd10 00:32:46.265 /dev/nbd11' 00:32:46.265 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:46.265 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:32:46.265 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:32:46.265 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:32:46.265 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:32:46.265 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:32:46.265 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:46.265 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:46.265 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:32:46.265 23:01:30 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:46.265 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:32:46.265 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:32:46.265 256+0 records in 00:32:46.265 256+0 records out 00:32:46.265 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0109065 s, 96.1 MB/s 00:32:46.265 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:46.265 23:01:30 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:32:46.265 256+0 records in 00:32:46.265 256+0 records out 00:32:46.265 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.06122 s, 17.1 MB/s 00:32:46.265 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:46.265 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:32:46.265 256+0 records in 00:32:46.265 256+0 records out 00:32:46.265 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0649507 s, 16.1 MB/s 00:32:46.265 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:46.265 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:32:46.265 256+0 records in 00:32:46.265 256+0 records out 00:32:46.265 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0600039 s, 17.5 MB/s 00:32:46.265 23:01:31 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:46.265 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:32:46.523 256+0 records in 00:32:46.523 256+0 records out 00:32:46.523 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0575766 s, 18.2 MB/s 00:32:46.523 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:32:46.523 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:46.523 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:46.523 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:32:46.523 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:46.523 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:32:46.523 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:32:46.523 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:46.523 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:32:46.523 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:46.523 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:32:46.523 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:46.523 23:01:31 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:32:46.523 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:46.523 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:32:46.523 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:46.523 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:46.523 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:46.523 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:46.523 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:46.523 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:46.523 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:46.523 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:46.782 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:46.782 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:46.782 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:46.782 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:46.782 23:01:31 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:46.782 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:46.782 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:46.782 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:46.782 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:46.782 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:47.040 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:47.040 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:47.040 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:47.040 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:47.040 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:47.040 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:47.040 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:47.040 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:47.040 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:47.040 23:01:31 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:32:47.297 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:32:47.297 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:32:47.297 23:01:32 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:32:47.297 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:47.297 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:47.297 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:32:47.297 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:47.297 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:47.297 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:47.297 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:32:47.556 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:32:47.556 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:32:47.556 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:32:47.556 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:47.556 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:47.556 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:32:47.556 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:47.556 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:47.556 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:47.556 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:47.556 23:01:32 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:47.814 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:47.814 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:47.814 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:47.814 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:47.814 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:47.814 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:47.814 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:47.814 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:47.814 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:47.814 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:32:47.814 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:32:47.814 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:32:47.814 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:47.814 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:47.814 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:47.814 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:32:47.814 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:32:47.814 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:32:48.072 malloc_lvol_verify 00:32:48.072 23:01:32 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:32:48.330 ab9811e0-c9bd-487d-ad49-79bf852acf48 00:32:48.330 23:01:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:32:48.588 75eb3eaa-dbaa-4143-a451-110ddc7cb851 00:32:48.588 23:01:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:32:48.846 /dev/nbd0 00:32:48.846 23:01:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:32:48.846 mke2fs 1.46.5 (30-Dec-2021) 00:32:48.846 Discarding device blocks: 0/4096 done 00:32:48.846 Creating filesystem with 4096 1k blocks and 1024 inodes 00:32:48.846 00:32:48.846 Allocating group tables: 0/1 done 00:32:48.846 Writing inode tables: 0/1 done 00:32:48.846 Creating journal (1024 blocks): done 00:32:48.846 Writing superblocks and filesystem accounting information: 0/1 done 00:32:48.846 00:32:48.846 23:01:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:32:48.846 23:01:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:32:48.846 23:01:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:48.846 23:01:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:32:48.846 23:01:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 
00:32:48.846 23:01:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:48.846 23:01:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:48.846 23:01:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:49.104 23:01:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:49.104 23:01:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:49.104 23:01:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:49.104 23:01:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:49.104 23:01:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:49.104 23:01:33 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:49.104 23:01:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:49.104 23:01:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:49.104 23:01:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:32:49.104 23:01:34 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:32:49.104 23:01:34 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2885342 00:32:49.104 23:01:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2885342 ']' 00:32:49.104 23:01:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2885342 00:32:49.104 23:01:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:32:49.104 23:01:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:49.361 23:01:34 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2885342 00:32:49.361 23:01:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:49.361 23:01:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:49.361 23:01:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2885342' 00:32:49.361 killing process with pid 2885342 00:32:49.361 23:01:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2885342 00:32:49.361 23:01:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@972 -- # wait 2885342 00:32:49.619 23:01:34 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:32:49.619 00:32:49.619 real 0m10.345s 00:32:49.619 user 0m13.638s 00:32:49.619 sys 0m4.107s 00:32:49.619 23:01:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:49.619 23:01:34 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:49.619 ************************************ 00:32:49.619 END TEST bdev_nbd 00:32:49.619 ************************************ 00:32:49.619 23:01:34 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:32:49.619 23:01:34 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:32:49.619 23:01:34 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = nvme ']' 00:32:49.619 23:01:34 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = gpt ']' 00:32:49.619 23:01:34 blockdev_crypto_aesni -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:32:49.619 23:01:34 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:49.619 23:01:34 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:49.619 23:01:34 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 
00:32:49.619 ************************************ 00:32:49.619 START TEST bdev_fio 00:32:49.619 ************************************ 00:32:49.619 23:01:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:32:49.619 23:01:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:32:49.619 23:01:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:32:49.879 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 
00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram4]' 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram4 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:49.879 ************************************ 00:32:49.879 START TEST bdev_fio_rw_verify 00:32:49.879 ************************************ 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 
00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:32:49.879 23:01:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:50.138 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:50.138 
job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:50.138 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:50.138 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:50.138 fio-3.35 00:32:50.138 Starting 4 threads 00:33:05.069 00:33:05.069 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2887385: Mon Jul 15 23:01:47 2024 00:33:05.069 read: IOPS=19.0k, BW=74.1MiB/s (77.7MB/s)(741MiB/10001msec) 00:33:05.069 slat (usec): min=10, max=1464, avg=71.98, stdev=52.32 00:33:05.069 clat (usec): min=11, max=2449, avg=378.58, stdev=313.70 00:33:05.069 lat (usec): min=47, max=2533, avg=450.56, stdev=351.75 00:33:05.069 clat percentiles (usec): 00:33:05.069 | 50.000th=[ 289], 99.000th=[ 1680], 99.900th=[ 1909], 99.990th=[ 2008], 00:33:05.069 | 99.999th=[ 2311] 00:33:05.070 write: IOPS=21.0k, BW=82.0MiB/s (86.0MB/s)(799MiB/9740msec); 0 zone resets 00:33:05.070 slat (usec): min=18, max=485, avg=85.21, stdev=56.44 00:33:05.070 clat (usec): min=25, max=2775, avg=454.99, stdev=377.96 00:33:05.070 lat (usec): min=53, max=3033, avg=540.20, stdev=420.10 00:33:05.070 clat percentiles (usec): 00:33:05.070 | 50.000th=[ 359], 99.000th=[ 2147], 99.900th=[ 2442], 99.990th=[ 2540], 00:33:05.070 | 99.999th=[ 2638] 00:33:05.070 bw ( KiB/s): min=63424, max=101912, per=97.36%, avg=81774.63, stdev=2434.58, samples=76 00:33:05.070 iops : min=15856, max=25478, avg=20443.63, stdev=608.64, samples=76 00:33:05.070 lat (usec) : 20=0.01%, 50=0.29%, 100=7.16%, 250=28.46%, 500=37.90% 00:33:05.070 lat (usec) : 750=14.39%, 1000=5.18% 00:33:05.070 lat (msec) : 2=6.00%, 4=0.61% 00:33:05.070 cpu : usr=99.55%, sys=0.00%, ctx=69, majf=0, minf=315 00:33:05.070 IO depths : 1=10.6%, 2=25.4%, 4=50.9%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:05.070 submit : 0=0.0%, 4=100.0%, 8=0.0%, 
16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:05.070 complete : 0=0.0%, 4=88.8%, 8=11.2%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:05.070 issued rwts: total=189747,204529,0,0 short=0,0,0,0 dropped=0,0,0,0 00:33:05.070 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:05.070 00:33:05.070 Run status group 0 (all jobs): 00:33:05.070 READ: bw=74.1MiB/s (77.7MB/s), 74.1MiB/s-74.1MiB/s (77.7MB/s-77.7MB/s), io=741MiB (777MB), run=10001-10001msec 00:33:05.070 WRITE: bw=82.0MiB/s (86.0MB/s), 82.0MiB/s-82.0MiB/s (86.0MB/s-86.0MB/s), io=799MiB (838MB), run=9740-9740msec 00:33:05.070 00:33:05.070 real 0m13.637s 00:33:05.070 user 0m46.104s 00:33:05.070 sys 0m0.535s 00:33:05.070 23:01:48 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:05.070 23:01:48 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:33:05.070 ************************************ 00:33:05.070 END TEST bdev_fio_rw_verify 00:33:05.070 ************************************ 00:33:05.070 23:01:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:33:05.070 23:01:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:33:05.070 23:01:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:05.070 23:01:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:33:05.070 23:01:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:05.070 23:01:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:33:05.070 23:01:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:33:05.070 23:01:48 
blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:05.070 23:01:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:05.070 23:01:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:05.070 23:01:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:33:05.070 23:01:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:05.070 23:01:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:05.070 23:01:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:05.070 23:01:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:33:05.070 23:01:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:33:05.070 23:01:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:33:05.070 23:01:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:05.070 23:01:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "b33bc2c9-3aed-54ab-813f-eebfea49634e"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b33bc2c9-3aed-54ab-813f-eebfea49634e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": 
false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "3db66c88-7b48-5a5c-b149-c7962ef5c284"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "3db66c88-7b48-5a5c-b149-c7962ef5c284",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": 
"crypto_ram3",' ' "aliases": [' ' "f7ba4b29-efff-558f-8a08-c582d752cf19"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "f7ba4b29-efff-558f-8a08-c582d752cf19",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "ad7a0f55-e450-5d92-86a7-9611fbf28bf8"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ad7a0f55-e450-5d92-86a7-9611fbf28bf8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:33:05.070 23:01:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:33:05.070 crypto_ram2 00:33:05.070 crypto_ram3 00:33:05.070 crypto_ram4 ]] 00:33:05.070 23:01:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "b33bc2c9-3aed-54ab-813f-eebfea49634e"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b33bc2c9-3aed-54ab-813f-eebfea49634e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' 
{' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "3db66c88-7b48-5a5c-b149-c7962ef5c284"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "3db66c88-7b48-5a5c-b149-c7962ef5c284",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "f7ba4b29-efff-558f-8a08-c582d752cf19"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "f7ba4b29-efff-558f-8a08-c582d752cf19",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' 
' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "ad7a0f55-e450-5d92-86a7-9611fbf28bf8"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ad7a0f55-e450-5d92-86a7-9611fbf28bf8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": 
"crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram4]' 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram4 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 
--verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:05.071 ************************************ 00:33:05.071 START TEST bdev_fio_trim 00:33:05.071 ************************************ 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local 
plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:05.071 23:01:48 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:05.071 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:05.071 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:05.071 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:05.071 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:05.071 fio-3.35 00:33:05.071 Starting 4 threads 00:33:17.272 00:33:17.272 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2889236: Mon Jul 15 23:02:01 2024 00:33:17.272 write: IOPS=34.1k, BW=133MiB/s (140MB/s)(1332MiB/10001msec); 0 zone resets 00:33:17.272 slat (usec): min=18, max=1425, avg=66.49, stdev=27.48 00:33:17.272 clat (usec): min=43, max=1890, avg=296.36, stdev=149.73 00:33:17.272 lat (usec): min=76, max=1951, avg=362.85, stdev=163.33 00:33:17.272 clat percentiles (usec): 00:33:17.272 | 50.000th=[ 265], 99.000th=[ 685], 99.900th=[ 775], 99.990th=[ 1057], 00:33:17.272 | 99.999th=[ 1631] 00:33:17.272 bw ( KiB/s): min=105392, max=210296, per=100.00%, avg=137371.79, stdev=10174.32, samples=76 00:33:17.272 iops : min=26348, max=52574, avg=34342.95, stdev=2543.58, samples=76 00:33:17.272 trim: IOPS=34.1k, BW=133MiB/s (140MB/s)(1332MiB/10001msec); 0 zone resets 00:33:17.272 slat 
(usec): min=4, max=111, avg=19.29, stdev=10.40 00:33:17.272 clat (usec): min=42, max=1734, avg=279.16, stdev=148.88 00:33:17.272 lat (usec): min=51, max=1742, avg=298.45, stdev=155.14 00:33:17.272 clat percentiles (usec): 00:33:17.272 | 50.000th=[ 249], 99.000th=[ 750], 99.900th=[ 840], 99.990th=[ 906], 00:33:17.272 | 99.999th=[ 1057] 00:33:17.272 bw ( KiB/s): min=105384, max=210288, per=100.00%, avg=137373.05, stdev=10174.33, samples=76 00:33:17.272 iops : min=26346, max=52572, avg=34343.26, stdev=2543.58, samples=76 00:33:17.272 lat (usec) : 50=0.08%, 100=5.27%, 250=43.25%, 500=41.15%, 750=9.64% 00:33:17.272 lat (usec) : 1000=0.61% 00:33:17.272 lat (msec) : 2=0.01% 00:33:17.272 cpu : usr=99.61%, sys=0.00%, ctx=58, majf=0, minf=114 00:33:17.272 IO depths : 1=6.0%, 2=26.9%, 4=53.7%, 8=13.4%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:17.272 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:17.272 complete : 0=0.0%, 4=88.2%, 8=11.8%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:17.272 issued rwts: total=0,341038,341038,0 short=0,0,0,0 dropped=0,0,0,0 00:33:17.272 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:17.272 00:33:17.272 Run status group 0 (all jobs): 00:33:17.272 WRITE: bw=133MiB/s (140MB/s), 133MiB/s-133MiB/s (140MB/s-140MB/s), io=1332MiB (1397MB), run=10001-10001msec 00:33:17.272 TRIM: bw=133MiB/s (140MB/s), 133MiB/s-133MiB/s (140MB/s-140MB/s), io=1332MiB (1397MB), run=10001-10001msec 00:33:17.272 00:33:17.272 real 0m13.675s 00:33:17.272 user 0m46.326s 00:33:17.272 sys 0m0.543s 00:33:17.272 23:02:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:17.272 23:02:02 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:33:17.272 ************************************ 00:33:17.272 END TEST bdev_fio_trim 00:33:17.272 ************************************ 00:33:17.533 23:02:02 blockdev_crypto_aesni.bdev_fio -- 
common/autotest_common.sh@1142 -- # return 0
00:33:17.533 23:02:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f
00:33:17.533 23:02:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio
00:33:17.533 23:02:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # popd
00:33:17.533 /var/jenkins/workspace/crypto-phy-autotest/spdk
00:33:17.533 23:02:02 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT
00:33:17.533
00:33:17.533 real	0m27.687s
00:33:17.533 user	1m32.612s
00:33:17.533 sys	0m1.293s
00:33:17.533 23:02:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable
00:33:17.533 23:02:02 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x
00:33:17.533 ************************************
00:33:17.533 END TEST bdev_fio
00:33:17.533 ************************************
00:33:17.533 23:02:02 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:33:17.533 23:02:02 blockdev_crypto_aesni -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT
00:33:17.533 23:02:02 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 ''
00:33:17.533 23:02:02 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:33:17.533 23:02:02 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:33:17.533 23:02:02 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:33:17.533 ************************************
00:33:17.533 START TEST bdev_verify
00:33:17.533 ************************************
00:33:17.533 23:02:02 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 ''
00:33:17.533 [2024-07-15 23:02:02.363256] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
00:33:17.533 [2024-07-15 23:02:02.363325] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2890661 ]
00:33:17.828 [2024-07-15 23:02:02.491132] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:33:17.828 [2024-07-15 23:02:02.592914] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:33:17.828 [2024-07-15 23:02:02.592919] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:33:17.828 [2024-07-15 23:02:02.614328] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:33:17.828 [2024-07-15 23:02:02.622358] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:33:17.828 [2024-07-15 23:02:02.630384] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:33:18.086 [2024-07-15 23:02:02.745596] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:33:20.621 [2024-07-15 23:02:04.974552] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:33:20.621 [2024-07-15 23:02:04.974640] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:33:20.621 [2024-07-15 23:02:04.974655] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:20.621 [2024-07-15 23:02:04.982572] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:33:20.621 [2024-07-15 23:02:04.982594] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:33:20.621 [2024-07-15 23:02:04.982606] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:20.621 [2024-07-15 23:02:04.990593] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:33:20.621 [2024-07-15 23:02:04.990615] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:33:20.621 [2024-07-15 23:02:04.990627] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:20.621 [2024-07-15 23:02:04.998613] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:33:20.621 [2024-07-15 23:02:04.998633] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:33:20.621 [2024-07-15 23:02:04.998645] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:20.621 Running I/O for 5 seconds...
00:33:25.892
00:33:25.892                                                            Latency(us)
00:33:25.892 Device Information                       : runtime(s)  IOPS      MiB/s    Fail/s  TO/s     Average    min        max
00:33:25.892 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:33:25.892 Verification LBA range: start 0x0 length 0x1000
00:33:25.892 crypto_ram                               : 5.08     471.90    1.84     0.00    0.00     269911.66  3704.21    163213.13
00:33:25.892 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:33:25.892 Verification LBA range: start 0x1000 length 0x1000
00:33:25.892 crypto_ram                               : 5.08     378.11    1.48     0.00    0.00     337403.64  16868.40   203332.56
00:33:25.892 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:33:25.892 Verification LBA range: start 0x0 length 0x1000
00:33:25.892 crypto_ram2                              : 5.09     474.88    1.85     0.00    0.00     267843.76  4245.59    151359.67
00:33:25.892 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:33:25.892 Verification LBA range: start 0x1000 length 0x1000
00:33:25.892 crypto_ram2                              : 5.08     378.01    1.48     0.00    0.00     336210.45  18236.10   185096.46
00:33:25.892 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:33:25.892 Verification LBA range: start 0x0 length 0x1000
00:33:25.892 crypto_ram3                              : 5.07     3661.87   14.30    0.00    0.00     34633.65   5556.31    27240.18
00:33:25.893 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:33:25.893 Verification LBA range: start 0x1000 length 0x1000
00:33:25.893 crypto_ram3                              : 5.07     2955.70   11.55    0.00    0.00     42848.34   4729.99    32141.13
00:33:25.893 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:33:25.893 Verification LBA range: start 0x0 length 0x1000
00:33:25.893 crypto_ram4                              : 5.07     3661.19   14.30    0.00    0.00     34547.10   5955.23    26784.28
00:33:25.893 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:33:25.893 Verification LBA range: start 0x1000 length 0x1000
00:33:25.893 crypto_ram4                              : 5.07     2953.50   11.54    0.00    0.00     42749.22   5185.89    31685.23
00:33:25.893 ===================================================================================================================
00:33:25.893 Total                                    : 14935.15  58.34    0.00    0.00     68059.02   3704.21    203332.56
00:33:25.893
00:33:25.893 real	0m8.288s
00:33:25.893 user	0m15.674s
00:33:25.893 sys	0m0.401s
00:33:25.893 23:02:10 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:33:25.893 23:02:10 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:33:25.893 ************************************
00:33:25.893 END TEST bdev_verify
00:33:25.893 ************************************
00:33:25.893 23:02:10 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:33:25.893 23:02:10 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:33:25.893 23:02:10 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:33:25.893 23:02:10 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:33:25.893 23:02:10 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:33:25.893 ************************************
00:33:25.893 START TEST bdev_verify_big_io
00:33:25.893 ************************************
00:33:25.893 23:02:10 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:33:25.893 [2024-07-15 23:02:10.739072] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
00:33:25.893 [2024-07-15 23:02:10.739137] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2891720 ]
00:33:26.152 [2024-07-15 23:02:10.868379] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:33:26.152 [2024-07-15 23:02:10.967959] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:33:26.152 [2024-07-15 23:02:10.967965] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:33:26.152 [2024-07-15 23:02:10.989352] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:33:26.152 [2024-07-15 23:02:10.997384] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:33:26.152 [2024-07-15 23:02:11.005410] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:33:26.411 [2024-07-15 23:02:11.115735] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:33:28.963 [2024-07-15 23:02:13.341628] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:33:28.963 [2024-07-15 23:02:13.341722] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:33:28.963 [2024-07-15 23:02:13.341737] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:28.963 [2024-07-15 23:02:13.349646] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:33:28.963 [2024-07-15 23:02:13.349669] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:33:28.963 [2024-07-15 23:02:13.349680] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:28.963 [2024-07-15 23:02:13.357670] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:33:28.963 [2024-07-15 23:02:13.357690] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:33:28.963 [2024-07-15 23:02:13.357701] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:28.963 [2024-07-15 23:02:13.365690] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:33:28.963 [2024-07-15 23:02:13.365710] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:33:28.963 [2024-07-15 23:02:13.365721] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:28.964 Running I/O for 5 seconds...
00:33:29.530 [2024-07-15 23:02:14.383497] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:33:29.530 [2024-07-15 23:02:14.384060] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:33:29.530 [2024-07-15 23:02:14.384276] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:33:29.530 [2024-07-15 23:02:14.384357] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:33:29.530 [2024-07-15 23:02:14.384419] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:33:29.530 [2024-07-15 23:02:14.384899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:29.530 [2024-07-15 23:02:14.386614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:29.792 [2024-07-15 23:02:14.447856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.447909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.449503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.449587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.449640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.449692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.450252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.450313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.450366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.450426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.451895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.451973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.452025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:29.792 [2024-07-15 23:02:14.452078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.452617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.452676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.452729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.452789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.454394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.454457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.454510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.454562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.455056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.455116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.455175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.455234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:29.792 [2024-07-15 23:02:14.456707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.456774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.456831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.456884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.457416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.457488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.457543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.457599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.459386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.459449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.459503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.459555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.460053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:29.792 [2024-07-15 23:02:14.460115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.460174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.460226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.461820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.461883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.461944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.461997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.462542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.462608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.462667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.462718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.464532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.464594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:29.792 [2024-07-15 23:02:14.464646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.464698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.465192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.465258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.465311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.465362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.466885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.468723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.470585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.472433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.473002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.473501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.475483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:29.792 [2024-07-15 23:02:14.477363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.480796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.482642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.484311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.484799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.487095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.488949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.490795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.792 [2024-07-15 23:02:14.492137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.494508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.495014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.497068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.499056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:29.793 [2024-07-15 23:02:14.501375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.503433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.505489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.507559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.511294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.513142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.514986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.516353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.518645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.520507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.521845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.522344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.525652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:29.793 [2024-07-15 23:02:14.527211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.529032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.530874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.531860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.532446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.534256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.536106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.539338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.541196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.542616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.543114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.545351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.547208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:29.793 [2024-07-15 23:02:14.549068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.550594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.552590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.553210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.555022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.556871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.558733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.560558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.562403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.564235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.567580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.569444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.571284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:29.793 [2024-07-15 23:02:14.572718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.575086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.576941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.577444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.577971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.580785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.582605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.584441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.586294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.587325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.588898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.590712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.592551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:29.793 [2024-07-15 23:02:14.595924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.597786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.598293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.598887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.601118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.602961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.604339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.606150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.608024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.609565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.611386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.613216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.615160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:29.793 [2024-07-15 23:02:14.616990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.618839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.620744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.624536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.626479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.628266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.630321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.632666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.634280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.634775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.636206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.639133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.640956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:29.793 [2024-07-15 23:02:14.642790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.644628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.645719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.647598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.649416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.651282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.654529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.656101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.656597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.658033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.660381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.662114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:29.793 [2024-07-15 23:02:14.663178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:29.793 [2024-07-15 23:02:14.664979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:30.058 [2024-07-15 23:02:14.893145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.893232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.893285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.893337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.893820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.894007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.894066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.894118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.894175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.895591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.895653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.895705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.895770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.058 [2024-07-15 23:02:14.896176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.896359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.896423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.896476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.896533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.897961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.898023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.898075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.898128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.898657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.898833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.898899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.898960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.058 [2024-07-15 23:02:14.899012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.900535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.900607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.900664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.900716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.901059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.901240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.901311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.901363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.901415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.902938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.903001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.903055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.058 [2024-07-15 23:02:14.903108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.903656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.903833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.058 [2024-07-15 23:02:14.903890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.903949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.904000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.905534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.905597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.905650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.905706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.906052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.906228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.906285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.059 [2024-07-15 23:02:14.906336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.906388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.907947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.908009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.908083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.908137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.908600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.908775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.908835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.908887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.908945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.910460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.910523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.059 [2024-07-15 23:02:14.910575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.910633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.910981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.911158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.911214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.911266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.911317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.912856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.912944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.912999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.913055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.913399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.913572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.059 [2024-07-15 23:02:14.913630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.913688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.913741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.915147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.915215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.915267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.915323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.915739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.915914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.915979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.916031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.916089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.917774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.059 [2024-07-15 23:02:14.917836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.917889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.917950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.918285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.918458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.918525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.918583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.918639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.920183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.920244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.920297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.920349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.920750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.059 [2024-07-15 23:02:14.920938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.921004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.921063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.921118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.922956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.923020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.923074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.923125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.923459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.923637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.923698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.923752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.923803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.059 [2024-07-15 23:02:14.925397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.925460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.925511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.925567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.925981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.926160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.926225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.926281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.926333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.927933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.927997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.928049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.928102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.059 [2024-07-15 23:02:14.928434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.928609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.928670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.928722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.928773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.930367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.930433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.930485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.930543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.930875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.931060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.059 [2024-07-15 23:02:14.931118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.931170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.060 [2024-07-15 23:02:14.931221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.932780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.932842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.932906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.932969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.933334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.933509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.933565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.933618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.933669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.935213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.935276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.937319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.060 [2024-07-15 23:02:14.937582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.937760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.937822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.937874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.937934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.939777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.941590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.943424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.945469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.945839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.946023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.947931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.060 [2024-07-15 23:02:14.949943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.060 [2024-07-15 23:02:14.951907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:30.060 (identical error repeated continuously from 23:02:14.951907 through 23:02:15.325337; intermediate duplicates elided)
00:33:30.585 [2024-07-15 23:02:15.325337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:30.585 [2024-07-15 23:02:15.327155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.585 [2024-07-15 23:02:15.327216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.585 [2024-07-15 23:02:15.327625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.585 [2024-07-15 23:02:15.329593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.585 [2024-07-15 23:02:15.331049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.585 [2024-07-15 23:02:15.332780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.585 [2024-07-15 23:02:15.334622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.585 [2024-07-15 23:02:15.338039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.585 [2024-07-15 23:02:15.338114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.585 [2024-07-15 23:02:15.338172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.585 [2024-07-15 23:02:15.338224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.585 [2024-07-15 23:02:15.338636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.585 [2024-07-15 23:02:15.340621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.585 [2024-07-15 23:02:15.340687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.585 [2024-07-15 23:02:15.340740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.585 [2024-07-15 23:02:15.340791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.585 [2024-07-15 23:02:15.342385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.585 [2024-07-15 23:02:15.342451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.585 [2024-07-15 23:02:15.342511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.585 [2024-07-15 23:02:15.342570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.342904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.343094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.343153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.343206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.343258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.344808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.586 [2024-07-15 23:02:15.344877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.344943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.344996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.345409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.345590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.345653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.345704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.345756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.347312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.347382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.347438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.347493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.347842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.586 [2024-07-15 23:02:15.348037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.348095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.348147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.348199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.349812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.349888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.349950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.350003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.350374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.350553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.350615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.350667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.350726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.586 [2024-07-15 23:02:15.352249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.352316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.352375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.352427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.352803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.352996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.353054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.353106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.353159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.354757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.354825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.354877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.354938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.586 [2024-07-15 23:02:15.355356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.355534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.355590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.355642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.355702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.357214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.357297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.357354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.357406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.357798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.357993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.358051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.358105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.586 [2024-07-15 23:02:15.358160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.359781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.359848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.359900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.359959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.360338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.360517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.360578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.360637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.360691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.362217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.362288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.362340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.586 [2024-07-15 23:02:15.362392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.362771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.362960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.363019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.363072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.363126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.364736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.364803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.364855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.364906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.365355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.365540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.365596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.586 [2024-07-15 23:02:15.365648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.365699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.367272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.367334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.367388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.367442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.367990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.368175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.368245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.368312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.368367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.369889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.369960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.586 [2024-07-15 23:02:15.370026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.370079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.586 [2024-07-15 23:02:15.370525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.370704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.370766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.370826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.370877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.372517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.372580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.372632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.372683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.373159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.373339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.587 [2024-07-15 23:02:15.373396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.373447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.373508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.375162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.375225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.375279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.375343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.375833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.376023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.376081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.376135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.376187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.377894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.587 [2024-07-15 23:02:15.377976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.378030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.378081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.378511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.378692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.378750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.378802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.378876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.381044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.381108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.381166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.381220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.381668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.587 [2024-07-15 23:02:15.381848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.381912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.381983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.382050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.383606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.383670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.383723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.383775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.384276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.384459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.384516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.384568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.587 [2024-07-15 23:02:15.384621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.587 [2024-07-15 23:02:15.386344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.849 [last message repeated ~270 times between 23:02:15.386422 and 23:02:15.647538]
00:33:30.849 [2024-07-15 23:02:15.649584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.650071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.650686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.652589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.654433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.656506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.660044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.661255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.661751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.663142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.663540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.665491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.667569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.849 [2024-07-15 23:02:15.668993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.670796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.672939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.674773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.676621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.678695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.679164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.681039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.682876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.684950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.686051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.689499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.691578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.849 [2024-07-15 23:02:15.692940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.694754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.695143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.697340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.697840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.698341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.700211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.703577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.705428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.707500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.708738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.709233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.710487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.849 [2024-07-15 23:02:15.712290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.714118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.716197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.719937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.720445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.720948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.722992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.723332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.725542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.727058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.729102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.731145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.733705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:30.849 [2024-07-15 23:02:15.735529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.737374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.739442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.739930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.741853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.743700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.745769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.746488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.749956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.751884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:30.849 [2024-07-15 23:02:15.753774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.755753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.756114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.108 [2024-07-15 23:02:15.757795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.758320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.759496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.761324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.764694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.766534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.768614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.769122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.769659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.771830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.773904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.775983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.777536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.108 [2024-07-15 23:02:15.780439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.780955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.782054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.783856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.784245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.786443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.787548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.789360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.791195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.793248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.793754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.794260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.108 [2024-07-15 23:02:15.794980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.108 [2024-07-15 23:02:15.795377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.796874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.798750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.800733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.801242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.804851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.806033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.807831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.809646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.810157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.810769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.811281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.811773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.109 [2024-07-15 23:02:15.812292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.814636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.815152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.815648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.816174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.816658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.817286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.817791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.818292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.818788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.821479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.821997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.822492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.109 [2024-07-15 23:02:15.822990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.823443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.824067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.824567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.825082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.825578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.827953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.828461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.828968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.829465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.830039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.830651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.831163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.109 [2024-07-15 23:02:15.831659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.832159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.834679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.835203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.835698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.835757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.836307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.836922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.837431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.837940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.838435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.840715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.840787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.109 [2024-07-15 23:02:15.840840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.840894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.841399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.843026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.843095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.843151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.843204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.845052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.845128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.845182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.845234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.845570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.845750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.109 [2024-07-15 23:02:15.845806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.845858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.845913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.847867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.847952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.848029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.848108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.848445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.848639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.848698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.848751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.848804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.109 [2024-07-15 23:02:15.850641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.109 [2024-07-15 23:02:15.850713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:31.109 [... previous message repeated for timestamps 23:02:15.850778 through 23:02:16.051734 ...]
00:33:31.372 [2024-07-15 23:02:16.053972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:31.372 [2024-07-15 23:02:16.054471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.055220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.057029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.060440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.062507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.064587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.065092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.067420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.069351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.071455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.073081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.076135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.076637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.372 [2024-07-15 23:02:16.077628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.079449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.081972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.083012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.084805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.086869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.090694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.092759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.094833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.096175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.098665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.100754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.102107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.372 [2024-07-15 23:02:16.102603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.106205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.107379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.109182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.111248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.112192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.112693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.114723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.116788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.120328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.122422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.372 [2024-07-15 23:02:16.123462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.123954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.373 [2024-07-15 23:02:16.126213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.128295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.130350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.131750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.133697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.134318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.136128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.138206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.139554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.141364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.143424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.145488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.149033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.373 [2024-07-15 23:02:16.151119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.153127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.154614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.157173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.159240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.159736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.160232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.162523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.164335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.166410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.168480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.169607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.171241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.373 [2024-07-15 23:02:16.173054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.175102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.177320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.177824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.178325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.178822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.180354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.182167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.184111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.184983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.186934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.188136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.189947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.373 [2024-07-15 23:02:16.191991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.194433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.196507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.197808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.198324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.201480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.201985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.202482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.202990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.204130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.204640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.205141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.205634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.373 [2024-07-15 23:02:16.207879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.208404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.208903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.209402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.210471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.210979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.211471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.211968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.214494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.215004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.215498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.215998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.217135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.373 [2024-07-15 23:02:16.217641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.218140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.218628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.221100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.221606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.222102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.222596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.223647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.224154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.224649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.225148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.227459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.227966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.373 [2024-07-15 23:02:16.228462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.228969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.230012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.230516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.231037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.231529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.233949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.234454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.234959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.235453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.236551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.237063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.237558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.373 [2024-07-15 23:02:16.238054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.240515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.241026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.241520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.242018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.243041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.243543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.373 [2024-07-15 23:02:16.244038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.374 [2024-07-15 23:02:16.244533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.374 [2024-07-15 23:02:16.246734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.374 [2024-07-15 23:02:16.247268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.374 [2024-07-15 23:02:16.247763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.374 [2024-07-15 23:02:16.248262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.374 [2024-07-15 23:02:16.249390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.374 [2024-07-15 23:02:16.249911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.374 [2024-07-15 23:02:16.250438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.374 [2024-07-15 23:02:16.250933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.374 [2024-07-15 23:02:16.254586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.374 [2024-07-15 23:02:16.255720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.374 [2024-07-15 23:02:16.256216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.374 [2024-07-15 23:02:16.257699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.374 [2024-07-15 23:02:16.260270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.374 [2024-07-15 23:02:16.261276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.374 [2024-07-15 23:02:16.263095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.374 [2024-07-15 23:02:16.265164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.374 [2024-07-15 23:02:16.265214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.374 [2024-07-15 23:02:16.277530] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:33:31.634 [2024-07-15 23:02:16.332039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.634 [2024-07-15 23:02:16.332097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.634 [2024-07-15 23:02:16.332152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.634 [2024-07-15 23:02:16.332654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.634 [2024-07-15 23:02:16.332712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.634 [2024-07-15 23:02:16.332763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.634 [2024-07-15 23:02:16.332815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.634 [2024-07-15 23:02:16.333301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.634 [2024-07-15 23:02:16.334525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.634 [2024-07-15 23:02:16.334597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.634 [2024-07-15 23:02:16.334649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.634 [2024-07-15 23:02:16.334701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.634 [2024-07-15 23:02:16.335291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.634 [2024-07-15 23:02:16.335351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.634 [2024-07-15 23:02:16.335405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.634 [2024-07-15 23:02:16.335457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.634 [2024-07-15 23:02:16.335952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.634 [2024-07-15 23:02:16.337109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.634 [2024-07-15 23:02:16.337171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.634 [2024-07-15 23:02:16.337223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.634 [2024-07-15 23:02:16.337275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.337771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.337831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.337883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.337943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.338274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.635 [2024-07-15 23:02:16.339456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.339526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.339581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.339633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.340219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.340280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.340332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.340384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.340834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.342013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.342076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.342132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.342184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.635 [2024-07-15 23:02:16.342682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.342742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.342795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.342851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.343191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.344372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.344435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.344487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.344550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.345273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.345333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.345395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.345447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.635 [2024-07-15 23:02:16.345890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.347109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.347181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.347233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.347285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.347857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.347922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.347982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.348034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.348416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.349575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.349639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.349692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.635 [2024-07-15 23:02:16.349763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.350453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.350513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.350567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.350620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.350958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.352139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.352202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.352258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.352317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.352969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.353028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.353080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.635 [2024-07-15 23:02:16.353146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.353475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.354664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.354729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.354782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.354834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.355480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.355556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.355611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.355669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.356043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.357202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.357270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.635 [2024-07-15 23:02:16.357324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.357376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.357876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.357945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.357998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.358049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.358381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.359548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.359610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.359665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.359717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.360417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.360477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.635 [2024-07-15 23:02:16.360530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.360581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.360956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.362111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.362176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.362232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.362294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.362791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.362849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.362902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.362967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.363297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.367115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.635 [2024-07-15 23:02:16.367180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.367237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.367290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.635 [2024-07-15 23:02:16.367787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.367855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.367908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.367968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.368366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.373454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.373520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.373572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.373624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.374257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.636 [2024-07-15 23:02:16.374317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.374375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.374426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.374795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.379124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.379188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.379240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.379292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.379793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.379852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.379905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.379964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.380399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.636 [2024-07-15 23:02:16.383811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.383881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.383947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.384000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.384629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.384695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.384748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.384805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.385207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.389118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.389188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.389240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.389292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.636 [2024-07-15 23:02:16.389828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.389887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.389946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.390005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.390336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.395087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.395153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.395205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.395257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.395871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.395940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.395994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.636 [2024-07-15 23:02:16.396047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:31.636 [2024-07-15 23:02:16.396578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:31.897 [... previous *ERROR* line repeated, identical except for timestamp, from 23:02:16.396578 through 23:02:16.797308 ...]
00:33:31.897 [2024-07-15 23:02:16.799101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.897 [2024-07-15 23:02:16.799500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.897 [2024-07-15 23:02:16.801249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.897 [2024-07-15 23:02:16.802936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.897 [2024-07-15 23:02:16.803204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:31.897 [2024-07-15 23:02:16.803221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.804612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.805017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.806507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.807644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.807915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.808417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.810413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.159 [2024-07-15 23:02:16.811848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.813558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.813830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.813847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.816721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.818265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.818815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.820598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.820866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.821869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.822271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.822657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.823236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.159 [2024-07-15 23:02:16.823505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.823522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.827861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.828469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.828859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.829249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.829524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.831265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.832945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.833603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.835166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.835519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.835535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.159 [2024-07-15 23:02:16.839847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.841361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.841779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.843685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.843968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.845118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.845514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.845901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.846398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.846668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.846684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.850390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.850791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.159 [2024-07-15 23:02:16.851189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.851579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.851997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.852497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.852892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.853290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.853685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.854055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.854072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.159 [2024-07-15 23:02:16.856476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.856879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.857282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.857670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.160 [2024-07-15 23:02:16.858049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.858546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.858956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.859350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.859740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.860039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.860057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.862282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.862699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.863096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.863488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.863825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.864324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.160 [2024-07-15 23:02:16.864718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.865115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.865510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.865844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.865861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.868103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.868507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.868911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.869302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.869576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.869739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.870152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.870209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.160 [2024-07-15 23:02:16.870615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.871096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.871114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.873171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.873568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.873962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.874033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.874410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.874556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.874965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.875017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.875404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.875738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.160 [2024-07-15 23:02:16.875760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.877988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.878059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.878446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.878489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.878828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.879001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.879397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.879796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.880191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.880710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.880727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.882944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.160 [2024-07-15 23:02:16.883000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.883387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.883442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.883913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.884066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.884464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.884852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.885245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.885569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.885585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.887954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.888009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.888411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.160 [2024-07-15 23:02:16.888460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.888772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.889285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.889682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.889733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.890127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.890462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.890479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.892693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.892758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.893160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.893212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.893633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.160 [2024-07-15 23:02:16.893796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.894198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.894261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.894646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.894997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.895014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.897034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.897448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.897497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.897886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.898294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.898790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.898843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.160 [2024-07-15 23:02:16.899239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.899633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.899975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.899993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.902187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.902266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.902656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.903050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.903097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.903593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.903610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.903761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.160 [2024-07-15 23:02:16.904165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.160 [2024-07-15 23:02:16.904562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:32.161 [... same message repeated through 2024-07-15 23:02:16.950763 ...]
00:33:32.161 [2024-07-15 23:02:16.951171] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:33:32.161 [... same message repeated through 2024-07-15 23:02:16.968934 ...]
00:33:32.161 [2024-07-15 23:02:16.969768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:32.163 [... same message repeated through 2024-07-15 23:02:17.045705 ...]
00:33:32.163 [2024-07-15 23:02:17.045763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.046285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.046304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.050423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.050480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.050521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.050562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.050824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.050840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.051003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.051049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.051091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.051133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.163 [2024-07-15 23:02:17.051451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.051467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.055765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.055819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.055860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.055901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.056309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.056325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.056476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.056522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.056564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.057207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.057471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.163 [2024-07-15 23:02:17.057487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.061568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.061625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.061667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.061708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.062006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.062023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.062175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.063507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.163 [2024-07-15 23:02:17.063555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.065183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.065451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.065467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.424 [2024-07-15 23:02:17.068518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.069743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.069792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.071155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.071420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.071437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.071592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.073257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.073306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.075001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.075392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.075409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.079700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.424 [2024-07-15 23:02:17.081478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.081526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.081910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.082189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.082206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.082355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.082401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.082448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.083601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.083874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.083890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.088493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.090024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.424 [2024-07-15 23:02:17.090073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.091428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.091699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.091716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.091868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.091915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.091963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.092004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.092269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.092286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.095396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.097229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.097290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.424 [2024-07-15 23:02:17.099274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.099543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.099560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.099713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.101458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.101506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.101906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.102182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.424 [2024-07-15 23:02:17.102200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.106386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.106891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.106945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.106993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.425 [2024-07-15 23:02:17.107343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.107360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.107510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.107947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.107995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.108036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.108303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.108319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.113461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.113520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.113561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.114922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.115193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.425 [2024-07-15 23:02:17.115209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.117100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.117154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.117207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.119190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.119580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.119597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.122775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.122827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.124568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.124615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.124880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.124897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.425 [2024-07-15 23:02:17.125054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.125101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.126140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.126189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.126457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.126473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.131462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.131518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.131904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.131953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.132359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.132376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.133564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.425 [2024-07-15 23:02:17.133618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.134987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.135034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.135296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.135312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.140061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.140117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.141497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.141543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.141807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.141823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.143911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.143972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.425 [2024-07-15 23:02:17.144358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.144400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.144681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.144697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.149002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.149059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.150444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.150488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.150800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.150816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.152327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.152380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.154139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.425 [2024-07-15 23:02:17.154185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.154450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.154466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.158508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.158565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.159917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.425 [2024-07-15 23:02:17.159971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.426 [2024-07-15 23:02:17.160238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.426 [2024-07-15 23:02:17.160256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.426 [2024-07-15 23:02:17.162197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.426 [2024-07-15 23:02:17.162253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.426 [2024-07-15 23:02:17.164008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.426 [2024-07-15 23:02:17.164054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.426 [2024-07-15 23:02:17.164416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.426 [... previous line repeated continuously from 23:02:17.164416 through 23:02:17.208695 ...] 
00:33:32.426 [2024-07-15 23:02:17.208834] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:32.426 [... "Failed to get dst_mbufs!" repeated five times between 23:02:17.208834 and 23:02:17.211331 ...] 
00:33:32.426 [2024-07-15 23:02:17.211657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.689 [... "Failed to get src_mbufs!" repeated continuously through 23:02:17.407463 ...] 
00:33:32.690 [2024-07-15 23:02:17.407855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.408198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.411138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.412106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.413459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.413507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.413778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.414498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.414555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.416231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.416619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.416891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.421278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.690 [2024-07-15 23:02:17.422677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.422732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.423603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.423878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.424043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.425134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.426359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.426408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.426753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.432787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.432848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.433599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.433646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.690 [2024-07-15 23:02:17.433916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.435529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.435585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.436194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.436244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.436515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.440794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.440851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.442845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.442897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.443293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.445294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.445349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.690 [2024-07-15 23:02:17.446816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.446866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.447201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.451611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.451677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.453527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.453579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.453850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.455709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.455765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.456315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.456362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.456634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.690 [2024-07-15 23:02:17.460221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.460277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.460782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.460827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.461108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.463144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.463202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.465196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.465253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.465527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.470149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.470209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.471306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.690 [2024-07-15 23:02:17.471353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.471623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.472132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.472188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.473976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.474023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.474495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.480708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.480770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.481205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.481257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.481532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.483374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.690 [2024-07-15 23:02:17.483430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.484857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.484904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.485243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.489932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.489990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.490376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.490422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.490691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.492422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.492477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.494192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.495941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.690 [2024-07-15 23:02:17.496275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.501663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.501721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.502836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.502884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.503162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.503327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.690 [2024-07-15 23:02:17.503721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.503771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.505738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.506253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.511971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.512029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.691 [2024-07-15 23:02:17.512070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.512121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.512594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.512748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.512793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.514486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.514549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.514820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.520091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.520148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.520190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.520233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.520503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.691 [2024-07-15 23:02:17.521607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.521660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.521701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.521743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.522024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.526133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.526186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.526227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.526267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.526717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.526872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.526917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.526967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.691 [2024-07-15 23:02:17.527008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.527275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.531428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.531481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.531523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.531570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.531840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.532009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.532083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.532125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.532165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.532603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.536682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.691 [2024-07-15 23:02:17.536735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.536776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.536817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.537097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.537255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.537300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.537341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.537382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.537650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.541639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.541701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.541743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.541790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.691 [2024-07-15 23:02:17.542066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.542219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.542272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.542314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.542377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.542645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.545065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.545117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.545164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.545208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.545731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.545886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.545939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.691 [2024-07-15 23:02:17.545987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.546028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.546295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.547310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.547361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.547410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.547451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.547722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.547875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.547922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.547977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.548019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.691 [2024-07-15 23:02:17.548289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.954 [2024-07-15 23:02:17.701185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.701583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.703303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.704671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.704954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.706973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.708969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.709518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.711301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.711573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.714322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.715436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.716686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.954 [2024-07-15 23:02:17.717299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.717570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.718077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.719460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.720831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.722303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.722577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.724808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.726166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.727602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.729577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.729879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.731077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.954 [2024-07-15 23:02:17.732791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.733183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.735162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.735600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.738208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.739603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.741337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.742506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.742782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.744223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.745869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.747858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.749338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.954 [2024-07-15 23:02:17.749676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.754097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.756088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.757645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.759008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.759281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.761136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.761945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.763476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.765456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.765789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.769702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.771409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.954 [2024-07-15 23:02:17.771797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.773396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.773739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.775326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.777294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.779276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.779668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.779945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.782754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.784586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.785519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.787038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.787484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.954 [2024-07-15 23:02:17.789570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.954 [2024-07-15 23:02:17.789975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.791382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.792737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.793016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.795034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.796451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.797823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.799275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.799549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.800076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.801995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.802387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.955 [2024-07-15 23:02:17.804226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.804646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.806711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.808434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.809685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.810976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.811329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.813201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.815129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.816515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.817483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.817755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.819942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.955 [2024-07-15 23:02:17.820341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.822089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.823301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.823576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.824690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.826064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.827851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.829161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.829463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.831091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.832869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.834545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.835198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.955 [2024-07-15 23:02:17.835530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.836788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.838774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.839789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.841068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.841451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.842985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.844682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.846459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.847015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.847364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.848473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.850451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:32.955 [2024-07-15 23:02:17.851722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.852797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.853139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.854556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.856509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.856902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.858236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:32.955 [2024-07-15 23:02:17.858588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.216 [2024-07-15 23:02:17.860565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.216 [2024-07-15 23:02:17.861464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.216 [2024-07-15 23:02:17.862945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.216 [2024-07-15 23:02:17.863335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.863607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.217 [2024-07-15 23:02:17.866143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.867452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.868491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.869432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.869708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.871348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.872272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.874030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.874418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.874691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.877268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.879248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.879639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.217 [2024-07-15 23:02:17.880111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.880385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.880540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.882313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.882361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.882815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.883094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.885169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.885566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.885973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.887951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.888358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.888512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.217 [2024-07-15 23:02:17.889983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.890031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.890709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.890993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.892217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.892622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.892671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.893069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.893402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.893556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.894949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.895427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.217 [2024-07-15 23:02:17.897292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.217 [2024-07-15 23:02:17.897816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "Failed to get src_mbufs!" error repeated ~270 times between 23:02:17.897816 and 23:02:18.012791; duplicate lines elided ...]
00:33:33.220 [2024-07-15 23:02:18.012832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.012873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.013286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.013437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.013484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.013540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.013582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.013915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.014790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.014840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.014881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.014938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.015440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.220 [2024-07-15 23:02:18.015597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.015648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.015688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.015729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.016005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.017235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.017284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.017325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.017366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.017780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.019402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.019459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.020111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.220 [2024-07-15 23:02:18.020159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.020427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.021390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.021447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.021488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.021532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.021800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.022301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.022371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.024349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.024399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.024929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.027503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.220 [2024-07-15 23:02:18.027561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.027957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.028015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.028285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.029274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.029330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.029371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.029411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.029685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.031228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.031287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.032643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.032702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.220 [2024-07-15 23:02:18.033047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.035064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.035117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.035186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.035228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.035495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.036864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.036921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.038451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.038509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.038778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.038934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.038981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.220 [2024-07-15 23:02:18.039981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.040028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.040299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.041858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.041914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.042772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.042820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.043095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.045168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.045230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.045623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.045674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.045946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.220 [2024-07-15 23:02:18.048043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.048102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.048143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.048869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.049151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.049301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.050509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.050556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.220 [2024-07-15 23:02:18.050601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.050875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.051898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.051957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.053036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.221 [2024-07-15 23:02:18.053084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.053409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.054437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.054493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.054534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.055531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.055855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.056992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.058688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.058737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.060105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.060452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.060602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.221 [2024-07-15 23:02:18.062122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.062169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.063850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.064212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.065074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.066273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.066322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.067266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.067573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.067724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.069612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.069661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.071348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.221 [2024-07-15 23:02:18.071620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.072571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.073778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.073827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.075370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.075639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.075785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.077020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.077068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.077996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.078308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.079370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.080970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.221 [2024-07-15 23:02:18.081019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.082378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.082686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.082834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.084567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.084615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.086243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.086578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.087456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.088850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.088900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.090294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.090565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.221 [2024-07-15 23:02:18.090712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.092443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.092490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.092911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.093188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.094359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.096346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.096398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.098206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.098554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.098704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.100448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.100496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.221 [2024-07-15 23:02:18.101476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.101746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.102603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.104368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.104416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.105830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.106188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.106338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.108182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.108228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.108270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.221 [2024-07-15 23:02:18.108678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.222 [2024-07-15 23:02:18.109618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.222 [2024-07-15 23:02:18.110743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:33.746 [... identical *ERROR* line repeated continuously through 2024-07-15 23:02:18.399290 ...]
00:33:33.746 [2024-07-15 23:02:18.399560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.401218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.401272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.401809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.401855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.402132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.405144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.405205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.406898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.406947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.407275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.409117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.409172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.746 [2024-07-15 23:02:18.410291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.410335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.410606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.412985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.413043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.414411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.414456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.414800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.416881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.416947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.417333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.417395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.417666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.746 [2024-07-15 23:02:18.420206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.420266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.422198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.422242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.422514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.424353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.424407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.425019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.425074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.425346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.428230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.428293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.430193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.746 [2024-07-15 23:02:18.430239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.430573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.432066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.432120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.432819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.434464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.434931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.437478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.437542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.439524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.439582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.439969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.440127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.746 [2024-07-15 23:02:18.442114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.442166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.444010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.444344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.446096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.446153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.446196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.446240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.446513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.446663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.446709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.447275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.447327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.746 [2024-07-15 23:02:18.447599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.449638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.449709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.449750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.449791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.450180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.452075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.452132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.452182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.452224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.452496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.453337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.746 [2024-07-15 23:02:18.453388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.747 [2024-07-15 23:02:18.453444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.453487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.453756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.453910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.453963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.454006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.454046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.454524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.455390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.455441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.455482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.455523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.455797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.747 [2024-07-15 23:02:18.455959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.456004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.456046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.456091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.456364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.457294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.457360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.457402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.457443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.457713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.457872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.457941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.457984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.747 [2024-07-15 23:02:18.458037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.458308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.459257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.459307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.459357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.459405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.459869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.460033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.460079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.460120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.460165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.460434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.461722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.747 [2024-07-15 23:02:18.461773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.461814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.461854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.462192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.462347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.462392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.462433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.462474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.462849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.463723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.463778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.463819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.463860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.747 [2024-07-15 23:02:18.464139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.464295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.464343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.464384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.464425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.464703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.465624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.465675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.465716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.465757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.466062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.466214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.466260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.747 [2024-07-15 23:02:18.466302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.466343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.466673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.467629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.467686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.467731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.467776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.468053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.468202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.468270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.468323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.468368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.468638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.747 [2024-07-15 23:02:18.469575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.469626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.469675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.469718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.470195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.470354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.470403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.470444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.470485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.470749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.471986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.472036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.747 [2024-07-15 23:02:18.472076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.747 [2024-07-15 23:02:18.472117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:33.750 [2024-07-15 23:02:18.601258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.750 [2024-07-15 23:02:18.602829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.750 [2024-07-15 23:02:18.603723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.750 [2024-07-15 23:02:18.604037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.750 [2024-07-15 23:02:18.605420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.750 [2024-07-15 23:02:18.605817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.750 [2024-07-15 23:02:18.606781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.750 [2024-07-15 23:02:18.608153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.750 [2024-07-15 23:02:18.608427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.750 [2024-07-15 23:02:18.609816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.750 [2024-07-15 23:02:18.610819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.750 [2024-07-15 23:02:18.612457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.750 [2024-07-15 23:02:18.613303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.750 [2024-07-15 23:02:18.613575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.750 [2024-07-15 23:02:18.615271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.750 [2024-07-15 23:02:18.616409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.750 [2024-07-15 23:02:18.617785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.750 [2024-07-15 23:02:18.619523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.750 [2024-07-15 23:02:18.619888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.621697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.623164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.624029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.624439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.624864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.627478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.628428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.751 [2024-07-15 23:02:18.629601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.630744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.631092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.631586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.631989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.633981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.634373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.634646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.636132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.636536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.638526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.640266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.640541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:33.751 [2024-07-15 23:02:18.641659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.643350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.644559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.646059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.646332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.647794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:33.751 [2024-07-15 23:02:18.649716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.650954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.652305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.652579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.654612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.655026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.656970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:34.013 [2024-07-15 23:02:18.658645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.658982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.660352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.660748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.661141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.662865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.663203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.664711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.666543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.668540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.669155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.669427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.671968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:34.013 [2024-07-15 23:02:18.673966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.674363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.674749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.675131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.677231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.678938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.680330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.682072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.682348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.685223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.686897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.688278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.689480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:34.013 [2024-07-15 23:02:18.689984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.690485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.691659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.693028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.694753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.695032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.697295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.698662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.700097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.702038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.702313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.703239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.703635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:34.013 [2024-07-15 23:02:18.704046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.705742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.706080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.707356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.709355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.710857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.712256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.712534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.712685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.714557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.714604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.714998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.715335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:34.013 [2024-07-15 23:02:18.717567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.717980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.719978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.720372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.720830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.720993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.721387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.721435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.723419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.723762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.724638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.013 [2024-07-15 23:02:18.725052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.725119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:34.014 [2024-07-15 23:02:18.725505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.725981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.726140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.727965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.728357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.729415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.729692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.730636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.731216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.731267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.732128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.732404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.732557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:34.014 [2024-07-15 23:02:18.734353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.734745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.735905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.736191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.737346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.738767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.738818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.740340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.740617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.742224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.742825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.742874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.744388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:34.014 [2024-07-15 23:02:18.744665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.745863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.746280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.746327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.747993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.748325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.748481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.749879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.749936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.751647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.751919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.752946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.754782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:34.014 [2024-07-15 23:02:18.756188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.756236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.756576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.757089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.757143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.757527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.759306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.759669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.762381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.762825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.762882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.764764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.014 [2024-07-15 23:02:18.765099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:34.014 [2024-07-15 23:02:18.765256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:34.017 (identical *ERROR* message from accel_dpdk_cryptodev.c:468 repeated continuously for each failed allocation attempt, from [2024-07-15 23:02:18.765256] through [2024-07-15 23:02:18.876169])
00:33:34.017 [2024-07-15 23:02:18.876209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.876478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.879195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.879254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.880392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.880443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.880784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.882583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.882643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.882685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.882725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.882998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.884319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:34.017 [2024-07-15 23:02:18.884376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.884761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.884803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.885153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.885305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.885352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.886574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.886625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.886894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.888986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.889044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.889913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.889966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:34.017 [2024-07-15 23:02:18.890383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.890874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.890938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.892381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.892429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.892756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.895735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.895797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.895847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.897447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.897795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.897956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.898349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:34.017 [2024-07-15 23:02:18.898393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.898438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.898797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.899838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.899889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.901111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.901156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.901482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.902973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.903028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.903077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.905040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.905428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:34.017 [2024-07-15 23:02:18.906477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.907821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.017 [2024-07-15 23:02:18.907873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.018 [2024-07-15 23:02:18.908644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.018 [2024-07-15 23:02:18.908985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.018 [2024-07-15 23:02:18.909135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.018 [2024-07-15 23:02:18.910342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.018 [2024-07-15 23:02:18.910407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.018 [2024-07-15 23:02:18.912396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.018 [2024-07-15 23:02:18.912754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.018 [2024-07-15 23:02:18.914011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.018 [2024-07-15 23:02:18.915947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.018 [2024-07-15 23:02:18.916000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:34.018 [2024-07-15 23:02:18.917109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.018 [2024-07-15 23:02:18.917388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.018 [2024-07-15 23:02:18.917539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.276 [2024-07-15 23:02:18.918291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.276 [2024-07-15 23:02:18.918343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.276 [2024-07-15 23:02:18.919723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.276 [2024-07-15 23:02:18.920017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.276 [2024-07-15 23:02:18.921343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.276 [2024-07-15 23:02:18.921741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.276 [2024-07-15 23:02:18.921785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.276 [2024-07-15 23:02:18.923774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.276 [2024-07-15 23:02:18.924117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.276 [2024-07-15 23:02:18.924271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:34.276 [2024-07-15 23:02:18.925124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.276 [2024-07-15 23:02:18.925177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.276 [2024-07-15 23:02:18.926469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.276 [2024-07-15 23:02:18.926776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.276 [2024-07-15 23:02:18.927779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.276 [2024-07-15 23:02:18.927854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.277 [2024-07-15 23:02:18.928677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.277 [2024-07-15 23:02:18.928730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.277 [2024-07-15 23:02:18.929126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.277 [2024-07-15 23:02:18.930359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:34.277 [2024-07-15 23:02:18.930452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:34.843
00:33:34.843 Latency(us)
00:33:34.843 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:33:34.843 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:34.843 Verification LBA range: start 0x0 length 0x100
00:33:34.843 crypto_ram : 5.84 43.82 2.74 0.00 0.00 2837227.97 73856.22 2567643.49
00:33:34.843 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:34.843 Verification LBA range: start 0x100 length 0x100
00:33:34.843 crypto_ram : 6.05 39.80 2.49 0.00 0.00 3061474.93 132211.76 3151198.83
00:33:34.843 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:34.843 Verification LBA range: start 0x0 length 0x100
00:33:34.843 crypto_ram2 : 5.84 43.81 2.74 0.00 0.00 2729586.87 73400.32 2567643.49
00:33:34.843 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:34.843 Verification LBA range: start 0x100 length 0x100
00:33:34.843 crypto_ram2 : 6.07 41.87 2.62 0.00 0.00 2835482.68 46274.11 3151198.83
00:33:34.843 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:34.843 Verification LBA range: start 0x0 length 0x100
00:33:34.843 crypto_ram3 : 5.59 276.25 17.27 0.00 0.00 411354.21 22681.15 598144.22
00:33:34.843 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:34.843 Verification LBA range: start 0x100 length 0x100
00:33:34.843 crypto_ram3 : 5.70 224.25 14.02 0.00 0.00 497799.63 82974.27 620027.55
00:33:34.843 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:34.843 Verification LBA range: start 0x0 length 0x100
00:33:34.843 crypto_ram4 : 5.70 295.00 18.44 0.00 0.00 373217.40 6553.60 517905.36
00:33:34.843 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:34.843 Verification LBA range: start 0x100 length 0x100
00:33:34.843 crypto_ram4 : 5.85 242.53 15.16 0.00 0.00 445790.05 4616.01 638263.65
00:33:34.843 ===================================================================================================================
00:33:34.843 Total : 1207.32 75.46 0.00 0.00 781348.46 4616.01 3151198.83
00:33:35.102
00:33:35.102 real 0m9.273s
00:33:35.102 user 0m17.584s
00:33:35.102 sys 0m0.460s
00:33:35.102 23:02:19 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:33:35.102 23:02:19 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:33:35.102 ************************************
00:33:35.102 END TEST bdev_verify_big_io
00:33:35.102 ************************************
00:33:35.102 23:02:19 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:33:35.102 23:02:19 blockdev_crypto_aesni -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:33:35.102 23:02:19 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:33:35.102 23:02:19 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:33:35.102 23:02:19 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:33:35.361 ************************************
00:33:35.361 START TEST bdev_write_zeroes
00:33:35.361 ************************************
00:33:35.361 23:02:20 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:33:35.361 [2024-07-15 23:02:20.099894] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
00:33:35.361 [2024-07-15 23:02:20.099973] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2892819 ] 00:33:35.361 [2024-07-15 23:02:20.229877] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:35.622 [2024-07-15 23:02:20.332369] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:35.622 [2024-07-15 23:02:20.353670] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:33:35.622 [2024-07-15 23:02:20.361697] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:35.622 [2024-07-15 23:02:20.369717] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:35.622 [2024-07-15 23:02:20.481909] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:33:38.167 [2024-07-15 23:02:22.709491] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:33:38.167 [2024-07-15 23:02:22.709565] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:38.168 [2024-07-15 23:02:22.709580] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:38.168 [2024-07-15 23:02:22.717509] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:33:38.168 [2024-07-15 23:02:22.717530] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:38.168 [2024-07-15 23:02:22.717542] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:38.168 [2024-07-15 23:02:22.725530] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found 
key "test_dek_aesni_cbc_3"
00:33:38.168 [2024-07-15 23:02:22.725548] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:33:38.168 [2024-07-15 23:02:22.725560] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:38.168 [2024-07-15 23:02:22.733549] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:33:38.168 [2024-07-15 23:02:22.733567] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:33:38.168 [2024-07-15 23:02:22.733578] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:38.168 Running I/O for 1 seconds...
00:33:39.107
00:33:39.107 Latency(us)
00:33:39.107 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:33:39.107 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:39.107 crypto_ram : 1.03 1956.01 7.64 0.00 0.00 64897.46 5499.33 77959.35
00:33:39.107 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:39.107 crypto_ram2 : 1.03 1961.76 7.66 0.00 0.00 64355.67 5470.83 72488.51
00:33:39.107 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:39.107 crypto_ram3 : 1.02 15040.87 58.75 0.00 0.00 8380.20 2493.22 10884.67
00:33:39.107 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:39.107 crypto_ram4 : 1.02 15078.15 58.90 0.00 0.00 8333.40 2493.22 8719.14
00:33:39.107 ===================================================================================================================
00:33:39.107 Total : 34036.80 132.96 0.00 0.00 14861.55 2493.22 77959.35
00:33:39.366
00:33:39.366 real 0m4.203s
00:33:39.366 user 0m3.765s
00:33:39.366 sys 0m0.391s
00:33:39.366 23:02:24 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:33:39.366
23:02:24 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:33:39.366 ************************************ 00:33:39.366 END TEST bdev_write_zeroes 00:33:39.366 ************************************ 00:33:39.624 23:02:24 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:33:39.624 23:02:24 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:39.624 23:02:24 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:39.624 23:02:24 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:39.624 23:02:24 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:39.624 ************************************ 00:33:39.624 START TEST bdev_json_nonenclosed 00:33:39.625 ************************************ 00:33:39.625 23:02:24 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:39.625 [2024-07-15 23:02:24.383421] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:33:39.625 [2024-07-15 23:02:24.383470] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2893413 ] 00:33:39.625 [2024-07-15 23:02:24.495330] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:39.884 [2024-07-15 23:02:24.601186] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:39.884 [2024-07-15 23:02:24.601260] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:33:39.884 [2024-07-15 23:02:24.601282] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:39.884 [2024-07-15 23:02:24.601295] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:39.884 00:33:39.884 real 0m0.377s 00:33:39.884 user 0m0.231s 00:33:39.884 sys 0m0.142s 00:33:39.884 23:02:24 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:33:39.884 23:02:24 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:39.884 23:02:24 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:33:39.884 ************************************ 00:33:39.884 END TEST bdev_json_nonenclosed 00:33:39.884 ************************************ 00:33:39.884 23:02:24 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:33:39.884 23:02:24 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # true 00:33:39.884 23:02:24 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:39.884 23:02:24 blockdev_crypto_aesni -- 
common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:39.884 23:02:24 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:39.884 23:02:24 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:40.142 ************************************ 00:33:40.142 START TEST bdev_json_nonarray 00:33:40.142 ************************************ 00:33:40.142 23:02:24 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:40.142 [2024-07-15 23:02:24.858152] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:33:40.142 [2024-07-15 23:02:24.858220] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2893521 ] 00:33:40.142 [2024-07-15 23:02:24.990701] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:40.400 [2024-07-15 23:02:25.095215] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:40.400 [2024-07-15 23:02:25.095299] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:33:40.400 [2024-07-15 23:02:25.095320] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:40.400 [2024-07-15 23:02:25.095334] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:40.400 00:33:40.400 real 0m0.411s 00:33:40.400 user 0m0.239s 00:33:40.400 sys 0m0.168s 00:33:40.400 23:02:25 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:33:40.400 23:02:25 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:40.400 23:02:25 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:33:40.400 ************************************ 00:33:40.400 END TEST bdev_json_nonarray 00:33:40.400 ************************************ 00:33:40.400 23:02:25 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:33:40.400 23:02:25 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # true 00:33:40.400 23:02:25 blockdev_crypto_aesni -- bdev/blockdev.sh@787 -- # [[ crypto_aesni == bdev ]] 00:33:40.400 23:02:25 blockdev_crypto_aesni -- bdev/blockdev.sh@794 -- # [[ crypto_aesni == gpt ]] 00:33:40.400 23:02:25 blockdev_crypto_aesni -- bdev/blockdev.sh@798 -- # [[ crypto_aesni == crypto_sw ]] 00:33:40.400 23:02:25 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:33:40.400 23:02:25 blockdev_crypto_aesni -- bdev/blockdev.sh@811 -- # cleanup 00:33:40.400 23:02:25 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:33:40.400 23:02:25 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:40.400 23:02:25 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:33:40.400 23:02:25 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:33:40.400 23:02:25 blockdev_crypto_aesni -- 
bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:33:40.400 23:02:25 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:33:40.400 00:33:40.400 real 1m13.330s 00:33:40.400 user 2m42.602s 00:33:40.400 sys 0m9.281s 00:33:40.400 23:02:25 blockdev_crypto_aesni -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:40.400 23:02:25 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:40.400 ************************************ 00:33:40.400 END TEST blockdev_crypto_aesni 00:33:40.401 ************************************ 00:33:40.401 23:02:25 -- common/autotest_common.sh@1142 -- # return 0 00:33:40.401 23:02:25 -- spdk/autotest.sh@358 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:33:40.658 23:02:25 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:40.658 23:02:25 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:40.658 23:02:25 -- common/autotest_common.sh@10 -- # set +x 00:33:40.658 ************************************ 00:33:40.658 START TEST blockdev_crypto_sw 00:33:40.658 ************************************ 00:33:40.658 23:02:25 blockdev_crypto_sw -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:33:40.658 * Looking for test storage... 
00:33:40.658 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:40.658 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:33:40.658 23:02:25 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:33:40.658 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:33:40.658 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:40.658 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:33:40.658 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:33:40.658 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:33:40.659 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:33:40.659 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:33:40.659 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:33:40.659 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:33:40.659 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:33:40.659 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # uname -s 00:33:40.659 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:33:40.659 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:33:40.659 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # test_type=crypto_sw 00:33:40.659 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # crypto_device= 00:33:40.659 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # dek= 00:33:40.659 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # env_ctx= 00:33:40.659 
23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:33:40.659 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:33:40.659 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == bdev ]] 00:33:40.659 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == crypto_* ]] 00:33:40.659 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:33:40.659 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:33:40.659 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2893594 00:33:40.659 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:33:40.659 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:33:40.659 23:02:25 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 2893594 00:33:40.659 23:02:25 blockdev_crypto_sw -- common/autotest_common.sh@829 -- # '[' -z 2893594 ']' 00:33:40.659 23:02:25 blockdev_crypto_sw -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:40.659 23:02:25 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:40.659 23:02:25 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:40.659 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:40.659 23:02:25 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:40.659 23:02:25 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:40.659 [2024-07-15 23:02:25.546607] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:33:40.659 [2024-07-15 23:02:25.546684] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2893594 ] 00:33:40.916 [2024-07-15 23:02:25.678098] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:40.916 [2024-07-15 23:02:25.787087] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:41.848 23:02:26 blockdev_crypto_sw -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:41.848 23:02:26 blockdev_crypto_sw -- common/autotest_common.sh@862 -- # return 0 00:33:41.848 23:02:26 blockdev_crypto_sw -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:33:41.848 23:02:26 blockdev_crypto_sw -- bdev/blockdev.sh@711 -- # setup_crypto_sw_conf 00:33:41.848 23:02:26 blockdev_crypto_sw -- bdev/blockdev.sh@193 -- # rpc_cmd 00:33:41.848 23:02:26 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:41.848 23:02:26 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:41.848 Malloc0 00:33:41.848 Malloc1 00:33:41.848 true 00:33:41.848 true 00:33:42.106 true 00:33:42.106 [2024-07-15 23:02:26.760094] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:42.106 crypto_ram 00:33:42.106 [2024-07-15 23:02:26.768141] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:42.106 crypto_ram2 00:33:42.106 [2024-07-15 23:02:26.776151] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:42.106 crypto_ram3 00:33:42.106 [ 00:33:42.106 { 00:33:42.106 "name": "Malloc1", 00:33:42.106 "aliases": [ 00:33:42.106 "10bc032a-debd-4a09-8400-72b16c340025" 00:33:42.106 ], 00:33:42.106 "product_name": "Malloc disk", 00:33:42.106 "block_size": 4096, 00:33:42.106 "num_blocks": 4096, 00:33:42.106 "uuid": "10bc032a-debd-4a09-8400-72b16c340025", 
00:33:42.106 "assigned_rate_limits": { 00:33:42.106 "rw_ios_per_sec": 0, 00:33:42.106 "rw_mbytes_per_sec": 0, 00:33:42.106 "r_mbytes_per_sec": 0, 00:33:42.106 "w_mbytes_per_sec": 0 00:33:42.106 }, 00:33:42.106 "claimed": true, 00:33:42.106 "claim_type": "exclusive_write", 00:33:42.106 "zoned": false, 00:33:42.106 "supported_io_types": { 00:33:42.106 "read": true, 00:33:42.106 "write": true, 00:33:42.106 "unmap": true, 00:33:42.106 "flush": true, 00:33:42.106 "reset": true, 00:33:42.106 "nvme_admin": false, 00:33:42.106 "nvme_io": false, 00:33:42.106 "nvme_io_md": false, 00:33:42.106 "write_zeroes": true, 00:33:42.106 "zcopy": true, 00:33:42.106 "get_zone_info": false, 00:33:42.106 "zone_management": false, 00:33:42.106 "zone_append": false, 00:33:42.107 "compare": false, 00:33:42.107 "compare_and_write": false, 00:33:42.107 "abort": true, 00:33:42.107 "seek_hole": false, 00:33:42.107 "seek_data": false, 00:33:42.107 "copy": true, 00:33:42.107 "nvme_iov_md": false 00:33:42.107 }, 00:33:42.107 "memory_domains": [ 00:33:42.107 { 00:33:42.107 "dma_device_id": "system", 00:33:42.107 "dma_device_type": 1 00:33:42.107 }, 00:33:42.107 { 00:33:42.107 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:42.107 "dma_device_type": 2 00:33:42.107 } 00:33:42.107 ], 00:33:42.107 "driver_specific": {} 00:33:42.107 } 00:33:42.107 ] 00:33:42.107 23:02:26 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:42.107 23:02:26 blockdev_crypto_sw -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:33:42.107 23:02:26 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:42.107 23:02:26 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:42.107 23:02:26 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:42.107 23:02:26 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # cat 00:33:42.107 23:02:26 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:33:42.107 23:02:26 
blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:42.107 23:02:26 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:42.107 23:02:26 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:42.107 23:02:26 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:33:42.107 23:02:26 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:42.107 23:02:26 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:42.107 23:02:26 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:42.107 23:02:26 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:33:42.107 23:02:26 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:42.107 23:02:26 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:42.107 23:02:26 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:42.107 23:02:26 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:33:42.107 23:02:26 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:33:42.107 23:02:26 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:42.107 23:02:26 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:33:42.107 23:02:26 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:42.107 23:02:26 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:42.107 23:02:26 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:33:42.107 23:02:26 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # jq -r .name 00:33:42.107 23:02:26 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "af376f00-a67c-540a-a6d7-5ba2ca9c7d18"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' 
"uuid": "af376f00-a67c-540a-a6d7-5ba2ca9c7d18",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "197ce0fc-67b8-5690-a201-fd174c1c15f2"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "197ce0fc-67b8-5690-a201-fd174c1c15f2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' 
"dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:33:42.107 23:02:27 blockdev_crypto_sw -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:33:42.107 23:02:27 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:33:42.107 23:02:27 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:33:42.107 23:02:27 blockdev_crypto_sw -- bdev/blockdev.sh@754 -- # killprocess 2893594 00:33:42.107 23:02:27 blockdev_crypto_sw -- common/autotest_common.sh@948 -- # '[' -z 2893594 ']' 00:33:42.107 23:02:27 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # kill -0 2893594 00:33:42.366 23:02:27 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # uname 00:33:42.366 23:02:27 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:42.366 23:02:27 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2893594 00:33:42.366 23:02:27 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:42.366 23:02:27 blockdev_crypto_sw -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:42.366 23:02:27 blockdev_crypto_sw -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2893594' 00:33:42.366 killing process with pid 2893594 00:33:42.366 23:02:27 blockdev_crypto_sw -- common/autotest_common.sh@967 -- # kill 2893594 00:33:42.366 23:02:27 blockdev_crypto_sw -- common/autotest_common.sh@972 -- # wait 2893594 00:33:42.625 23:02:27 blockdev_crypto_sw -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:33:42.625 
23:02:27 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:42.625 23:02:27 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:33:42.625 23:02:27 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:42.625 23:02:27 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:42.625 ************************************ 00:33:42.625 START TEST bdev_hello_world 00:33:42.625 ************************************ 00:33:42.625 23:02:27 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:42.884 [2024-07-15 23:02:27.554591] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:33:42.884 [2024-07-15 23:02:27.554656] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2893957 ] 00:33:42.884 [2024-07-15 23:02:27.686610] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:42.884 [2024-07-15 23:02:27.791352] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:43.143 [2024-07-15 23:02:27.962636] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:43.143 [2024-07-15 23:02:27.962705] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:43.143 [2024-07-15 23:02:27.962720] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:43.143 [2024-07-15 23:02:27.970654] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:43.143 [2024-07-15 23:02:27.970673] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:43.143 [2024-07-15 23:02:27.970685] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:43.143 [2024-07-15 23:02:27.978677] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:43.143 [2024-07-15 23:02:27.978696] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:43.143 [2024-07-15 23:02:27.978708] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:43.143 [2024-07-15 23:02:28.020527] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:33:43.143 [2024-07-15 23:02:28.020571] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:33:43.143 [2024-07-15 23:02:28.020589] hello_bdev.c: 
244:hello_start: *NOTICE*: Opening io channel 00:33:43.143 [2024-07-15 23:02:28.022652] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:33:43.143 [2024-07-15 23:02:28.022731] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:33:43.143 [2024-07-15 23:02:28.022747] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:33:43.143 [2024-07-15 23:02:28.022780] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:33:43.143 00:33:43.143 [2024-07-15 23:02:28.022798] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:33:43.402 00:33:43.402 real 0m0.761s 00:33:43.402 user 0m0.514s 00:33:43.402 sys 0m0.235s 00:33:43.402 23:02:28 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:43.402 23:02:28 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:33:43.402 ************************************ 00:33:43.402 END TEST bdev_hello_world 00:33:43.402 ************************************ 00:33:43.402 23:02:28 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:43.402 23:02:28 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:33:43.402 23:02:28 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:43.402 23:02:28 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:43.402 23:02:28 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:43.661 ************************************ 00:33:43.661 START TEST bdev_bounds 00:33:43.661 ************************************ 00:33:43.661 23:02:28 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:33:43.661 23:02:28 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2893984 00:33:43.661 23:02:28 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT 
SIGTERM EXIT 00:33:43.661 23:02:28 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:43.661 23:02:28 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2893984' 00:33:43.661 Process bdevio pid: 2893984 00:33:43.661 23:02:28 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 2893984 00:33:43.661 23:02:28 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2893984 ']' 00:33:43.661 23:02:28 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:43.661 23:02:28 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:43.661 23:02:28 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:43.661 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:43.661 23:02:28 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:43.661 23:02:28 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:43.661 [2024-07-15 23:02:28.400607] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:33:43.661 [2024-07-15 23:02:28.400682] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2893984 ] 00:33:43.661 [2024-07-15 23:02:28.532736] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:33:43.919 [2024-07-15 23:02:28.641121] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:43.919 [2024-07-15 23:02:28.641221] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:33:43.919 [2024-07-15 23:02:28.641222] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:43.919 [2024-07-15 23:02:28.811353] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:43.919 [2024-07-15 23:02:28.811429] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:43.919 [2024-07-15 23:02:28.811445] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:43.919 [2024-07-15 23:02:28.819377] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:43.919 [2024-07-15 23:02:28.819397] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:43.919 [2024-07-15 23:02:28.819409] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:44.177 [2024-07-15 23:02:28.827399] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:44.177 [2024-07-15 23:02:28.827416] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:44.177 [2024-07-15 23:02:28.827428] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:44.435 23:02:29 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@858 
-- # (( i == 0 )) 00:33:44.435 23:02:29 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:33:44.435 23:02:29 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:33:44.693 I/O targets: 00:33:44.693 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:33:44.693 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:33:44.693 00:33:44.693 00:33:44.693 CUnit - A unit testing framework for C - Version 2.1-3 00:33:44.693 http://cunit.sourceforge.net/ 00:33:44.693 00:33:44.693 00:33:44.693 Suite: bdevio tests on: crypto_ram3 00:33:44.693 Test: blockdev write read block ...passed 00:33:44.693 Test: blockdev write zeroes read block ...passed 00:33:44.693 Test: blockdev write zeroes read no split ...passed 00:33:44.693 Test: blockdev write zeroes read split ...passed 00:33:44.693 Test: blockdev write zeroes read split partial ...passed 00:33:44.693 Test: blockdev reset ...passed 00:33:44.693 Test: blockdev write read 8 blocks ...passed 00:33:44.693 Test: blockdev write read size > 128k ...passed 00:33:44.693 Test: blockdev write read invalid size ...passed 00:33:44.693 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:44.693 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:44.693 Test: blockdev write read max offset ...passed 00:33:44.693 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:44.693 Test: blockdev writev readv 8 blocks ...passed 00:33:44.693 Test: blockdev writev readv 30 x 1block ...passed 00:33:44.693 Test: blockdev writev readv block ...passed 00:33:44.693 Test: blockdev writev readv size > 128k ...passed 00:33:44.693 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:44.693 Test: blockdev comparev and writev ...passed 00:33:44.693 Test: blockdev nvme passthru rw ...passed 00:33:44.693 Test: blockdev nvme passthru vendor specific 
...passed 00:33:44.693 Test: blockdev nvme admin passthru ...passed 00:33:44.693 Test: blockdev copy ...passed 00:33:44.693 Suite: bdevio tests on: crypto_ram 00:33:44.693 Test: blockdev write read block ...passed 00:33:44.693 Test: blockdev write zeroes read block ...passed 00:33:44.693 Test: blockdev write zeroes read no split ...passed 00:33:44.693 Test: blockdev write zeroes read split ...passed 00:33:44.693 Test: blockdev write zeroes read split partial ...passed 00:33:44.693 Test: blockdev reset ...passed 00:33:44.693 Test: blockdev write read 8 blocks ...passed 00:33:44.693 Test: blockdev write read size > 128k ...passed 00:33:44.693 Test: blockdev write read invalid size ...passed 00:33:44.693 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:44.693 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:44.693 Test: blockdev write read max offset ...passed 00:33:44.693 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:44.693 Test: blockdev writev readv 8 blocks ...passed 00:33:44.693 Test: blockdev writev readv 30 x 1block ...passed 00:33:44.693 Test: blockdev writev readv block ...passed 00:33:44.693 Test: blockdev writev readv size > 128k ...passed 00:33:44.693 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:44.693 Test: blockdev comparev and writev ...passed 00:33:44.693 Test: blockdev nvme passthru rw ...passed 00:33:44.693 Test: blockdev nvme passthru vendor specific ...passed 00:33:44.693 Test: blockdev nvme admin passthru ...passed 00:33:44.693 Test: blockdev copy ...passed 00:33:44.693 00:33:44.693 Run Summary: Type Total Ran Passed Failed Inactive 00:33:44.693 suites 2 2 n/a 0 0 00:33:44.693 tests 46 46 46 0 0 00:33:44.693 asserts 260 260 260 0 n/a 00:33:44.693 00:33:44.693 Elapsed time = 0.197 seconds 00:33:44.693 0 00:33:44.693 23:02:29 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2893984 00:33:44.693 23:02:29 
blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2893984 ']' 00:33:44.694 23:02:29 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2893984 00:33:44.694 23:02:29 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:33:44.694 23:02:29 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:44.694 23:02:29 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2893984 00:33:44.694 23:02:29 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:44.694 23:02:29 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:44.694 23:02:29 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2893984' 00:33:44.694 killing process with pid 2893984 00:33:44.694 23:02:29 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2893984 00:33:44.694 23:02:29 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@972 -- # wait 2893984 00:33:44.952 23:02:29 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:33:44.952 00:33:44.952 real 0m1.460s 00:33:44.952 user 0m3.642s 00:33:44.952 sys 0m0.400s 00:33:44.952 23:02:29 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:44.952 23:02:29 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:44.952 ************************************ 00:33:44.952 END TEST bdev_bounds 00:33:44.952 ************************************ 00:33:44.952 23:02:29 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:44.952 23:02:29 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:33:44.952 23:02:29 
blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:33:44.952 23:02:29 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:44.952 23:02:29 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:45.211 ************************************ 00:33:45.211 START TEST bdev_nbd 00:33:45.211 ************************************ 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=2 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=2 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- 
bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2894249 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 2894249 /var/tmp/spdk-nbd.sock 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2894249 ']' 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:33:45.211 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:45.211 23:02:29 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:45.211 [2024-07-15 23:02:29.958540] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:33:45.211 [2024-07-15 23:02:29.958609] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:33:45.211 [2024-07-15 23:02:30.081904] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:45.469 [2024-07-15 23:02:30.189175] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:45.469 [2024-07-15 23:02:30.366254] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:45.469 [2024-07-15 23:02:30.366317] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:45.469 [2024-07-15 23:02:30.366331] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:45.469 [2024-07-15 23:02:30.374273] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:45.469 [2024-07-15 23:02:30.374292] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:45.469 [2024-07-15 23:02:30.374303] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:45.727 [2024-07-15 23:02:30.382294] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:45.727 [2024-07-15 23:02:30.382312] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:45.727 [2024-07-15 23:02:30.382324] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:45.986 23:02:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:45.986 23:02:30 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:33:45.986 23:02:30 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify 
/var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:33:45.986 23:02:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:45.986 23:02:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:45.986 23:02:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:33:45.986 23:02:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:33:45.986 23:02:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:45.986 23:02:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:45.986 23:02:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:33:45.986 23:02:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:33:45.986 23:02:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:33:45.986 23:02:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:33:45.986 23:02:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:33:45.986 23:02:30 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:33:46.245 23:02:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:33:46.245 23:02:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:33:46.245 23:02:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:33:46.245 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:33:46.245 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:46.245 23:02:31 blockdev_crypto_sw.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:46.245 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:46.245 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:33:46.245 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:46.245 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:46.245 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:46.245 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:46.245 1+0 records in 00:33:46.245 1+0 records out 00:33:46.245 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000248678 s, 16.5 MB/s 00:33:46.245 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:46.245 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:46.245 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:46.245 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:46.245 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:46.245 23:02:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:46.245 23:02:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:33:46.246 23:02:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:33:46.813 23:02:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # 
nbd_device=/dev/nbd1 00:33:46.813 23:02:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:33:46.813 23:02:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:33:46.813 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:33:46.813 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:46.813 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:46.813 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:46.813 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:33:46.813 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:46.814 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:46.814 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:46.814 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:46.814 1+0 records in 00:33:46.814 1+0 records out 00:33:46.814 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000342643 s, 12.0 MB/s 00:33:46.814 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:46.814 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:46.814 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:46.814 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:46.814 23:02:31 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 
0 00:33:46.814 23:02:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:33:46.814 23:02:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:33:46.814 23:02:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:46.814 23:02:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:33:46.814 { 00:33:46.814 "nbd_device": "/dev/nbd0", 00:33:46.814 "bdev_name": "crypto_ram" 00:33:46.814 }, 00:33:46.814 { 00:33:46.814 "nbd_device": "/dev/nbd1", 00:33:46.814 "bdev_name": "crypto_ram3" 00:33:46.814 } 00:33:46.814 ]' 00:33:46.814 23:02:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:33:46.814 23:02:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:33:46.814 { 00:33:46.814 "nbd_device": "/dev/nbd0", 00:33:46.814 "bdev_name": "crypto_ram" 00:33:46.814 }, 00:33:46.814 { 00:33:46.814 "nbd_device": "/dev/nbd1", 00:33:46.814 "bdev_name": "crypto_ram3" 00:33:46.814 } 00:33:46.814 ]' 00:33:46.814 23:02:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:33:47.073 23:02:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:33:47.073 23:02:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:47.073 23:02:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:47.073 23:02:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:47.073 23:02:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:47.073 23:02:31 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:47.073 23:02:31 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:47.332 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:47.332 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:47.332 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:47.332 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:47.332 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:47.332 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:47.332 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:47.332 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:47.332 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:47.332 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:47.590 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:47.590 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:47.590 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:47.590 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:47.590 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:47.590 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:47.590 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:47.590 23:02:32 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@45 -- # return 0 00:33:47.590 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:47.590 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:47.590 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:47.847 23:02:32 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:33:47.847 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:33:48.105 /dev/nbd0 00:33:48.105 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:33:48.105 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:33:48.105 23:02:32 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:33:48.105 23:02:32 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:48.105 23:02:32 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 
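After each `nbd_start_disk` RPC, the trace shows `waitfornbd` looping with `(( i <= 20 ))` and `grep -q -w nbd0 /proc/partitions` until the kernel registers the device. A sketch of that polling loop follows; the partitions file is a parameter here only so the sketch can run without a live nbd device (an assumption for illustration — the real helper always reads `/proc/partitions`):

```shell
# Wait for an nbd device to appear in the kernel's partition table.
# Sketch of the waitfornbd polling seen in the trace above.
waitfornbd() {
    local nbd_name=$1 partitions=${2:-/proc/partitions} i
    for ((i = 1; i <= 20; i++)); do
        # the device name shows up as a whole word once the kernel registers it
        grep -q -w "$nbd_name" "$partitions" && return 0
        sleep 0.1
    done
    return 1   # device never appeared within ~2 seconds
}
```

The `-w` flag matters: without it, waiting for `nbd1` would match `nbd10` through `nbd15` as well.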
00:33:48.105 23:02:32 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:48.105 23:02:32 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:33:48.105 23:02:32 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:48.105 23:02:32 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:48.105 23:02:32 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:48.105 23:02:32 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:48.105 1+0 records in 00:33:48.105 1+0 records out 00:33:48.105 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256133 s, 16.0 MB/s 00:33:48.105 23:02:32 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:48.105 23:02:32 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:48.105 23:02:32 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:48.105 23:02:32 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:48.105 23:02:32 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:48.105 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:48.105 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:33:48.105 23:02:32 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:33:48.362 /dev/nbd1 00:33:48.362 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 
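Once the device node exists, the trace shows a read-back check: `dd` copies one 4096-byte block out with `iflag=direct`, then `stat -c %s` confirms the copy is non-empty before the temp file is removed. A sketch of that check, assuming `/dev/zero` stands in for the nbd device so it runs anywhere (the direct-I/O flag is dropped for the same reason):

```shell
# Read one block from a device and verify something actually came back.
# Sketch of the dd + stat read-back check traced above.
readback_nonempty() {
    local src=$1 tmp=/tmp/gr_nbdtest size
    dd if="$src" of="$tmp" bs=4096 count=1 2>/dev/null
    size=$(stat -c %s "$tmp")   # bytes actually copied out
    rm -f "$tmp"
    [ "$size" != 0 ]            # succeed only if the device answered the read
}
```

In the real suite the direct-I/O read doubles as proof that the nbd device services block requests, not just that its node exists.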
00:33:48.362 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:33:48.362 23:02:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:33:48.362 23:02:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:33:48.362 23:02:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:33:48.362 23:02:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:33:48.362 23:02:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:33:48.362 23:02:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:33:48.362 23:02:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:33:48.362 23:02:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:33:48.362 23:02:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:33:48.362 1+0 records in 00:33:48.362 1+0 records out 00:33:48.362 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00032479 s, 12.6 MB/s 00:33:48.362 23:02:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:48.362 23:02:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:33:48.363 23:02:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:33:48.363 23:02:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:33:48.363 23:02:33 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:33:48.363 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:33:48.363 23:02:33 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:33:48.363 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:48.363 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:48.363 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:48.619 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:33:48.619 { 00:33:48.619 "nbd_device": "/dev/nbd0", 00:33:48.620 "bdev_name": "crypto_ram" 00:33:48.620 }, 00:33:48.620 { 00:33:48.620 "nbd_device": "/dev/nbd1", 00:33:48.620 "bdev_name": "crypto_ram3" 00:33:48.620 } 00:33:48.620 ]' 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:33:48.620 { 00:33:48.620 "nbd_device": "/dev/nbd0", 00:33:48.620 "bdev_name": "crypto_ram" 00:33:48.620 }, 00:33:48.620 { 00:33:48.620 "nbd_device": "/dev/nbd1", 00:33:48.620 "bdev_name": "crypto_ram3" 00:33:48.620 } 00:33:48.620 ]' 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:33:48.620 /dev/nbd1' 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:33:48.620 /dev/nbd1' 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:33:48.620 23:02:33 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:33:48.620 256+0 records in 00:33:48.620 256+0 records out 00:33:48.620 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104719 s, 100 MB/s 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:33:48.620 256+0 records in 00:33:48.620 256+0 records out 00:33:48.620 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0212183 s, 49.4 MB/s 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:33:48.620 256+0 records in 00:33:48.620 256+0 records out 00:33:48.620 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.046243 s, 22.7 MB/s 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify 
'/dev/nbd0 /dev/nbd1' verify 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:48.620 
23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:48.620 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:48.877 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:48.877 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:48.877 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:48.877 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:48.877 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:48.877 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:48.877 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:48.877 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:48.877 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:48.877 23:02:33 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:33:49.135 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:33:49.135 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:33:49.135 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:33:49.135 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:49.135 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:49.135 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:33:49.135 23:02:34 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:49.135 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:49.135 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:33:49.135 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:49.135 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:33:49.392 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:33:49.392 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:33:49.392 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:33:49.392 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:33:49.392 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:33:49.392 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:33:49.392 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:33:49.392 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:33:49.392 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:33:49.392 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:33:49.392 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:33:49.392 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:33:49.392 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:33:49.392 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:49.392 23:02:34 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:33:49.392 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:33:49.392 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:33:49.392 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:33:49.650 malloc_lvol_verify 00:33:49.650 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:33:49.909 36a539d8-00a2-4a27-91fa-da4ae60af6b1 00:33:49.909 23:02:34 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:33:50.475 fd6bea4a-3e5b-478a-85e2-9937a2bef07b 00:33:50.475 23:02:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:33:50.733 /dev/nbd0 00:33:50.733 23:02:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:33:50.733 mke2fs 1.46.5 (30-Dec-2021) 00:33:50.733 Discarding device blocks: 0/4096 done 00:33:50.733 Creating filesystem with 4096 1k blocks and 1024 inodes 00:33:50.733 00:33:50.733 Allocating group tables: 0/1 done 00:33:50.733 Writing inode tables: 0/1 done 00:33:50.733 Creating journal (1024 blocks): done 00:33:50.733 Writing superblocks and filesystem accounting information: 0/1 done 00:33:50.733 00:33:50.733 23:02:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:33:50.733 23:02:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks 
/var/tmp/spdk-nbd.sock /dev/nbd0 00:33:50.733 23:02:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:50.733 23:02:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:33:50.733 23:02:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:50.733 23:02:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:50.733 23:02:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:50.733 23:02:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:51.031 23:02:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:51.031 23:02:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:51.031 23:02:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:51.031 23:02:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:51.031 23:02:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:51.031 23:02:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:51.031 23:02:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:51.031 23:02:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:51.031 23:02:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:33:51.031 23:02:35 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:33:51.031 23:02:35 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2894249 00:33:51.031 23:02:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2894249 ']' 00:33:51.031 23:02:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # kill 
-0 2894249 00:33:51.031 23:02:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:33:51.031 23:02:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:51.031 23:02:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2894249 00:33:51.031 23:02:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:51.031 23:02:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:51.031 23:02:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2894249' 00:33:51.031 killing process with pid 2894249 00:33:51.031 23:02:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2894249 00:33:51.031 23:02:35 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@972 -- # wait 2894249 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:33:51.298 00:33:51.298 real 0m6.191s 00:33:51.298 user 0m8.924s 00:33:51.298 sys 0m2.472s 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:51.298 ************************************ 00:33:51.298 END TEST bdev_nbd 00:33:51.298 ************************************ 00:33:51.298 23:02:36 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:51.298 23:02:36 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:33:51.298 23:02:36 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = nvme ']' 00:33:51.298 23:02:36 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = gpt ']' 00:33:51.298 23:02:36 blockdev_crypto_sw -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:33:51.298 23:02:36 blockdev_crypto_sw -- 
common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:51.298 23:02:36 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:51.298 23:02:36 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:51.298 ************************************ 00:33:51.298 START TEST bdev_fio 00:33:51.298 ************************************ 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:51.298 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_fio -- 
common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:33:51.298 23:02:36 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 
00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:51.557 ************************************ 00:33:51.557 START TEST bdev_fio_rw_verify 00:33:51.557 ************************************ 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:51.557 23:02:36 
blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev
00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}'
00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan
00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib=
00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]]
00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev'
00:33:51.557 23:02:36 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:33:51.816 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:33:51.816 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:33:51.816 fio-3.35
00:33:51.816 Starting 2 threads
00:34:04.017
00:34:04.018 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=2895433: Mon Jul 15 23:02:47 2024
00:34:04.018 read: IOPS=21.7k, BW=84.9MiB/s (89.0MB/s)(849MiB/10001msec)
00:34:04.018 slat (usec): min=14, max=117, avg=20.05, stdev= 3.50
00:34:04.018 clat (usec): min=7, max=434, avg=146.62, stdev=58.10
00:34:04.018 lat (usec): min=25, max=477, avg=166.67, stdev=59.47
00:34:04.018 clat percentiles (usec):
00:34:04.018 | 50.000th=[ 143], 99.000th=[ 281], 99.900th=[ 302], 99.990th=[ 343],
00:34:04.018 | 99.999th=[ 383]
00:34:04.018 write: IOPS=26.1k, BW=102MiB/s (107MB/s)(968MiB/9480msec); 0 zone resets
00:34:04.018 slat (usec): min=14, max=404, avg=33.92, stdev= 4.36
00:34:04.018 clat (usec): min=26, max=899, avg=197.01, stdev=89.62
00:34:04.018 lat (usec): min=53, max=983, avg=230.93, stdev=91.22
00:34:04.018 clat percentiles (usec):
00:34:04.018 | 50.000th=[ 192], 99.000th=[ 388], 99.900th=[ 412], 99.990th=[ 685],
00:34:04.018 | 99.999th=[ 881]
00:34:04.018 bw ( KiB/s): min=92472, max=105584, per=94.83%, avg=99103.58, stdev=1839.40, samples=38
00:34:04.018 iops : min=23118, max=26396, avg=24775.89, stdev=459.85, samples=38
00:34:04.018 lat (usec) : 10=0.01%, 20=0.01%, 50=4.23%, 100=14.89%, 250=63.22%
00:34:04.018 lat (usec) : 500=17.64%, 750=0.02%, 1000=0.01%
00:34:04.018 cpu : usr=99.59%, sys=0.01%, ctx=32, majf=0, minf=461
00:34:04.018 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0%
00:34:04.018 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0%
00:34:04.018 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0%
00:34:04.018 issued rwts: total=217338,247689,0,0 short=0,0,0,0 dropped=0,0,0,0
00:34:04.018 latency : target=0, window=0, percentile=100.00%, depth=8
00:34:04.018
00:34:04.018 Run status group 0 (all jobs):
00:34:04.018 READ: bw=84.9MiB/s (89.0MB/s), 84.9MiB/s-84.9MiB/s (89.0MB/s-89.0MB/s), io=849MiB (890MB), run=10001-10001msec
00:34:04.018 WRITE: bw=102MiB/s (107MB/s), 102MiB/s-102MiB/s (107MB/s-107MB/s), io=968MiB (1015MB), run=9480-9480msec
00:34:04.018
00:34:04.018 real 0m11.106s
00:34:04.018 user 0m23.649s
00:34:04.018 sys 0m0.364s
00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set
+x 00:34:04.018 ************************************ 00:34:04.018 END TEST bdev_fio_rw_verify 00:34:04.018 ************************************ 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == 
verify ']' 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "af376f00-a67c-540a-a6d7-5ba2ca9c7d18"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "af376f00-a67c-540a-a6d7-5ba2ca9c7d18",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "197ce0fc-67b8-5690-a201-fd174c1c15f2"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "197ce0fc-67b8-5690-a201-fd174c1c15f2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:34:04.018 crypto_ram3 ]] 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "af376f00-a67c-540a-a6d7-5ba2ca9c7d18"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "af376f00-a67c-540a-a6d7-5ba2ca9c7d18",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' 
"nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "197ce0fc-67b8-5690-a201-fd174c1c15f2"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "197ce0fc-67b8-5690-a201-fd174c1c15f2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": 
"crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:04.018 ************************************ 00:34:04.018 START TEST bdev_fio_trim 00:34:04.018 ************************************ 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:04.018 23:02:47 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:04.019 23:02:47 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:34:04.019 23:02:47 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:04.019 23:02:47 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:04.019 23:02:47 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:04.019 23:02:47 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:04.019 23:02:47 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:04.019 23:02:47 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:34:04.019 23:02:47 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:04.019 23:02:47 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:04.019 23:02:47 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:04.019 23:02:47 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:34:04.019 23:02:47 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:04.019 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:04.019 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:04.019 fio-3.35 00:34:04.019 
Starting 2 threads
00:34:13.997
00:34:13.997 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=2896968: Mon Jul 15 23:02:58 2024
00:34:13.997 write: IOPS=17.3k, BW=67.5MiB/s (70.7MB/s)(675MiB/10001msec); 0 zone resets
00:34:13.997 slat (usec): min=33, max=222, avg=50.71, stdev= 9.45
00:34:13.997 clat (usec): min=97, max=3256, avg=381.03, stdev=210.10
00:34:13.997 lat (usec): min=134, max=3297, avg=431.73, stdev=217.50
00:34:13.997 clat percentiles (usec):
00:34:13.997 | 50.000th=[ 306], 99.000th=[ 783], 99.900th=[ 824], 99.990th=[ 1106],
00:34:13.997 | 99.999th=[ 3195]
00:34:13.997 bw ( KiB/s): min=68736, max=69728, per=100.00%, avg=69128.00, stdev=127.15, samples=38
00:34:13.997 iops : min=17184, max=17432, avg=17282.00, stdev=31.79, samples=38
00:34:13.997 trim: IOPS=17.3k, BW=67.5MiB/s (70.7MB/s)(675MiB/10001msec); 0 zone resets
00:34:13.997 slat (usec): min=13, max=2889, avg=22.40, stdev= 8.51
00:34:13.997 clat (usec): min=104, max=3298, avg=254.45, stdev=75.75
00:34:13.997 lat (usec): min=122, max=3318, avg=276.85, stdev=76.28
00:34:13.997 clat percentiles (usec):
00:34:13.997 | 50.000th=[ 255], 99.000th=[ 408], 99.900th=[ 433], 99.990th=[ 570],
00:34:13.997 | 99.999th=[ 3228]
00:34:13.997 bw ( KiB/s): min=68736, max=69728, per=100.00%, avg=69129.68, stdev=127.48, samples=38
00:34:13.997 iops : min=17184, max=17432, avg=17282.42, stdev=31.87, samples=38
00:34:13.997 lat (usec) : 100=0.01%, 250=42.68%, 500=38.46%, 750=17.69%, 1000=1.16%
00:34:13.997 lat (msec) : 2=0.01%, 4=0.01%
00:34:13.997 cpu : usr=99.31%, sys=0.01%, ctx=17, majf=0, minf=247
00:34:13.997 IO depths : 1=7.5%, 2=17.4%, 4=60.1%, 8=15.0%, 16=0.0%, 32=0.0%, >=64=0.0%
00:34:13.997 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0%
00:34:13.997 complete : 0=0.0%, 4=86.9%, 8=13.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0%
00:34:13.997 issued rwts: total=0,172715,172716,0 short=0,0,0,0 dropped=0,0,0,0
00:34:13.997 latency : target=0, window=0, percentile=100.00%, depth=8
00:34:13.997
00:34:13.997 Run status group 0 (all jobs): 00:34:13.997 WRITE: bw=67.5MiB/s (70.7MB/s), 67.5MiB/s-67.5MiB/s (70.7MB/s-70.7MB/s), io=675MiB (707MB), run=10001-10001msec 00:34:13.997 TRIM: bw=67.5MiB/s (70.7MB/s), 67.5MiB/s-67.5MiB/s (70.7MB/s-70.7MB/s), io=675MiB (707MB), run=10001-10001msec 00:34:13.997 00:34:13.997 real 0m11.206s 00:34:13.997 user 0m23.869s 00:34:13.997 sys 0m0.409s 00:34:13.997 23:02:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:13.997 23:02:58 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:34:13.997 ************************************ 00:34:13.997 END TEST bdev_fio_trim 00:34:13.997 ************************************ 00:34:13.997 23:02:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:34:13.997 23:02:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:34:13.997 23:02:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:13.997 23:02:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:34:13.997 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:13.997 23:02:58 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:34:13.997 00:34:13.997 real 0m22.672s 00:34:13.997 user 0m47.722s 00:34:13.997 sys 0m0.951s 00:34:13.997 23:02:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:13.997 23:02:58 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:13.997 ************************************ 00:34:13.997 END TEST bdev_fio 00:34:13.997 ************************************ 00:34:13.997 23:02:58 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:34:13.997 23:02:58 blockdev_crypto_sw -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:34:13.997 23:02:58 blockdev_crypto_sw -- 
bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:13.997 23:02:58 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:34:13.997 23:02:58 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:13.997 23:02:58 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:14.256 ************************************ 00:34:14.256 START TEST bdev_verify 00:34:14.256 ************************************ 00:34:14.256 23:02:58 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:14.256 [2024-07-15 23:02:58.983068] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:34:14.256 [2024-07-15 23:02:58.983136] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2898225 ]
00:34:14.256 [2024-07-15 23:02:59.112873] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:34:14.514 [2024-07-15 23:02:59.212916] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:34:14.514 [2024-07-15 23:02:59.212921] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:34:14.514 [2024-07-15 23:02:59.378037] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw"
00:34:14.514 [2024-07-15 23:02:59.378106] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:34:14.514 [2024-07-15 23:02:59.378122] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:14.514 [2024-07-15 23:02:59.386055] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2"
00:34:14.514 [2024-07-15 23:02:59.386076] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:34:14.514 [2024-07-15 23:02:59.386088] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:14.514 [2024-07-15 23:02:59.394079] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3"
00:34:14.514 [2024-07-15 23:02:59.394098] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2
00:34:14.514 [2024-07-15 23:02:59.394109] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:14.771 Running I/O for 5 seconds...
00:34:20.034
00:34:20.034 Latency(us)
00:34:20.034 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:34:20.034 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:34:20.034 Verification LBA range: start 0x0 length 0x800
00:34:20.034 crypto_ram : 5.01 5977.00 23.35 0.00 0.00 21336.04 1652.65 23820.91
00:34:20.034 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:34:20.034 Verification LBA range: start 0x800 length 0x800
00:34:20.034 crypto_ram : 5.03 4864.14 19.00 0.00 0.00 26206.91 2336.50 27696.08
00:34:20.034 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:34:20.034 Verification LBA range: start 0x0 length 0x800
00:34:20.034 crypto_ram3 : 5.03 3003.15 11.73 0.00 0.00 42394.83 2208.28 28721.86
00:34:20.034 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:34:20.034 Verification LBA range: start 0x800 length 0x800
00:34:20.034 crypto_ram3 : 5.03 2440.83 9.53 0.00 0.00 52125.89 2208.28 33052.94
00:34:20.034 ===================================================================================================================
00:34:20.034 Total : 16285.12 63.61 0.00 0.00 31306.46 1652.65 33052.94
00:34:20.034
00:34:20.034 real 0m5.788s
00:34:20.034 user 0m10.886s
00:34:20.034 sys 0m0.231s
00:34:20.034 23:03:04 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:20.034 23:03:04 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:34:20.034 ************************************
00:34:20.034 END TEST bdev_verify
00:34:20.034 ************************************
00:34:20.034 23:03:04 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0
00:34:20.035 23:03:04 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:34:20.035 23:03:04 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:34:20.035 23:03:04 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable
00:34:20.035 23:03:04 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:34:20.035 ************************************
00:34:20.035 START TEST bdev_verify_big_io
00:34:20.035 ************************************
00:34:20.035 23:03:04 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:34:20.035 [2024-07-15 23:03:04.858199] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
00:34:20.035 [2024-07-15 23:03:04.858267] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2899234 ]
00:34:20.292 [2024-07-15 23:03:04.986653] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:34:20.292 [2024-07-15 23:03:05.094398] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:34:20.292 [2024-07-15 23:03:05.094404] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:34:20.549 [2024-07-15 23:03:05.276593] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw"
00:34:20.549 [2024-07-15 23:03:05.276652] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:34:20.549 [2024-07-15 23:03:05.276667] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:20.549 [2024-07-15 23:03:05.284613] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2"
00:34:20.549 [2024-07-15 23:03:05.284634] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:34:20.549 [2024-07-15 23:03:05.284646] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:20.549 [2024-07-15 23:03:05.292637] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3"
00:34:20.549 [2024-07-15 23:03:05.292657] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2
00:34:20.549 [2024-07-15 23:03:05.292668] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:20.549 Running I/O for 5 seconds...
00:34:25.817
00:34:25.817 Latency(us)
00:34:25.817 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:34:25.817 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:25.817 Verification LBA range: start 0x0 length 0x80
00:34:25.817 crypto_ram : 5.15 472.31 29.52 0.00 0.00 264684.98 6240.17 377487.36
00:34:25.817 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:25.817 Verification LBA range: start 0x80 length 0x80
00:34:25.817 crypto_ram : 5.19 394.79 24.67 0.00 0.00 315560.63 7294.44 423077.62
00:34:25.817 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:34:25.817 Verification LBA range: start 0x0 length 0x80
00:34:25.817 crypto_ram3 : 5.29 266.03 16.63 0.00 0.00 455374.55 5698.78 392076.24
00:34:25.817 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:34:25.817 Verification LBA range: start 0x80 length 0x80
00:34:25.817 crypto_ram3 : 5.34 215.88 13.49 0.00 0.00 554125.91 7151.97 441313.73
00:34:25.817 ===================================================================================================================
00:34:25.817 Total : 1349.01 84.31 0.00 0.00 364986.14 5698.78 441313.73
00:34:26.075
00:34:26.075 real 0m6.149s
00:34:26.075 user 0m11.558s
00:34:26.075 sys 0m0.260s
00:34:26.075 23:03:10 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:26.075 23:03:10 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:34:26.075 ************************************
00:34:26.075 END TEST bdev_verify_big_io
00:34:26.075 ************************************
00:34:26.333 23:03:10 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0
00:34:26.333 23:03:10 blockdev_crypto_sw -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:26.333 23:03:10 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:34:26.333 23:03:10 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable
00:34:26.333 23:03:10 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:34:26.333 ************************************
00:34:26.333 START TEST bdev_write_zeroes
00:34:26.333 ************************************
00:34:26.333 23:03:11 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:26.334 [2024-07-15 23:03:11.093550] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
00:34:26.334 [2024-07-15 23:03:11.093617] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2900327 ]
00:34:26.334 [2024-07-15 23:03:11.221935] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:34:26.592 [2024-07-15 23:03:11.329574] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:34:26.897 [2024-07-15 23:03:11.508083] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw"
00:34:26.897 [2024-07-15 23:03:11.508163] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:34:26.897 [2024-07-15 23:03:11.508178] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:26.897 [2024-07-15 23:03:11.516099] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2"
00:34:26.897 [2024-07-15 23:03:11.516119] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:34:26.897 [2024-07-15 23:03:11.516131] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:26.897 [2024-07-15 23:03:11.524119] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3"
00:34:26.897 [2024-07-15 23:03:11.524137] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2
00:34:26.897 [2024-07-15 23:03:11.524149] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:26.897 Running I/O for 1 seconds...
00:34:27.868
00:34:27.868 Latency(us)
00:34:27.868 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:34:27.868 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:34:27.868 crypto_ram : 1.01 26292.18 102.70 0.00 0.00 4856.20 2122.80 6639.08
00:34:27.868 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:34:27.868 crypto_ram3 : 1.01 13175.70 51.47 0.00 0.00 9641.93 3376.53 9915.88
00:34:27.868 ===================================================================================================================
00:34:27.868 Total : 39467.88 154.17 0.00 0.00 6456.57 2122.80 9915.88
00:34:28.127
00:34:28.127 real 0m1.793s
00:34:28.127 user 0m1.526s
00:34:28.127 sys 0m0.246s
00:34:28.127 23:03:12 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:28.127 23:03:12 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:34:28.127 ************************************
00:34:28.127 END TEST bdev_write_zeroes
00:34:28.127 ************************************
00:34:28.127 23:03:12 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0
00:34:28.127 23:03:12 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:28.127 23:03:12 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:34:28.127 23:03:12 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable
00:34:28.127 23:03:12 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:34:28.127 ************************************
00:34:28.127 START TEST bdev_json_nonenclosed
00:34:28.127 ************************************
00:34:28.127 23:03:12 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:28.127 [2024-07-15 23:03:12.969779] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
00:34:28.127 [2024-07-15 23:03:12.969844] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2900528 ]
00:34:28.384 [2024-07-15 23:03:13.099950] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:34:28.384 [2024-07-15 23:03:13.214583] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:34:28.384 [2024-07-15 23:03:13.214662] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:34:28.384 [2024-07-15 23:03:13.214686] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:34:28.384 [2024-07-15 23:03:13.214700] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:34:28.642
00:34:28.642 real 0m0.416s
00:34:28.642 user 0m0.255s
00:34:28.642 sys 0m0.157s
00:34:28.643 23:03:13 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234
00:34:28.643 23:03:13 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:28.643 23:03:13 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:34:28.643 ************************************
00:34:28.643 END TEST bdev_json_nonenclosed
00:34:28.643 ************************************
00:34:28.643 23:03:13 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234
00:34:28.643 23:03:13 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # true
00:34:28.643 23:03:13 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:28.643 23:03:13 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:34:28.643 23:03:13 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable
00:34:28.643 23:03:13 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:34:28.643 ************************************
00:34:28.643 START TEST bdev_json_nonarray
00:34:28.643 ************************************
00:34:28.643 23:03:13 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:34:28.643 [2024-07-15 23:03:13.473093] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
00:34:28.643 [2024-07-15 23:03:13.473158] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2900674 ]
00:34:28.900 [2024-07-15 23:03:13.602476] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:34:28.900 [2024-07-15 23:03:13.709775] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:34:28.900 [2024-07-15 23:03:13.709860] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
00:34:28.900 [2024-07-15 23:03:13.709883] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:34:28.900 [2024-07-15 23:03:13.709897] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:34:29.158
00:34:29.158 real 0m0.409s
00:34:29.158 user 0m0.237s
00:34:29.158 sys 0m0.170s
00:34:29.158 23:03:13 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234
00:34:29.158 23:03:13 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:29.158 23:03:13 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x
00:34:29.158 ************************************
00:34:29.158 END TEST bdev_json_nonarray
00:34:29.158 ************************************
00:34:29.159 23:03:13 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234
00:34:29.159 23:03:13 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # true
00:34:29.159 23:03:13 blockdev_crypto_sw -- bdev/blockdev.sh@787 -- # [[ crypto_sw == bdev ]]
00:34:29.159 23:03:13 blockdev_crypto_sw -- bdev/blockdev.sh@794 -- # [[ crypto_sw == gpt ]]
00:34:29.159 23:03:13 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # [[ crypto_sw == crypto_sw ]]
00:34:29.159 23:03:13 blockdev_crypto_sw -- bdev/blockdev.sh@799 -- # run_test bdev_crypto_enomem bdev_crypto_enomem
00:34:29.159 23:03:13 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:34:29.159 23:03:13 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable
00:34:29.159 23:03:13 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:34:29.159 ************************************
00:34:29.159 START TEST bdev_crypto_enomem
00:34:29.159 ************************************
00:34:29.159 23:03:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1123 -- # bdev_crypto_enomem
00:34:29.159 23:03:13 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local base_dev=base0
00:34:29.159 23:03:13 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local test_dev=crypt0
00:34:29.159 23:03:13 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local err_dev=EE_base0
00:34:29.159 23:03:13 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@638 -- # local qd=32
00:34:29.159 23:03:13 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # ERR_PID=2900740
00:34:29.159 23:03:13 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT
00:34:29.159 23:03:13 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f ''
00:34:29.159 23:03:13 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@643 -- # waitforlisten 2900740
00:34:29.159 23:03:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@829 -- # '[' -z 2900740 ']'
00:34:29.159 23:03:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:34:29.159 23:03:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # local max_retries=100
00:34:29.159 23:03:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:34:29.159 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:34:29.159 23:03:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # xtrace_disable
00:34:29.159 23:03:13 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x
00:34:29.159 [2024-07-15 23:03:13.968454] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
00:34:29.159 [2024-07-15 23:03:13.968523] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2900740 ]
00:34:29.417 [2024-07-15 23:03:14.103678] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:34:29.417 [2024-07-15 23:03:14.220334] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:34:30.356 23:03:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:34:30.356 23:03:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@862 -- # return 0
00:34:30.356 23:03:14 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@645 -- # rpc_cmd
00:34:30.356 23:03:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable
00:34:30.356 23:03:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x
00:34:30.356 true
00:34:30.356 base0
00:34:30.356 true
00:34:30.356 [2024-07-15 23:03:14.944605] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw"
00:34:30.356 crypt0
00:34:30.356 23:03:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:34:30.356 23:03:14 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@652 -- # waitforbdev crypt0
00:34:30.356 23:03:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@897 -- # local bdev_name=crypt0
00:34:30.356 23:03:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:34:30.356 23:03:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local i
00:34:30.356 23:03:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:34:30.356 23:03:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:34:30.356 23:03:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine
00:34:30.356 23:03:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable
00:34:30.356 23:03:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x
00:34:30.356 23:03:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:34:30.356 23:03:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000
00:34:30.356 23:03:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable
00:34:30.356 23:03:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x
00:34:30.356 [
00:34:30.356 {
00:34:30.356 "name": "crypt0",
00:34:30.356 "aliases": [
00:34:30.356 "c94f822e-5e23-5f98-aa5b-d79a3c9850b5"
00:34:30.356 ],
00:34:30.356 "product_name": "crypto",
00:34:30.356 "block_size": 512,
00:34:30.356 "num_blocks": 2097152,
00:34:30.356 "uuid": "c94f822e-5e23-5f98-aa5b-d79a3c9850b5",
00:34:30.356 "assigned_rate_limits": {
00:34:30.356 "rw_ios_per_sec": 0,
00:34:30.356 "rw_mbytes_per_sec": 0,
00:34:30.356 "r_mbytes_per_sec": 0,
00:34:30.356 "w_mbytes_per_sec": 0
00:34:30.356 },
00:34:30.356 "claimed": false,
00:34:30.356 "zoned": false,
00:34:30.356 "supported_io_types": {
00:34:30.356 "read": true,
00:34:30.356 "write": true,
00:34:30.356 "unmap": false,
00:34:30.356 "flush": false,
00:34:30.356 "reset": true,
00:34:30.356 "nvme_admin": false,
00:34:30.356 "nvme_io": false,
00:34:30.356 "nvme_io_md": false,
00:34:30.356 "write_zeroes": true,
00:34:30.356 "zcopy": false,
00:34:30.356 "get_zone_info": false,
00:34:30.356 "zone_management": false,
00:34:30.356 "zone_append": false,
00:34:30.356 "compare": false,
00:34:30.356 "compare_and_write": false,
00:34:30.356 "abort": false,
00:34:30.356 "seek_hole": false,
00:34:30.356 "seek_data": false,
00:34:30.356 "copy": false,
00:34:30.356 "nvme_iov_md": false
00:34:30.356 },
00:34:30.356 "memory_domains": [
00:34:30.356 {
00:34:30.356 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:34:30.356 "dma_device_type": 2
00:34:30.356 }
00:34:30.356 ],
00:34:30.356 "driver_specific": {
00:34:30.356 "crypto": {
00:34:30.356 "base_bdev_name": "EE_base0",
00:34:30.356 "name": "crypt0",
00:34:30.356 "key_name": "test_dek_sw"
00:34:30.356 }
00:34:30.356 }
00:34:30.356 }
00:34:30.356 ]
00:34:30.356 23:03:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:34:30.356 23:03:14 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@905 -- # return 0
00:34:30.356 23:03:14 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@655 -- # rpcpid=2900914
00:34:30.356 23:03:14 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # sleep 1
00:34:30.356 23:03:14 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests
00:34:30.356 Running I/O for 5 seconds...
00:34:31.290 23:03:15 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@658 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem
00:34:31.290 23:03:15 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable
00:34:31.290 23:03:15 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x
00:34:31.290 23:03:16 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:34:31.290 23:03:16 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@660 -- # wait 2900914
00:34:35.482
00:34:35.482 Latency(us)
00:34:35.482 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:34:35.482 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096)
00:34:35.482 crypt0 : 5.00 28127.18 109.87 0.00 0.00 1133.08 548.51 1923.34
00:34:35.482 ===================================================================================================================
00:34:35.482 Total : 28127.18 109.87 0.00 0.00 1133.08 548.51 1923.34
00:34:35.482 0
00:34:35.482 23:03:20 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@662 -- # rpc_cmd bdev_crypto_delete crypt0
00:34:35.482 23:03:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable
00:34:35.482 23:03:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x
00:34:35.482 23:03:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:34:35.482 23:03:20 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # killprocess 2900740
00:34:35.482 23:03:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@948 -- # '[' -z 2900740 ']'
00:34:35.482 23:03:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # kill -0 2900740
00:34:35.482 23:03:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # uname
00:34:35.482 23:03:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:34:35.482 23:03:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2900740
00:34:35.482 23:03:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # process_name=reactor_1
00:34:35.482 23:03:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']'
00:34:35.482 23:03:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2900740'
killing process with pid 2900740
00:34:35.482 23:03:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@967 -- # kill 2900740
00:34:35.482 Received shutdown signal, test time was about 5.000000 seconds
00:34:35.482
00:34:35.482 Latency(us)
00:34:35.482 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:34:35.482 ===================================================================================================================
00:34:35.482 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00
00:34:35.482 23:03:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@972 -- # wait 2900740
00:34:35.740 23:03:20 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@665 -- # trap - SIGINT SIGTERM EXIT
00:34:35.740
00:34:35.740 real 0m6.563s
00:34:35.740 user 0m6.795s
00:34:35.740 sys 0m0.431s
00:34:35.740 23:03:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:35.740 23:03:20 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x
00:34:35.740 ************************************
00:34:35.740 END TEST bdev_crypto_enomem
00:34:35.740 ************************************
00:34:35.740 23:03:20 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0
00:34:35.740 23:03:20 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT
00:34:35.740 23:03:20 blockdev_crypto_sw -- bdev/blockdev.sh@811 -- # cleanup
00:34:35.740 23:03:20 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile
00:34:35.741 23:03:20 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:34:35.741 23:03:20 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]]
00:34:35.741 23:03:20 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]]
00:34:35.741 23:03:20 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]]
00:34:35.741 23:03:20 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]]
00:34:35.741
00:34:35.741 real 0m55.177s
00:34:35.741 user 1m34.601s
00:34:35.741 sys 0m6.813s
00:34:35.741 23:03:20 blockdev_crypto_sw -- common/autotest_common.sh@1124 -- # xtrace_disable
00:34:35.741 23:03:20 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:34:35.741 ************************************
00:34:35.741 END TEST blockdev_crypto_sw
00:34:35.741 ************************************
00:34:35.741 23:03:20 -- common/autotest_common.sh@1142 -- # return 0
00:34:35.741 23:03:20 -- spdk/autotest.sh@359 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat
00:34:35.741 23:03:20 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']'
00:34:35.741 23:03:20 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:34:35.741 23:03:20 -- common/autotest_common.sh@10 -- # set +x
00:34:35.741 ************************************
00:34:35.741 START TEST blockdev_crypto_qat
00:34:35.741 ************************************
00:34:35.741 23:03:20 blockdev_crypto_qat -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat
00:34:36.000 * Looking for test storage...
00:34:36.000 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # uname -s 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # test_type=crypto_qat 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # crypto_device= 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # dek= 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # 
env_ctx= 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == bdev ]] 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == crypto_* ]] 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2901679 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 2901679 00:34:36.000 23:03:20 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:34:36.000 23:03:20 blockdev_crypto_qat -- common/autotest_common.sh@829 -- # '[' -z 2901679 ']' 00:34:36.000 23:03:20 blockdev_crypto_qat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:36.000 23:03:20 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:36.000 23:03:20 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:36.000 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:36.000 23:03:20 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:36.000 23:03:20 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:36.000 [2024-07-15 23:03:20.861371] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:34:36.000 [2024-07-15 23:03:20.861509] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2901679 ] 00:34:36.258 [2024-07-15 23:03:21.056998] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:36.258 [2024-07-15 23:03:21.156974] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:36.517 23:03:21 blockdev_crypto_qat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:36.517 23:03:21 blockdev_crypto_qat -- common/autotest_common.sh@862 -- # return 0 00:34:36.517 23:03:21 blockdev_crypto_qat -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:34:36.517 23:03:21 blockdev_crypto_qat -- bdev/blockdev.sh@708 -- # setup_crypto_qat_conf 00:34:36.517 23:03:21 blockdev_crypto_qat -- bdev/blockdev.sh@170 -- # rpc_cmd 00:34:36.517 23:03:21 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:36.517 23:03:21 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:36.517 [2024-07-15 23:03:21.265680] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:36.517 [2024-07-15 23:03:21.273702] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:36.517 [2024-07-15 23:03:21.281719] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:36.517 [2024-07-15 23:03:21.354179] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:39.063 true 00:34:39.063 true 00:34:39.063 true 00:34:39.063 true 00:34:39.063 Malloc0 00:34:39.063 Malloc1 00:34:39.063 Malloc2 00:34:39.063 Malloc3 00:34:39.063 [2024-07-15 23:03:23.724628] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 
00:34:39.063 crypto_ram 00:34:39.063 [2024-07-15 23:03:23.732645] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:39.063 crypto_ram1 00:34:39.063 [2024-07-15 23:03:23.740667] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:39.063 crypto_ram2 00:34:39.063 [2024-07-15 23:03:23.748689] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:39.063 crypto_ram3 00:34:39.063 [ 00:34:39.063 { 00:34:39.063 "name": "Malloc1", 00:34:39.063 "aliases": [ 00:34:39.063 "d167ffd6-213b-48fd-be3c-994357af7b11" 00:34:39.063 ], 00:34:39.063 "product_name": "Malloc disk", 00:34:39.063 "block_size": 512, 00:34:39.063 "num_blocks": 65536, 00:34:39.063 "uuid": "d167ffd6-213b-48fd-be3c-994357af7b11", 00:34:39.063 "assigned_rate_limits": { 00:34:39.063 "rw_ios_per_sec": 0, 00:34:39.063 "rw_mbytes_per_sec": 0, 00:34:39.063 "r_mbytes_per_sec": 0, 00:34:39.063 "w_mbytes_per_sec": 0 00:34:39.063 }, 00:34:39.063 "claimed": true, 00:34:39.063 "claim_type": "exclusive_write", 00:34:39.063 "zoned": false, 00:34:39.063 "supported_io_types": { 00:34:39.063 "read": true, 00:34:39.063 "write": true, 00:34:39.063 "unmap": true, 00:34:39.063 "flush": true, 00:34:39.063 "reset": true, 00:34:39.063 "nvme_admin": false, 00:34:39.063 "nvme_io": false, 00:34:39.063 "nvme_io_md": false, 00:34:39.063 "write_zeroes": true, 00:34:39.063 "zcopy": true, 00:34:39.063 "get_zone_info": false, 00:34:39.063 "zone_management": false, 00:34:39.063 "zone_append": false, 00:34:39.063 "compare": false, 00:34:39.063 "compare_and_write": false, 00:34:39.063 "abort": true, 00:34:39.063 "seek_hole": false, 00:34:39.063 "seek_data": false, 00:34:39.063 "copy": true, 00:34:39.063 "nvme_iov_md": false 00:34:39.063 }, 00:34:39.063 "memory_domains": [ 00:34:39.063 { 00:34:39.063 "dma_device_id": "system", 00:34:39.063 "dma_device_type": 1 00:34:39.063 }, 00:34:39.063 { 00:34:39.063 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:34:39.063 "dma_device_type": 2 00:34:39.063 } 00:34:39.063 ], 00:34:39.063 "driver_specific": {} 00:34:39.063 } 00:34:39.063 ] 00:34:39.063 23:03:23 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:39.063 23:03:23 blockdev_crypto_qat -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:34:39.063 23:03:23 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:39.063 23:03:23 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:39.063 23:03:23 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:39.063 23:03:23 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # cat 00:34:39.063 23:03:23 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:34:39.063 23:03:23 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:39.063 23:03:23 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:39.063 23:03:23 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:39.063 23:03:23 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:34:39.063 23:03:23 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:39.063 23:03:23 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:39.063 23:03:23 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:39.063 23:03:23 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:34:39.063 23:03:23 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:39.063 23:03:23 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:39.063 23:03:23 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:39.063 23:03:23 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:34:39.063 23:03:23 blockdev_crypto_qat -- 
bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:34:39.063 23:03:23 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:39.063 23:03:23 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:34:39.063 23:03:23 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:39.063 23:03:23 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:39.064 23:03:23 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:34:39.064 23:03:23 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # jq -r .name 00:34:39.064 23:03:23 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "c38003b5-47bc-5b57-9c40-d23a42f83411"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c38003b5-47bc-5b57-9c40-d23a42f83411",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' 
"aliases": [' ' "719f8f07-6088-5f65-b00b-1a50e948e4ba"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "719f8f07-6088-5f65-b00b-1a50e948e4ba",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "ee5968c6-ba7c-5451-982b-8dfe8f384b27"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ee5968c6-ba7c-5451-982b-8dfe8f384b27",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' 
"abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "711d57d2-3a90-54d4-8116-5b79821e12e0"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "711d57d2-3a90-54d4-8116-5b79821e12e0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:34:39.323 23:03:24 blockdev_crypto_qat -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:34:39.323 23:03:24 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:34:39.323 
23:03:24 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:34:39.323 23:03:24 blockdev_crypto_qat -- bdev/blockdev.sh@754 -- # killprocess 2901679 00:34:39.323 23:03:24 blockdev_crypto_qat -- common/autotest_common.sh@948 -- # '[' -z 2901679 ']' 00:34:39.323 23:03:24 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # kill -0 2901679 00:34:39.323 23:03:24 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # uname 00:34:39.323 23:03:24 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:39.323 23:03:24 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2901679 00:34:39.323 23:03:24 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:39.323 23:03:24 blockdev_crypto_qat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:39.323 23:03:24 blockdev_crypto_qat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2901679' 00:34:39.323 killing process with pid 2901679 00:34:39.323 23:03:24 blockdev_crypto_qat -- common/autotest_common.sh@967 -- # kill 2901679 00:34:39.323 23:03:24 blockdev_crypto_qat -- common/autotest_common.sh@972 -- # wait 2901679 00:34:39.892 23:03:24 blockdev_crypto_qat -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:34:39.892 23:03:24 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:34:39.892 23:03:24 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:34:39.892 23:03:24 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:39.892 23:03:24 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:39.892 ************************************ 00:34:39.892 START TEST bdev_hello_world 00:34:39.892 
************************************ 00:34:39.892 23:03:24 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:34:39.892 [2024-07-15 23:03:24.748903] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:34:39.892 [2024-07-15 23:03:24.748981] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2902211 ] 00:34:40.151 [2024-07-15 23:03:24.881448] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:40.151 [2024-07-15 23:03:24.982384] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:40.151 [2024-07-15 23:03:25.003670] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:40.151 [2024-07-15 23:03:25.011715] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:40.151 [2024-07-15 23:03:25.019723] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:40.410 [2024-07-15 23:03:25.124952] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:42.942 [2024-07-15 23:03:27.336184] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:42.942 [2024-07-15 23:03:27.336264] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:42.942 [2024-07-15 23:03:27.336281] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:42.942 [2024-07-15 23:03:27.344201] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: 
*NOTICE*: Found key "test_dek_qat_xts" 00:34:42.942 [2024-07-15 23:03:27.344223] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:42.942 [2024-07-15 23:03:27.344236] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:42.942 [2024-07-15 23:03:27.352221] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:42.942 [2024-07-15 23:03:27.352241] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:42.942 [2024-07-15 23:03:27.352253] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:42.942 [2024-07-15 23:03:27.360242] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:42.942 [2024-07-15 23:03:27.360261] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:42.942 [2024-07-15 23:03:27.360273] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:42.942 [2024-07-15 23:03:27.433218] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:34:42.942 [2024-07-15 23:03:27.433264] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:34:42.942 [2024-07-15 23:03:27.433283] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:34:42.942 [2024-07-15 23:03:27.434552] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:34:42.942 [2024-07-15 23:03:27.434630] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:34:42.942 [2024-07-15 23:03:27.434648] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:34:42.942 [2024-07-15 23:03:27.434695] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:34:42.942 00:34:42.942 [2024-07-15 23:03:27.434715] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:34:42.942 00:34:42.942 real 0m3.120s 00:34:42.942 user 0m2.706s 00:34:42.942 sys 0m0.376s 00:34:42.942 23:03:27 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:42.942 23:03:27 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:34:42.942 ************************************ 00:34:42.942 END TEST bdev_hello_world 00:34:42.942 ************************************ 00:34:43.201 23:03:27 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:43.201 23:03:27 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:34:43.201 23:03:27 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:43.201 23:03:27 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:43.201 23:03:27 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:43.201 ************************************ 00:34:43.201 START TEST bdev_bounds 00:34:43.201 ************************************ 00:34:43.201 23:03:27 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:34:43.201 23:03:27 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2902588 00:34:43.201 23:03:27 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:34:43.202 23:03:27 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:34:43.202 23:03:27 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2902588' 00:34:43.202 Process bdevio pid: 2902588 00:34:43.202 23:03:27 blockdev_crypto_qat.bdev_bounds -- 
bdev/blockdev.sh@293 -- # waitforlisten 2902588 00:34:43.202 23:03:27 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2902588 ']' 00:34:43.202 23:03:27 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:43.202 23:03:27 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:43.202 23:03:27 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:43.202 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:43.202 23:03:27 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:43.202 23:03:27 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:34:43.202 [2024-07-15 23:03:27.955082] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:34:43.202 [2024-07-15 23:03:27.955157] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2902588 ] 00:34:43.202 [2024-07-15 23:03:28.087741] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:34:43.460 [2024-07-15 23:03:28.193320] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:43.460 [2024-07-15 23:03:28.193424] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:34:43.460 [2024-07-15 23:03:28.193424] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:43.460 [2024-07-15 23:03:28.214880] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:43.460 [2024-07-15 23:03:28.222904] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:43.460 [2024-07-15 23:03:28.230934] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:43.460 [2024-07-15 23:03:28.337696] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:45.993 [2024-07-15 23:03:30.550804] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:45.993 [2024-07-15 23:03:30.550898] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:45.993 [2024-07-15 23:03:30.550916] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:45.993 [2024-07-15 23:03:30.558824] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:45.993 [2024-07-15 23:03:30.558844] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:45.993 [2024-07-15 23:03:30.558861] 
vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:45.993 [2024-07-15 23:03:30.566846] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:45.993 [2024-07-15 23:03:30.566864] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:45.993 [2024-07-15 23:03:30.566876] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:45.993 [2024-07-15 23:03:30.574873] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:45.993 [2024-07-15 23:03:30.574891] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:45.993 [2024-07-15 23:03:30.574902] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:45.993 23:03:30 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:45.993 23:03:30 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:34:45.993 23:03:30 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:34:45.993 I/O targets: 00:34:45.993 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:34:45.993 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:34:45.993 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:34:45.993 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:34:45.993 00:34:45.993 00:34:45.993 CUnit - A unit testing framework for C - Version 2.1-3 00:34:45.993 http://cunit.sourceforge.net/ 00:34:45.993 00:34:45.993 00:34:45.993 Suite: bdevio tests on: crypto_ram3 00:34:45.993 Test: blockdev write read block ...passed 00:34:45.993 Test: blockdev write zeroes read block ...passed 00:34:45.993 Test: blockdev write zeroes read no split ...passed 00:34:45.993 Test: blockdev write zeroes read split 
...passed
00:34:45.993 Test: blockdev write zeroes read split partial ...passed
00:34:45.993 Test: blockdev reset ...passed
00:34:45.993 Test: blockdev write read 8 blocks ...passed
00:34:45.993 Test: blockdev write read size > 128k ...passed
00:34:45.993 Test: blockdev write read invalid size ...passed
00:34:45.993 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:34:45.993 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:34:45.993 Test: blockdev write read max offset ...passed
00:34:45.993 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:34:45.993 Test: blockdev writev readv 8 blocks ...passed
00:34:45.993 Test: blockdev writev readv 30 x 1block ...passed
00:34:45.993 Test: blockdev writev readv block ...passed
00:34:45.993 Test: blockdev writev readv size > 128k ...passed
00:34:45.993 Test: blockdev writev readv size > 128k in two iovs ...passed
00:34:45.993 Test: blockdev comparev and writev ...passed
00:34:45.993 Test: blockdev nvme passthru rw ...passed
00:34:45.993 Test: blockdev nvme passthru vendor specific ...passed
00:34:45.993 Test: blockdev nvme admin passthru ...passed
00:34:45.993 Test: blockdev copy ...passed
00:34:45.993 Suite: bdevio tests on: crypto_ram2
00:34:45.993 Test: blockdev write read block ...passed
00:34:45.993 Test: blockdev write zeroes read block ...passed
00:34:45.993 Test: blockdev write zeroes read no split ...passed
00:34:45.993 Test: blockdev write zeroes read split ...passed
00:34:45.993 Test: blockdev write zeroes read split partial ...passed
00:34:45.993 Test: blockdev reset ...passed
00:34:45.993 Test: blockdev write read 8 blocks ...passed
00:34:45.993 Test: blockdev write read size > 128k ...passed
00:34:45.993 Test: blockdev write read invalid size ...passed
00:34:45.993 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:34:45.993 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:34:45.993 Test: blockdev write read max offset ...passed
00:34:45.993 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:34:45.993 Test: blockdev writev readv 8 blocks ...passed
00:34:45.993 Test: blockdev writev readv 30 x 1block ...passed
00:34:45.993 Test: blockdev writev readv block ...passed
00:34:45.993 Test: blockdev writev readv size > 128k ...passed
00:34:45.993 Test: blockdev writev readv size > 128k in two iovs ...passed
00:34:45.993 Test: blockdev comparev and writev ...passed
00:34:45.993 Test: blockdev nvme passthru rw ...passed
00:34:45.993 Test: blockdev nvme passthru vendor specific ...passed
00:34:45.993 Test: blockdev nvme admin passthru ...passed
00:34:45.993 Test: blockdev copy ...passed
00:34:45.993 Suite: bdevio tests on: crypto_ram1
00:34:45.993 Test: blockdev write read block ...passed
00:34:45.993 Test: blockdev write zeroes read block ...passed
00:34:46.288 Test: blockdev write zeroes read no split ...passed
00:34:46.288 Test: blockdev write zeroes read split ...passed
00:34:46.547 Test: blockdev write zeroes read split partial ...passed
00:34:46.547 Test: blockdev reset ...passed
00:34:46.547 Test: blockdev write read 8 blocks ...passed
00:34:46.547 Test: blockdev write read size > 128k ...passed
00:34:46.547 Test: blockdev write read invalid size ...passed
00:34:46.547 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:34:46.547 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:34:46.547 Test: blockdev write read max offset ...passed
00:34:46.547 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:34:46.547 Test: blockdev writev readv 8 blocks ...passed
00:34:46.547 Test: blockdev writev readv 30 x 1block ...passed
00:34:46.547 Test: blockdev writev readv block ...passed
00:34:46.547 Test: blockdev writev readv size > 128k ...passed
00:34:46.547 Test: blockdev writev readv size > 128k in two iovs ...passed
00:34:46.547 Test: blockdev comparev and writev ...passed
00:34:46.547 Test: blockdev nvme passthru rw ...passed
00:34:46.547 Test: blockdev nvme passthru vendor specific ...passed
00:34:46.547 Test: blockdev nvme admin passthru ...passed
00:34:46.547 Test: blockdev copy ...passed
00:34:46.547 Suite: bdevio tests on: crypto_ram
00:34:46.547 Test: blockdev write read block ...passed
00:34:46.547 Test: blockdev write zeroes read block ...passed
00:34:46.547 Test: blockdev write zeroes read no split ...passed
00:34:46.547 Test: blockdev write zeroes read split ...passed
00:34:46.807 Test: blockdev write zeroes read split partial ...passed
00:34:46.807 Test: blockdev reset ...passed
00:34:46.807 Test: blockdev write read 8 blocks ...passed
00:34:46.807 Test: blockdev write read size > 128k ...passed
00:34:46.807 Test: blockdev write read invalid size ...passed
00:34:46.807 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:34:46.807 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:34:46.807 Test: blockdev write read max offset ...passed
00:34:46.807 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:34:46.807 Test: blockdev writev readv 8 blocks ...passed
00:34:46.807 Test: blockdev writev readv 30 x 1block ...passed
00:34:46.807 Test: blockdev writev readv block ...passed
00:34:46.807 Test: blockdev writev readv size > 128k ...passed
00:34:46.807 Test: blockdev writev readv size > 128k in two iovs ...passed
00:34:46.807 Test: blockdev comparev and writev ...passed
00:34:46.807 Test: blockdev nvme passthru rw ...passed
00:34:46.807 Test: blockdev nvme passthru vendor specific ...passed
00:34:46.807 Test: blockdev nvme admin passthru ...passed
00:34:46.807 Test: blockdev copy ...passed
00:34:46.807
00:34:46.807 Run Summary:    Type  Total    Ran Passed Failed Inactive
00:34:46.807               suites      4      4    n/a      0        0
00:34:46.807                tests     92     92     92      0        0
00:34:46.807              asserts    520    520    520      0      n/a
00:34:46.807
00:34:46.807 Elapsed time = 1.574 seconds
00:34:46.807 0
00:34:46.807
23:03:31 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2902588 00:34:46.807 23:03:31 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2902588 ']' 00:34:46.807 23:03:31 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2902588 00:34:46.807 23:03:31 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:34:46.807 23:03:31 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:46.807 23:03:31 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2902588 00:34:46.807 23:03:31 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:46.807 23:03:31 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:46.807 23:03:31 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2902588' 00:34:46.807 killing process with pid 2902588 00:34:46.807 23:03:31 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2902588 00:34:46.807 23:03:31 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@972 -- # wait 2902588 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:34:47.376 00:34:47.376 real 0m4.153s 00:34:47.376 user 0m11.032s 00:34:47.376 sys 0m0.593s 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:34:47.376 ************************************ 00:34:47.376 END TEST bdev_bounds 00:34:47.376 ************************************ 00:34:47.376 23:03:32 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:47.376 23:03:32 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # run_test bdev_nbd 
nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:34:47.376 23:03:32 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:34:47.376 23:03:32 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:47.376 23:03:32 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:47.376 ************************************ 00:34:47.376 START TEST bdev_nbd 00:34:47.376 ************************************ 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:34:47.376 23:03:32 
blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2903147 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 2903147 /var/tmp/spdk-nbd.sock 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2903147 ']' 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:34:47.376 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:47.376 23:03:32 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:34:47.376 [2024-07-15 23:03:32.201533] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:34:47.376 [2024-07-15 23:03:32.201601] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:34:47.671 [2024-07-15 23:03:32.324088] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:47.671 [2024-07-15 23:03:32.428667] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:47.671 [2024-07-15 23:03:32.449953] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:47.671 [2024-07-15 23:03:32.457977] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:47.671 [2024-07-15 23:03:32.465993] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:47.671 [2024-07-15 23:03:32.575249] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:50.203 [2024-07-15 23:03:34.794415] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:50.203 [2024-07-15 23:03:34.794493] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:50.203 [2024-07-15 23:03:34.794510] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:50.203 [2024-07-15 23:03:34.802433] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:50.203 [2024-07-15 23:03:34.802454] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently 
unable to find bdev with name: Malloc1 00:34:50.203 [2024-07-15 23:03:34.802466] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:50.203 [2024-07-15 23:03:34.810454] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:50.203 [2024-07-15 23:03:34.810471] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:50.203 [2024-07-15 23:03:34.810484] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:50.203 [2024-07-15 23:03:34.818473] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:50.203 [2024-07-15 23:03:34.818491] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:50.203 [2024-07-15 23:03:34.818502] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:50.203 23:03:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:50.203 23:03:34 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:34:50.203 23:03:34 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:34:50.203 23:03:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:50.203 23:03:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:50.203 23:03:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:34:50.203 23:03:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:34:50.203 23:03:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # 
local rpc_server=/var/tmp/spdk-nbd.sock 00:34:50.203 23:03:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:50.203 23:03:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:34:50.203 23:03:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:34:50.203 23:03:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:34:50.203 23:03:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:34:50.203 23:03:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:50.203 23:03:34 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:34:50.462 23:03:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:34:50.462 23:03:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:34:50.462 23:03:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:34:50.462 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:34:50.462 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:50.462 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:50.462 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:50.462 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:34:50.462 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:50.462 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:50.462 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:50.462 
23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:50.462 1+0 records in 00:34:50.462 1+0 records out 00:34:50.462 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000190286 s, 21.5 MB/s 00:34:50.462 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:50.462 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:50.462 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:50.462 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:50.462 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:50.462 23:03:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:50.462 23:03:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:50.462 23:03:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:34:50.721 23:03:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:34:50.721 23:03:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:34:50.721 23:03:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:34:50.721 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:34:50.721 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:50.721 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:50.721 23:03:35 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:50.721 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:34:50.721 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:50.721 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:50.721 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:50.721 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:50.721 1+0 records in 00:34:50.721 1+0 records out 00:34:50.721 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000311677 s, 13.1 MB/s 00:34:50.721 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:50.721 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:50.721 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:50.721 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:50.721 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:50.721 23:03:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:50.721 23:03:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:50.721 23:03:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:34:50.980 23:03:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:34:50.980 23:03:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- 
# basename /dev/nbd2 00:34:50.980 23:03:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:34:50.980 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:34:50.980 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:50.980 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:50.980 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:50.980 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:34:50.980 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:50.980 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:50.980 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:50.980 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:50.980 1+0 records in 00:34:50.980 1+0 records out 00:34:50.980 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000329754 s, 12.4 MB/s 00:34:50.980 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:50.980 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:50.980 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:50.980 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:50.980 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:50.981 23:03:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # 
(( i++ )) 00:34:50.981 23:03:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:50.981 23:03:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:34:51.240 23:03:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:34:51.240 23:03:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:34:51.240 23:03:35 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:34:51.240 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:34:51.240 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:51.240 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:51.240 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:51.240 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:34:51.240 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:51.240 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:51.240 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:51.240 23:03:35 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:51.240 1+0 records in 00:34:51.240 1+0 records out 00:34:51.240 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000338919 s, 12.1 MB/s 00:34:51.240 23:03:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:51.240 23:03:36 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@884 -- # size=4096 00:34:51.240 23:03:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:51.240 23:03:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:51.240 23:03:36 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:51.240 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:51.240 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:51.240 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:51.499 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:34:51.499 { 00:34:51.499 "nbd_device": "/dev/nbd0", 00:34:51.499 "bdev_name": "crypto_ram" 00:34:51.499 }, 00:34:51.499 { 00:34:51.499 "nbd_device": "/dev/nbd1", 00:34:51.499 "bdev_name": "crypto_ram1" 00:34:51.499 }, 00:34:51.499 { 00:34:51.499 "nbd_device": "/dev/nbd2", 00:34:51.499 "bdev_name": "crypto_ram2" 00:34:51.499 }, 00:34:51.499 { 00:34:51.499 "nbd_device": "/dev/nbd3", 00:34:51.499 "bdev_name": "crypto_ram3" 00:34:51.499 } 00:34:51.499 ]' 00:34:51.499 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:34:51.499 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:34:51.499 { 00:34:51.499 "nbd_device": "/dev/nbd0", 00:34:51.499 "bdev_name": "crypto_ram" 00:34:51.499 }, 00:34:51.499 { 00:34:51.499 "nbd_device": "/dev/nbd1", 00:34:51.499 "bdev_name": "crypto_ram1" 00:34:51.499 }, 00:34:51.499 { 00:34:51.499 "nbd_device": "/dev/nbd2", 00:34:51.499 "bdev_name": "crypto_ram2" 00:34:51.499 }, 00:34:51.499 { 00:34:51.499 "nbd_device": "/dev/nbd3", 00:34:51.499 "bdev_name": 
"crypto_ram3" 00:34:51.499 } 00:34:51.499 ]' 00:34:51.499 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:34:51.499 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:34:51.499 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:51.499 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:34:51.499 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:51.499 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:51.499 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:51.499 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:51.758 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:51.758 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:51.758 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:51.758 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:51.758 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:51.758 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:51.758 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:51.758 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:51.758 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:51.758 23:03:36 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:34:52.018 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:34:52.018 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:34:52.018 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:34:52.018 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:52.018 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:52.018 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:34:52.018 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:52.018 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:52.018 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:52.018 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:34:52.277 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:34:52.277 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:34:52.277 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:34:52.277 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:52.277 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:52.277 23:03:36 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:34:52.277 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:52.277 23:03:37 blockdev_crypto_qat.bdev_nbd 
-- bdev/nbd_common.sh@45 -- # return 0 00:34:52.277 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:52.277 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:34:52.536 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:34:52.536 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:34:52.536 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:34:52.536 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:52.536 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:52.536 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:34:52.536 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:52.536 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:52.536 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:52.536 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:52.536 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@65 -- # echo '' 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:52.795 23:03:37 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:52.795 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:34:53.054 /dev/nbd0 00:34:53.054 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:34:53.054 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:34:53.054 23:03:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:34:53.054 23:03:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:53.054 23:03:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:53.054 23:03:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:53.054 23:03:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:34:53.054 23:03:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:53.054 23:03:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:53.054 23:03:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:53.054 23:03:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:53.054 1+0 records in 00:34:53.054 1+0 records out 00:34:53.054 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000231375 s, 17.7 MB/s 00:34:53.054 23:03:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:53.054 23:03:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:53.054 23:03:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:53.054 23:03:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:53.054 23:03:37 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:53.054 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:53.054 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:53.054 23:03:37 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:34:53.313 /dev/nbd1 00:34:53.313 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:34:53.313 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:34:53.313 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:34:53.313 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:53.313 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:53.313 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:53.313 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 
/proc/partitions 00:34:53.313 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:53.313 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:53.314 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:53.314 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:53.314 1+0 records in 00:34:53.314 1+0 records out 00:34:53.314 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000302923 s, 13.5 MB/s 00:34:53.314 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:53.314 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:53.314 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:53.314 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:53.314 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:53.314 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:53.314 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:53.314 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:34:53.573 /dev/nbd10 00:34:53.573 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:34:53.573 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:34:53.573 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd10 00:34:53.573 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:53.573 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:53.573 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:53.573 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:34:53.573 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:53.573 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:53.573 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:53.573 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:53.573 1+0 records in 00:34:53.573 1+0 records out 00:34:53.573 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000289575 s, 14.1 MB/s 00:34:53.573 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:53.573 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:53.573 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:53.573 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:53.573 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:53.573 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:53.573 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:53.573 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:34:53.833 /dev/nbd11 00:34:53.833 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:34:53.833 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:34:53.833 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:34:53.833 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:53.833 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:53.833 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:53.833 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:34:53.833 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:53.833 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:53.833 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:53.833 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:53.833 1+0 records in 00:34:53.833 1+0 records out 00:34:53.833 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000329317 s, 12.4 MB/s 00:34:53.833 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:53.833 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:53.833 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:53.833 23:03:38 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:53.833 23:03:38 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:53.833 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:53.833 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:53.833 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:53.833 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:53.833 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:54.092 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:34:54.092 { 00:34:54.092 "nbd_device": "/dev/nbd0", 00:34:54.093 "bdev_name": "crypto_ram" 00:34:54.093 }, 00:34:54.093 { 00:34:54.093 "nbd_device": "/dev/nbd1", 00:34:54.093 "bdev_name": "crypto_ram1" 00:34:54.093 }, 00:34:54.093 { 00:34:54.093 "nbd_device": "/dev/nbd10", 00:34:54.093 "bdev_name": "crypto_ram2" 00:34:54.093 }, 00:34:54.093 { 00:34:54.093 "nbd_device": "/dev/nbd11", 00:34:54.093 "bdev_name": "crypto_ram3" 00:34:54.093 } 00:34:54.093 ]' 00:34:54.093 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:34:54.093 { 00:34:54.093 "nbd_device": "/dev/nbd0", 00:34:54.093 "bdev_name": "crypto_ram" 00:34:54.093 }, 00:34:54.093 { 00:34:54.093 "nbd_device": "/dev/nbd1", 00:34:54.093 "bdev_name": "crypto_ram1" 00:34:54.093 }, 00:34:54.093 { 00:34:54.093 "nbd_device": "/dev/nbd10", 00:34:54.093 "bdev_name": "crypto_ram2" 00:34:54.093 }, 00:34:54.093 { 00:34:54.093 "nbd_device": "/dev/nbd11", 00:34:54.093 "bdev_name": "crypto_ram3" 00:34:54.093 } 00:34:54.093 ]' 00:34:54.093 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:54.093 23:03:38 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:34:54.093 /dev/nbd1 00:34:54.093 /dev/nbd10 00:34:54.093 /dev/nbd11' 00:34:54.093 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:34:54.093 /dev/nbd1 00:34:54.093 /dev/nbd10 00:34:54.093 /dev/nbd11' 00:34:54.093 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:54.093 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:34:54.093 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:34:54.093 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:34:54.093 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:34:54.093 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:34:54.093 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:54.093 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:34:54.093 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:34:54.093 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:54.093 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:34:54.093 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:34:54.093 256+0 records in 00:34:54.093 256+0 records out 00:34:54.093 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0102103 s, 103 MB/s 00:34:54.093 23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:54.093 
23:03:38 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:34:54.352 256+0 records in 00:34:54.352 256+0 records out 00:34:54.352 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0748919 s, 14.0 MB/s 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:34:54.352 256+0 records in 00:34:54.352 256+0 records out 00:34:54.352 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0655337 s, 16.0 MB/s 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:34:54.352 256+0 records in 00:34:54.352 256+0 records out 00:34:54.352 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0376912 s, 27.8 MB/s 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:34:54.352 256+0 records in 00:34:54.352 256+0 records out 00:34:54.352 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0342877 s, 30.6 MB/s 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:54.352 23:03:39 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 
00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:54.352 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:54.610 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:54.610 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:54.610 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:54.610 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:54.610 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:54.610 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:54.610 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:54.610 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:54.610 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:54.610 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:34:54.868 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:34:54.868 23:03:39 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:34:54.868 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:34:54.868 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:54.868 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:54.868 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:34:55.126 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:55.126 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:55.126 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:55.126 23:03:39 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:34:55.385 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:34:55.385 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:34:55.385 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:34:55.385 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:55.385 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:55.385 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:34:55.385 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:55.385 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:55.385 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:55.385 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_stop_disk /dev/nbd11 00:34:55.385 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:34:55.385 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:34:55.385 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:34:55.385 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:55.385 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:55.385 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:34:55.385 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:55.385 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:55.385 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:55.385 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:55.385 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:55.644 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:34:55.644 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:34:55.644 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:55.903 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:34:55.903 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:34:55.903 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:55.903 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:34:55.903 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 
00:34:55.903 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:34:55.903 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:34:55.903 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:34:55.903 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:34:55.903 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:34:55.903 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:55.903 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:55.903 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:34:55.903 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:34:55.903 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:34:55.903 malloc_lvol_verify 00:34:56.162 23:03:40 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:34:56.421 75eb8959-e628-4070-96f2-b02e5f8c85bd 00:34:56.421 23:03:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:34:56.681 97f9a601-4fac-4868-a2a7-c9470cb309e0 00:34:56.681 23:03:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:34:56.940 /dev/nbd0 
00:34:56.940 23:03:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:34:56.940 mke2fs 1.46.5 (30-Dec-2021) 00:34:56.940 Discarding device blocks: 0/4096 done 00:34:56.940 Creating filesystem with 4096 1k blocks and 1024 inodes 00:34:56.940 00:34:56.940 Allocating group tables: 0/1 done 00:34:56.940 Writing inode tables: 0/1 done 00:34:56.940 Creating journal (1024 blocks): done 00:34:56.940 Writing superblocks and filesystem accounting information: 0/1 done 00:34:56.940 00:34:56.940 23:03:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:34:56.940 23:03:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:34:56.940 23:03:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:56.940 23:03:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:34:56.940 23:03:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:56.940 23:03:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:56.940 23:03:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:56.940 23:03:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:57.199 23:03:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:57.199 23:03:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:57.199 23:03:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:57.199 23:03:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:57.199 23:03:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:57.199 23:03:41 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:57.199 23:03:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:57.199 23:03:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:57.199 23:03:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:34:57.199 23:03:41 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:34:57.199 23:03:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2903147 00:34:57.199 23:03:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2903147 ']' 00:34:57.199 23:03:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2903147 00:34:57.199 23:03:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:34:57.199 23:03:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:57.199 23:03:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2903147 00:34:57.199 23:03:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:57.199 23:03:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:57.199 23:03:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2903147' 00:34:57.199 killing process with pid 2903147 00:34:57.199 23:03:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2903147 00:34:57.199 23:03:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@972 -- # wait 2903147 00:34:57.768 23:03:42 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:34:57.768 00:34:57.768 real 0m10.298s 00:34:57.768 user 0m13.470s 00:34:57.768 sys 0m4.044s 00:34:57.768 23:03:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:57.768 
23:03:42 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:34:57.768 ************************************ 00:34:57.768 END TEST bdev_nbd 00:34:57.768 ************************************ 00:34:57.768 23:03:42 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:57.768 23:03:42 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:34:57.768 23:03:42 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = nvme ']' 00:34:57.768 23:03:42 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = gpt ']' 00:34:57.768 23:03:42 blockdev_crypto_qat -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:34:57.768 23:03:42 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:57.768 23:03:42 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:57.768 23:03:42 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:57.768 ************************************ 00:34:57.768 START TEST bdev_fio 00:34:57.768 ************************************ 00:34:57.768 23:03:42 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:34:57.768 23:03:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:34:57.768 23:03:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:34:57.769 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 
00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 
00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram1]' 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram1 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:57.769 ************************************ 00:34:57.769 START TEST bdev_fio_rw_verify 00:34:57.769 ************************************ 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:34:57.769 23:03:42 
blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:34:57.769 23:03:42 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:58.028 23:03:42 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:58.028 23:03:42 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:58.028 23:03:42 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:58.028 23:03:42 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:58.028 23:03:42 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:34:58.028 23:03:42 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:58.028 23:03:42 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:58.028 23:03:42 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:58.028 23:03:42 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:34:58.028 23:03:42 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:58.286 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:58.286 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:58.286 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:58.286 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:58.286 fio-3.35 00:34:58.286 Starting 4 threads 00:35:13.209 00:35:13.209 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2905184: Mon Jul 15 23:03:55 2024 00:35:13.209 read: IOPS=18.5k, BW=72.1MiB/s (75.6MB/s)(721MiB/10001msec) 00:35:13.209 slat (usec): min=11, max=1404, avg=74.16, stdev=43.18 00:35:13.209 clat (usec): min=19, max=1882, avg=394.04, stdev=266.52 00:35:13.209 lat (usec): min=45, max=1903, avg=468.20, stdev=292.16 00:35:13.209 clat percentiles (usec): 00:35:13.209 | 50.000th=[ 318], 99.000th=[ 1156], 99.900th=[ 1336], 99.990th=[ 1467], 00:35:13.209 | 99.999th=[ 1680] 00:35:13.209 write: IOPS=20.3k, BW=79.5MiB/s (83.3MB/s)(774MiB/9744msec); 0 zone resets 00:35:13.209 slat (usec): min=18, max=538, avg=88.42, stdev=46.03 00:35:13.209 clat (usec): min=16, 
max=2159, avg=448.66, stdev=297.82 00:35:13.209 lat (usec): min=56, max=2445, avg=537.09, stdev=325.52 00:35:13.209 clat percentiles (usec): 00:35:13.209 | 50.000th=[ 359], 99.000th=[ 1237], 99.900th=[ 1418], 99.990th=[ 1598], 00:35:13.209 | 99.999th=[ 2073] 00:35:13.209 bw ( KiB/s): min=46960, max=97480, per=96.99%, avg=78913.05, stdev=5524.99, samples=76 00:35:13.209 iops : min=11740, max=24370, avg=19728.21, stdev=1381.26, samples=76 00:35:13.209 lat (usec) : 20=0.01%, 50=0.03%, 100=4.99%, 250=31.46%, 500=30.93% 00:35:13.209 lat (usec) : 750=17.16%, 1000=10.76% 00:35:13.209 lat (msec) : 2=4.68%, 4=0.01% 00:35:13.209 cpu : usr=99.53%, sys=0.00%, ctx=71, majf=0, minf=287 00:35:13.209 IO depths : 1=6.7%, 2=26.7%, 4=53.3%, 8=13.3%, 16=0.0%, 32=0.0%, >=64=0.0% 00:35:13.209 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:35:13.209 complete : 0=0.0%, 4=88.2%, 8=11.8%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:35:13.209 issued rwts: total=184558,198190,0,0 short=0,0,0,0 dropped=0,0,0,0 00:35:13.209 latency : target=0, window=0, percentile=100.00%, depth=8 00:35:13.209 00:35:13.209 Run status group 0 (all jobs): 00:35:13.209 READ: bw=72.1MiB/s (75.6MB/s), 72.1MiB/s-72.1MiB/s (75.6MB/s-75.6MB/s), io=721MiB (756MB), run=10001-10001msec 00:35:13.209 WRITE: bw=79.5MiB/s (83.3MB/s), 79.5MiB/s-79.5MiB/s (83.3MB/s-83.3MB/s), io=774MiB (812MB), run=9744-9744msec 00:35:13.209 00:35:13.209 real 0m13.495s 00:35:13.209 user 0m46.071s 00:35:13.209 sys 0m0.500s 00:35:13.209 23:03:56 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:13.209 23:03:56 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:35:13.209 ************************************ 00:35:13.209 END TEST bdev_fio_rw_verify 00:35:13.209 ************************************ 00:35:13.209 23:03:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:35:13.209 23:03:56 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:35:13.209 23:03:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:13.209 23:03:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:35:13.209 23:03:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:13.209 23:03:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:35:13.209 23:03:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:35:13.209 23:03:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:35:13.209 23:03:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:35:13.209 23:03:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:35:13.209 23:03:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:35:13.209 23:03:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:35:13.209 23:03:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:13.209 23:03:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:35:13.209 23:03:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:35:13.209 23:03:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:35:13.209 23:03:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:35:13.209 23:03:56 
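Editor's note: for the trim pass above, the job file is regenerated with `rw=trimwrite`, and only bdevs whose `supported_io_types.unmap` is true are given job sections; the trace shows the real filter is `jq -r 'select(.supported_io_types.unmap == true) | .name'` over the bdev JSON. The sketch below hard-codes the filtered list (all four crypto bdevs report `"unmap": true` in the JSON above) rather than invoking jq.

```shell
#!/usr/bin/env bash
# Sketch: regenerate the fio job file for the trim pass, one [job_<bdev>]
# section per unmap-capable bdev. The list is hard-coded for illustration;
# the real script derives it from the bdev JSON via jq.
config_file=$(mktemp)
echo "rw=trimwrite" >> "$config_file"

unmap_bdevs=(crypto_ram crypto_ram1 crypto_ram2 crypto_ram3)
for b in "${unmap_bdevs[@]}"; do
  printf '[job_%s]\nfilename=%s\n' "$b" "$b" >> "$config_file"
done

grep -c '^\[job_' "$config_file"
```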
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "c38003b5-47bc-5b57-9c40-d23a42f83411"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c38003b5-47bc-5b57-9c40-d23a42f83411",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "719f8f07-6088-5f65-b00b-1a50e948e4ba"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "719f8f07-6088-5f65-b00b-1a50e948e4ba",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' 
"nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "ee5968c6-ba7c-5451-982b-8dfe8f384b27"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ee5968c6-ba7c-5451-982b-8dfe8f384b27",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": 
"test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "711d57d2-3a90-54d4-8116-5b79821e12e0"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "711d57d2-3a90-54d4-8116-5b79821e12e0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:35:13.210 crypto_ram1 00:35:13.210 crypto_ram2 00:35:13.210 crypto_ram3 ]] 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "c38003b5-47bc-5b57-9c40-d23a42f83411"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "c38003b5-47bc-5b57-9c40-d23a42f83411",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' 
"read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "719f8f07-6088-5f65-b00b-1a50e948e4ba"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "719f8f07-6088-5f65-b00b-1a50e948e4ba",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' 
"crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "ee5968c6-ba7c-5451-982b-8dfe8f384b27"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ee5968c6-ba7c-5451-982b-8dfe8f384b27",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "711d57d2-3a90-54d4-8116-5b79821e12e0"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "711d57d2-3a90-54d4-8116-5b79821e12e0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' 
"zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram1]' 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram1 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:35:13.210 23:03:56 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:35:13.210 ************************************ 00:35:13.210 START TEST bdev_fio_trim 00:35:13.210 ************************************ 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim 
-- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:35:13.210 23:03:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:35:13.211 23:03:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:35:13.211 23:03:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:35:13.211 23:03:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:35:13.211 23:03:56 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:13.211 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:13.211 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:13.211 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:13.211 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:13.211 fio-3.35 00:35:13.211 Starting 4 threads 00:35:25.419 00:35:25.419 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2907030: Mon Jul 15 23:04:09 2024 00:35:25.419 write: IOPS=31.0k, BW=121MiB/s (127MB/s)(1211MiB/10001msec); 0 zone 
resets 00:35:25.419 slat (usec): min=11, max=1225, avg=74.29, stdev=52.74 00:35:25.419 clat (usec): min=29, max=1688, avg=272.22, stdev=206.22 00:35:25.419 lat (usec): min=48, max=1854, avg=346.51, stdev=247.99 00:35:25.419 clat percentiles (usec): 00:35:25.419 | 50.000th=[ 204], 99.000th=[ 938], 99.900th=[ 1090], 99.990th=[ 1156], 00:35:25.419 | 99.999th=[ 1172] 00:35:25.419 bw ( KiB/s): min=85280, max=199872, per=98.35%, avg=121965.68, stdev=10191.62, samples=76 00:35:25.419 iops : min=21320, max=49968, avg=30491.37, stdev=2547.91, samples=76 00:35:25.419 trim: IOPS=31.0k, BW=121MiB/s (127MB/s)(1211MiB/10001msec); 0 zone resets 00:35:25.419 slat (usec): min=4, max=462, avg=21.31, stdev=11.02 00:35:25.419 clat (usec): min=29, max=1854, avg=346.73, stdev=248.01 00:35:25.419 lat (usec): min=36, max=1883, avg=368.03, stdev=255.81 00:35:25.419 clat percentiles (usec): 00:35:25.419 | 50.000th=[ 262], 99.000th=[ 1123], 99.900th=[ 1303], 99.990th=[ 1369], 00:35:25.419 | 99.999th=[ 1385] 00:35:25.419 bw ( KiB/s): min=85280, max=199872, per=98.35%, avg=121965.68, stdev=10191.62, samples=76 00:35:25.419 iops : min=21320, max=49968, avg=30491.37, stdev=2547.91, samples=76 00:35:25.419 lat (usec) : 50=1.00%, 100=9.52%, 250=44.15%, 500=27.41%, 750=10.64% 00:35:25.419 lat (usec) : 1000=5.76% 00:35:25.419 lat (msec) : 2=1.51% 00:35:25.419 cpu : usr=99.54%, sys=0.00%, ctx=61, majf=0, minf=98 00:35:25.419 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:35:25.419 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:35:25.419 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:35:25.419 issued rwts: total=0,310056,310057,0 short=0,0,0,0 dropped=0,0,0,0 00:35:25.419 latency : target=0, window=0, percentile=100.00%, depth=8 00:35:25.419 00:35:25.419 Run status group 0 (all jobs): 00:35:25.419 WRITE: bw=121MiB/s (127MB/s), 121MiB/s-121MiB/s (127MB/s-127MB/s), io=1211MiB (1270MB), run=10001-10001msec 
00:35:25.419 TRIM: bw=121MiB/s (127MB/s), 121MiB/s-121MiB/s (127MB/s-127MB/s), io=1211MiB (1270MB), run=10001-10001msec
00:35:25.419
00:35:25.419 real 0m13.602s
00:35:25.419 user 0m46.505s
00:35:25.419 sys 0m0.531s
00:35:25.419 23:04:09 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable
00:35:25.419 23:04:09 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x
00:35:25.419 ************************************
00:35:25.419 END TEST bdev_fio_trim
00:35:25.419 ************************************
00:35:25.419 23:04:10 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0
00:35:25.419 23:04:10 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f
00:35:25.419 23:04:10 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio
00:35:25.419 23:04:10 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # popd
00:35:25.419 /var/jenkins/workspace/crypto-phy-autotest/spdk
00:35:25.419 23:04:10 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT
00:35:25.419
00:35:25.419 real 0m27.526s
00:35:25.419 user 1m32.837s
00:35:25.419 sys 0m1.220s
00:35:25.419 23:04:10 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable
00:35:25.419 23:04:10 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x
00:35:25.419 ************************************
00:35:25.419 END TEST bdev_fio
00:35:25.419 ************************************
00:35:25.419 23:04:10 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:35:25.419 23:04:10 blockdev_crypto_qat -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT
00:35:25.419 23:04:10 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 ''
00:35:25.419 23:04:10 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:35:25.419 23:04:10 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:35:25.419 23:04:10 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:35:25.419 ************************************
00:35:25.419 START TEST bdev_verify
00:35:25.419 ************************************
00:35:25.419 23:04:10 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 ''
00:35:25.419 [2024-07-15 23:04:10.232219] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
00:35:25.419 [2024-07-15 23:04:10.232353] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2908301 ]
00:35:25.685 [2024-07-15 23:04:10.431002] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:35:25.685 [2024-07-15 23:04:10.534228] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:35:25.685 [2024-07-15 23:04:10.534233] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:35:25.685 [2024-07-15 23:04:10.555629] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:35:25.685 [2024-07-15 23:04:10.563654] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:35:25.685 [2024-07-15 23:04:10.571684] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:35:25.944 [2024-07-15 23:04:10.688944] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:35:28.481 [2024-07-15 23:04:12.897243] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:35:28.481 [2024-07-15 23:04:12.897334] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:35:28.481 [2024-07-15 23:04:12.897349] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:35:28.481 [2024-07-15 23:04:12.905260] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:35:28.481 [2024-07-15 23:04:12.905282] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:35:28.481 [2024-07-15 23:04:12.905295] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:35:28.481 [2024-07-15 23:04:12.913281] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:35:28.481 [2024-07-15 23:04:12.913305] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:35:28.481 [2024-07-15 23:04:12.913317] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:35:28.481 [2024-07-15 23:04:12.921303] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:35:28.481 [2024-07-15 23:04:12.921323] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:35:28.481 [2024-07-15 23:04:12.921334] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:35:28.481 Running I/O for 5 seconds...
00:35:33.752
00:35:33.752 Latency(us)
00:35:33.752 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:35:33.752 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:35:33.752 Verification LBA range: start 0x0 length 0x1000
00:35:33.752 crypto_ram : 5.07 474.56 1.85 0.00 0.00 268697.45 4217.10 165036.74
00:35:33.752 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:35:33.752 Verification LBA range: start 0x1000 length 0x1000
00:35:33.752 crypto_ram : 5.07 380.26 1.49 0.00 0.00 334965.30 1132.63 205156.17
00:35:33.752 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:35:33.752 Verification LBA range: start 0x0 length 0x1000
00:35:33.752 crypto_ram1 : 5.08 477.41 1.86 0.00 0.00 266579.68 5841.25 151359.67
00:35:33.752 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:35:33.752 Verification LBA range: start 0x1000 length 0x1000
00:35:33.752 crypto_ram1 : 5.07 383.16 1.50 0.00 0.00 331504.72 940.30 186008.26
00:35:33.752 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:35:33.752 Verification LBA range: start 0x0 length 0x1000
00:35:33.752 crypto_ram2 : 5.06 3669.54 14.33 0.00 0.00 34575.38 6810.05 27582.11
00:35:33.752 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:35:33.752 Verification LBA range: start 0x1000 length 0x1000
00:35:33.752 crypto_ram2 : 5.06 2983.97 11.66 0.00 0.00 42447.08 5214.39 31913.18
00:35:33.752 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:35:33.752 Verification LBA range: start 0x0 length 0x1000
00:35:33.752 crypto_ram3 : 5.06 3668.16 14.33 0.00 0.00 34495.15 6154.69 27582.11
00:35:33.752 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:35:33.752 Verification LBA range: start 0x1000 length 0x1000
00:35:33.752 crypto_ram3 : 5.06 2982.80 11.65 0.00 0.00 42348.19 4900.95 31913.18
00:35:33.752 ===================================================================================================================
00:35:33.752 Total : 15019.86 58.67 0.00 0.00 67684.23 940.30 205156.17
00:35:33.752
00:35:33.752 real 0m8.411s
00:35:33.752 user 0m15.758s
00:35:33.752 sys 0m0.443s
00:35:33.752 23:04:18 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:35:33.752 23:04:18 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:35:33.752 ************************************
00:35:33.752 END TEST bdev_verify
00:35:33.752 ************************************
00:35:33.752 23:04:18 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:35:33.752 23:04:18 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:35:33.752 23:04:18 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:35:33.752 23:04:18 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:35:33.752 23:04:18 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:35:33.752 ************************************
00:35:33.752 START TEST bdev_verify_big_io
00:35:33.752 ************************************
00:35:33.752 23:04:18 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:35:34.011 [2024-07-15 23:04:18.690390] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization...
00:35:34.011 [2024-07-15 23:04:18.690457] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2909364 ]
00:35:34.011 [2024-07-15 23:04:18.820673] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:35:34.269 [2024-07-15 23:04:18.929051] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:35:34.269 [2024-07-15 23:04:18.929057] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:35:34.269 [2024-07-15 23:04:18.950427] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:35:34.269 [2024-07-15 23:04:18.958460] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:35:34.269 [2024-07-15 23:04:18.966487] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:35:34.269 [2024-07-15 23:04:19.074545] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:35:36.801 [2024-07-15 23:04:21.292874] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:35:36.801 [2024-07-15 23:04:21.292969] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:35:36.801 [2024-07-15 23:04:21.292984] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:35:36.801 [2024-07-15 23:04:21.300893] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:35:36.801 [2024-07-15 23:04:21.300914] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:35:36.801 [2024-07-15 23:04:21.300931] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:35:36.801 [2024-07-15 23:04:21.308917] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:35:36.801 [2024-07-15 23:04:21.308940] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:35:36.801 [2024-07-15 23:04:21.308952] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:35:36.801 [2024-07-15 23:04:21.316944] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:35:36.801 [2024-07-15 23:04:21.316962] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:35:36.801 [2024-07-15 23:04:21.316974] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:35:36.801 Running I/O for 5 seconds...
00:35:37.367 [2024-07-15 23:04:22.273144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:37.367 [2024-07-15 23:04:22.273696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:37.367 [2024-07-15 23:04:22.273790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:37.367 [2024-07-15 23:04:22.273853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:37.367 [2024-07-15 23:04:22.273909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:37.367 [2024-07-15 23:04:22.273970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:37.367 [2024-07-15 23:04:22.274514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:37.629 [2024-07-15 23:04:22.365470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.365521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.365580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.369435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.369494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.369546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.369599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.369944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.369967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.370041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.370099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.370160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.370213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:37.629 [2024-07-15 23:04:22.373654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.373713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.373766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.373820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.374394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.374418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.374483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.374536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.374590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.374649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.377658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.377725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.377781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:37.629 [2024-07-15 23:04:22.377834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.378181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.378204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.378272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.378325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.378378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.378429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.382250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.382314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.382373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.382430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.382767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.382790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:37.629 [2024-07-15 23:04:22.382860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.382914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.382972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.383025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.386391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.386451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.386505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.386558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.387124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.387148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.387217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.387271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.387330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:37.629 [2024-07-15 23:04:22.387384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.390341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.390399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.390451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.390503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.390839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.390861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.390935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.391004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.391058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.391113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.395009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.629 [2024-07-15 23:04:22.395069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:37.629 [2024-07-15 23:04:22.395131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.395183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.395523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.395545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.395619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.395681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.395734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.395786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.398779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.398838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.398892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.398953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.399436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:37.630 [2024-07-15 23:04:22.399458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.399519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.399572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.399625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.399679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.402678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.402743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.402808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.402861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.403207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.403231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.403304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.403357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:37.630 [2024-07-15 23:04:22.403410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.403462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.407290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.407350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.407406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.407466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.407811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.407834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.407920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.407979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.408033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.408086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.410992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:37.630 [2024-07-15 23:04:22.411052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.411104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.411157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.411683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.411709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.411772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.411826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.411879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.411939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.415031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.415089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.415155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.415209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:37.630 [2024-07-15 23:04:22.415597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.415619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.415689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.415742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.415794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.415836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.419258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.419331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.419384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.419443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.419783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.419805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.421764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:37.630 [2024-07-15 23:04:22.423760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.425103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.426870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.430742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.432516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.434502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.436491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.437002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.437025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.438799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.440777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.442774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.443276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:37.630 [2024-07-15 23:04:22.448470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.449695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.451470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.453471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.453821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.453844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.454362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.454859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.455357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.456684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.461254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.462306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.462807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:37.630 [2024-07-15 23:04:22.463312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.463845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.463869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.465816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.467824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.469825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.471006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.474610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.475561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.477333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.479320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.479667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:37.630 [2024-07-15 23:04:22.479689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:37.630 [2024-07-15 23:04:22.480889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:38.154 last message repeated 272 times (2024-07-15 23:04:22.482645 through 23:04:22.810420)
00:35:38.154 [2024-07-15 23:04:22.812018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.812082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.812444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.816411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.817805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.818316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.818815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.819400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.819425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.819495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.819555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.819607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.819662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.154 [2024-07-15 23:04:22.820058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.822194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.822253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.822306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.822362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.822706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.822729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.822805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.822859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.822911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.822972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.823452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.826344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.154 [2024-07-15 23:04:22.826410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.826463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.826517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.826861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.826885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.826962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.827017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.827069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.827123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.827469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.829572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.829632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.829689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.154 [2024-07-15 23:04:22.829743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.830322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.830348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.830413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.154 [2024-07-15 23:04:22.830468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.830522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.830579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.831147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.833329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.833388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.833447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.833503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.833912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.155 [2024-07-15 23:04:22.833942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.834011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.834065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.834125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.834199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.834547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.837238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.837298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.837352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.837408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.837973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.837997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.838066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.155 [2024-07-15 23:04:22.838120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.838175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.838229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.838618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.840705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.840764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.840824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.840881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.841230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.841255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.841324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.841378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.841431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.155 [2024-07-15 23:04:22.841483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.841953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.844757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.844827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.844881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.844942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.845285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.845309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.845380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.845435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.845503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.845557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.845963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.155 [2024-07-15 23:04:22.848035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.848098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.848150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.848204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.848765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.848789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.848853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.848910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.848971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.849048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.849657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.851796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.851864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.155 [2024-07-15 23:04:22.851919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.851980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.852325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.852348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.852417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.852478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.852538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.852591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.852948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.855547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.855609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.855662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.855716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.155 [2024-07-15 23:04:22.856200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.856224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.856291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.856345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.856398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.856452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.856869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.858872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.858949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.859003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.859056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.859393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.859416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.155 [2024-07-15 23:04:22.859485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.859538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.859595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.859656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.860241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.862948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.863007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.863060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.863113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.863450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.863472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.155 [2024-07-15 23:04:22.863542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.863609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.156 [2024-07-15 23:04:22.863661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.863712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.864126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.866222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.866280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.866332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.866384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.866933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.866954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.867017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.867070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.867123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.867175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.156 [2024-07-15 23:04:22.867748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.869842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.869905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.869967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.870023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.870363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.870387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.870458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.870526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.870581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.870634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.870985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.156 [2024-07-15 23:04:22.873634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.156 [2024-07-15 23:04:22.873693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:38.159 (last message repeated ~270 times between 2024-07-15 23:04:22.873 and 23:04:23.004, all from accel_dpdk_cryptodev.c:468:accel_dpdk_cryptodev_task_alloc_resources)
00:35:38.159 [2024-07-15 23:04:23.004866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.008250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.008765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.009270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.009776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.010370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.010402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.010906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.011408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.011913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.012436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.013000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.016488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.159 [2024-07-15 23:04:23.016993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.017492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.018004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.018621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.018646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.019159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.019659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.020170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.020668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.021219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.024759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.025265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.025764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.159 [2024-07-15 23:04:23.026271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.026825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.026850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.027360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.027856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.028370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.028868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.029441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.033090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.033590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.034100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.034603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.035175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.159 [2024-07-15 23:04:23.035200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.035704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.036215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.036730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.037235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.037770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.041230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.041728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.043111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.043806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.044165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.044189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.044700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.159 [2024-07-15 23:04:23.045200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.045695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.046200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.046664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.051228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.053257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.054711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.056634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.056986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.159 [2024-07-15 23:04:23.057011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.059030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.060358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.060859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.418 [2024-07-15 23:04:23.061361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.061913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.065774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.067788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.069796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.070622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.071193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.071218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.071727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.072227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.073996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.076001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.076348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.418 [2024-07-15 23:04:23.079317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.079817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.080318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.081402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.081818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.081841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.083869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.085876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.087107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.088862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.089220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.092657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.418 [2024-07-15 23:04:23.094419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.419 [2024-07-15 23:04:23.096422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.098424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.098912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.098941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.100693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.102710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.104705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.105226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.105803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.110410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.111638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.113404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.115402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.419 [2024-07-15 23:04:23.115750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.115773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.116289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.116788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.117294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.118718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.119141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.123153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.125104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.125604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.126103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.126676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.126700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.419 [2024-07-15 23:04:23.128056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.129804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.131779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.133625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.134070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.136827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.137336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.138947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.140704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.141057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.141081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.143041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.144818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.419 [2024-07-15 23:04:23.146784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.148788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.149201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.153844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.155854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.157818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.159577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.160010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.160035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.162054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.163876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.164381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.164881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.419 [2024-07-15 23:04:23.165461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.168935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.170691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.172637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.174290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.174833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.174858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.175371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.175868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.177631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.179554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.179904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.183458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.419 [2024-07-15 23:04:23.183966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.184468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.184972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.185325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.185347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.187274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.189274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.190689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.192554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.192900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.196153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.198050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.419 [2024-07-15 23:04:23.199998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.419 [2024-07-15 23:04:23.202015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.396130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (previous *ERROR* line repeated ~270 times between 23:04:23.202015 and 23:04:23.396130; intermediate repeats collapsed)
00:35:38.683 [2024-07-15 23:04:23.396190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.396255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.396312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.396798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.396821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.396911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.396982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.397037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.397092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.397600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.400433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.400491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.400565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.683 [2024-07-15 23:04:23.400619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.401193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.401217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.401279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.401333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.401387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.401440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.401988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.404912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.404981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.405056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.405120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.405669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.683 [2024-07-15 23:04:23.405692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.405759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.405817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.405870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.405923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.406391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.409351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.409410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.409475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.409542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.410103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.410126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.410202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.683 [2024-07-15 23:04:23.410267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.410325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.410378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.410892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.413786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.413845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.413904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.683 [2024-07-15 23:04:23.413964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.414538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.414561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.414622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.414678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.414731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.684 [2024-07-15 23:04:23.414784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.415328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.418250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.418309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.418380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.418436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.419014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.419039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.419106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.419162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.419215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.419269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.419712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.684 [2024-07-15 23:04:23.422620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.422711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.422764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.422834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.423435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.423458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.423532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.423587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.423640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.423693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.424255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.427150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.427226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.684 [2024-07-15 23:04:23.427279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.427333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.427893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.427917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.427988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.428044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.428107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.428161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.428763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.431615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.431691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.431754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.431807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.684 [2024-07-15 23:04:23.432367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.432391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.432455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.432507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.432562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.432614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.433178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.436150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.436221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.436276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.436343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.436831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.436854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.684 [2024-07-15 23:04:23.436933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.436989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.437044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.437098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.437642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.440544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.440605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.440657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.440710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.441273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.441306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.441369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.441423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.684 [2024-07-15 23:04:23.441476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.441547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.442156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.444971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.445046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.445100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.445154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.684 [2024-07-15 23:04:23.445724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.445748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.445812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.445866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.445931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.445987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.685 [2024-07-15 23:04:23.446473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.449521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.449592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.449660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.449731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.450221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.450245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.450320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.450376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.450431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.450493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.451032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.453909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.685 [2024-07-15 23:04:23.453975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.454028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.454086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.454627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.454650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.454712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.454766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.454819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.454890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.455486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.458337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.458419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.458473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.685 [2024-07-15 23:04:23.458526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.459085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.459109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.459173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.459230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.459728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.459800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.460335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.463274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.463344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.463396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.463450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.685 [2024-07-15 23:04:23.463951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.685 [2024-07-15 23:04:23.463975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:38.978 [... same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeated for subsequent allocation attempts, timestamps 2024-07-15 23:04:23.464050 through 23:04:23.810748 ...]
00:35:38.978 [2024-07-15 23:04:23.812698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.813142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.813165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.814932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.816920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.818860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.819368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.819915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.824601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.825961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.827705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.829703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.830058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.978 [2024-07-15 23:04:23.830081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.830590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.831095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.831596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.832790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.833144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.835912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.836419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.836917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.837935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.838343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.838365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.840414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.978 [2024-07-15 23:04:23.842409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.843241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.845004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.845351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.848504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.849020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.849523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.850028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.850578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.850601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.851125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.851632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.852138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.978 [2024-07-15 23:04:23.852641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.853219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.856623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.857131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.857633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.858141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.858651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.858674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.859190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.859691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.860215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.860711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.861245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.978 [2024-07-15 23:04:23.864695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.865213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.865712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.866217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.866723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.866747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.867260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.867765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.868281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.868779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.869347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.872690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.873218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:38.978 [2024-07-15 23:04:23.873724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.874225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.874759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.874782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.875302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.875802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.875862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.876368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.876934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.880269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:38.978 [2024-07-15 23:04:23.880773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.258 [2024-07-15 23:04:23.881281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.258 [2024-07-15 23:04:23.881786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.258 [2024-07-15 23:04:23.882352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.258 [2024-07-15 23:04:23.882376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.258 [2024-07-15 23:04:23.882883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.258 [2024-07-15 23:04:23.883393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.883895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.883965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.884510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.887995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.888498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.889001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.889509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.890023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.890046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.259 [2024-07-15 23:04:23.890121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.890199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.890266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.890322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.890736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.893696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.893757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.893810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.893875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.894385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.894409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.894495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.894565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.259 [2024-07-15 23:04:23.894620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.894674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.895031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.897852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.897911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.897970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.898026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.898491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.898513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.898584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.898652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.898731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.898786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.259 [2024-07-15 23:04:23.899191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.902178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.902238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.902291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.902345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.902894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.902917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.903013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.903069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.903157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.903222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.903669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.906433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.259 [2024-07-15 23:04:23.906498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.906557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.906611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.907103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.907126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.907198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.907264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.907332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.907419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.907902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.910693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.910754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.910807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.259 [2024-07-15 23:04:23.910861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.911310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.911333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.911407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.911474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.911556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.911643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.912125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.914756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.914817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.914870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.914933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.915373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.259 [2024-07-15 23:04:23.915396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.915469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.915535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.915611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.915689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.916192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.919005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.919066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.919120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.919173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.919632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.919656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.259 [2024-07-15 23:04:23.919729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.259 [2024-07-15 23:04:23.919795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[The same "*ERROR*: Failed to get src_mbufs!" message from accel_dpdk_cryptodev.c:468 (accel_dpdk_cryptodev_task_alloc_resources) repeats continuously from 23:04:23.919795 through 23:04:24.017669; the repeated entries are elided here.]
00:35:39.261 [2024-07-15 23:04:24.017689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.019450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.021388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.023329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.025325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.025739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.030464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.032469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.034415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.036196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.036591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.036611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.038632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.261 [2024-07-15 23:04:24.040437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.040940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.041437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.042008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.045550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.047300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.049241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.050900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.051406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.051428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.051941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.052441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.053859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.261 [2024-07-15 23:04:24.055631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.055980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.059680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.060187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.060685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.061185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.061541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.061562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.063341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.065351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.066880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.068819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.069170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.261 [2024-07-15 23:04:24.072366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.073835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.075582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.077524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.077868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.077889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.079476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.081234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.083261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.084785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.085341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.089898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.091847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.261 [2024-07-15 23:04:24.093456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.095222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.095568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.095593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.097539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.098044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.098536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.099031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.099411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.103393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.105403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.106681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.261 [2024-07-15 23:04:24.107182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.262 [2024-07-15 23:04:24.107686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.107707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.108216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.110153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.112161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.114151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.114565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.117168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.117668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.118413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.120167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.120510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.120530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.262 [2024-07-15 23:04:24.122546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.123749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.125518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.127515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.127862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.132655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.134650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.136639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.137835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.138235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.138256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.140256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.142243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.262 [2024-07-15 23:04:24.142856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.143353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.143906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.147281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.149053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.151053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.153004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.153553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.153574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.154083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.154577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.155733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.157497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.262 [2024-07-15 23:04:24.157841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.161876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.162387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.162881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.262 [2024-07-15 23:04:24.163380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.163759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.163781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.165523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.167461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.169162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.171101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.171448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.174646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.523 [2024-07-15 23:04:24.176320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.178116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.180132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.180478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.180499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.182232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.184130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.186149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.187531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.188060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.192578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.194288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.196227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.523 [2024-07-15 23:04:24.198171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.198518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.198539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.200163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.200661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.201159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.201652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.201999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.206094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.208106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.208983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.209475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.210005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.523 [2024-07-15 23:04:24.210027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.210529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.212285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.214279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.216281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.216818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.219456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.219959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.221056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.222810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.223161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.223183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.225208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.523 [2024-07-15 23:04:24.226452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.228200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.230181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.230528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.235404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.237404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.239401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.240610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.240990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.241012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.243034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.245022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.523 [2024-07-15 23:04:24.245526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.523 [2024-07-15 23:04:24.246023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.523 [... last message repeated many times between 23:04:24.246604 and 23:04:24.440592; duplicates elided ...] 
00:35:39.788 [2024-07-15 23:04:24.440645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.440696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.440755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.441102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.443209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.443269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.443321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.443373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.443712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.443733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.443802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.443868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.443921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.788 [2024-07-15 23:04:24.443982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.444515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.447150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.447208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.447266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.447323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.447664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.447684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.447752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.447805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.447856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.447907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.448430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.788 [2024-07-15 23:04:24.450623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.450681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.450737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.450791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.451337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.451360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.451433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.451487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.451543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.451597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.452132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.454283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.454347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.788 [2024-07-15 23:04:24.454399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.454451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.454852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.454872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.454948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.455001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.455053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.455104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.455445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.458299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.458358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.458412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.458465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.788 [2024-07-15 23:04:24.458804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.458826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.458896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.458970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.459026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.459078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.459420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.461556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.461615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.461688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.461742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.462103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.462125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.788 [2024-07-15 23:04:24.462193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.462247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.462319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.462372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.462912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.465456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.465514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.465572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.465623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.465969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.465990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.466058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.466111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.788 [2024-07-15 23:04:24.466176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.466231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.466628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.468988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.469048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.469102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.469155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.469674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.469696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.469757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.469811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.469864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.469919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.788 [2024-07-15 23:04:24.470455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.475436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.475500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.788 [2024-07-15 23:04:24.475560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.475615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.475965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.475987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.476054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.476108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.476170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.476224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.476775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.481588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.789 [2024-07-15 23:04:24.481654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.481706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.481758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.482138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.482159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.482228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.482280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.482332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.482384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.482724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.487836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.487899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.487968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.789 [2024-07-15 23:04:24.488024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.488365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.488387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.488456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.488508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.488565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.488616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.489063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.493828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.493892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.493956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.494011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.494423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.789 [2024-07-15 23:04:24.494444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.494507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.494560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.494620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.494678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.495027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.500196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.500261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.500315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.500385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.500975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.500998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.501060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.789 [2024-07-15 23:04:24.501114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.501166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.501219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.501736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.506411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.506473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.506525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.506578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.506934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.506956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.507027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.507089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.507142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.789 [2024-07-15 23:04:24.507194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.507601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.512499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.512562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.512614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.512666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.513119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.513140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.513216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.513268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.513325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.513376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:39.789 [2024-07-15 23:04:24.513753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:39.789 [2024-07-15 23:04:24.518881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:40.052 [2024-07-15 23:04:24.790599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.052 [2024-07-15 23:04:24.791893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.052 [2024-07-15 23:04:24.793517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.052 [2024-07-15 23:04:24.795123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.052 [2024-07-15 23:04:24.795397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.052 [2024-07-15 23:04:24.795413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.052 [2024-07-15 23:04:24.795428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.052 [2024-07-15 23:04:24.799597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.052 [2024-07-15 23:04:24.800002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.052 [2024-07-15 23:04:24.800392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.052 [2024-07-15 23:04:24.800780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.052 [2024-07-15 23:04:24.801225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.052 [2024-07-15 23:04:24.801784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.052 [2024-07-15 23:04:24.803485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.805186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.806814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.807093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.807110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.807124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.810422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.812069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.813163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.813554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.813987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.814388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.814778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.053 [2024-07-15 23:04:24.815170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.816844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.817181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.817199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.817213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.822687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.824363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.826232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.826629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.827123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.827523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.827911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.828300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.053 [2024-07-15 23:04:24.829209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.829482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.829499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.829514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.831524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.833190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.834916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.836535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.836810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.838332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.838724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.839118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.839509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.053 [2024-07-15 23:04:24.840014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.840032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.840047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.844934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.846385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.847682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.849328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.849603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.851301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.851859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.852253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.852641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.853044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.053 [2024-07-15 23:04:24.853062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.853077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.856625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.858263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.859863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.860885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.861167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.862821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.864455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.866093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.867086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.867489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.867506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.053 [2024-07-15 23:04:24.867521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.871919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.873778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.875645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.876224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.876498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.877793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.879418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.881049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.882578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.883009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.883026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.883041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.053 [2024-07-15 23:04:24.886523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.887815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.889627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.891494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.053 [2024-07-15 23:04:24.891767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.892849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.894441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.895702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.897319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.897593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.897610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.897624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.902750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.054 [2024-07-15 23:04:24.904491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.906177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.907789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.908066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.909703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.910702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.911997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.913805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.914084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.914102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.914116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.916453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.916846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.054 [2024-07-15 23:04:24.917245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.918941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.919284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.920937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.922758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.924623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.925205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.925482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.925499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.925513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.931030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.931426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.931814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.054 [2024-07-15 23:04:24.932873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.933155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.934828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.936456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.938086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.939101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.939378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.939394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.939409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.942221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.943250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.943641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.944038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.054 [2024-07-15 23:04:24.944312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.945102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.945493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.946326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.947733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.948013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.948030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.948049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.953908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.954710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.955108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.955500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.955878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.054 [2024-07-15 23:04:24.956283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.054 [2024-07-15 23:04:24.956671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.318 [2024-07-15 23:04:24.958388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.318 [2024-07-15 23:04:24.959663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.318 [2024-07-15 23:04:24.959942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.318 [2024-07-15 23:04:24.959959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.318 [2024-07-15 23:04:24.959974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.318 [2024-07-15 23:04:24.962115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.318 [2024-07-15 23:04:24.962505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.318 [2024-07-15 23:04:24.962898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.318 [2024-07-15 23:04:24.963292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.318 [2024-07-15 23:04:24.963740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.318 [2024-07-15 23:04:24.964149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.318 [2024-07-15 23:04:24.964545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:40.321 [last message repeated for every allocation attempt from 23:04:24.964545 through 23:04:25.077115; identical lines omitted]
00:35:40.321 [2024-07-15 23:04:25.077130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.080999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.081050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.081091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.081132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.081529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.081546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.081603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.081645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.081689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.081731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.082008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.082025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.321 [2024-07-15 23:04:25.082044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.082058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.083568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.083613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.083661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.083704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.084102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.084121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.084171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.084213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.084253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.084295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.084734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.321 [2024-07-15 23:04:25.084751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.084769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.084785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.088863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.088918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.321 [2024-07-15 23:04:25.088965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.089005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.089274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.089291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.089349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.089394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.089435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.089475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.322 [2024-07-15 23:04:25.089745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.089761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.089776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.089790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.091478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.091528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.091572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.091613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.091882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.091898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.091961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.092004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.092047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.322 [2024-07-15 23:04:25.092088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.092447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.092464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.092479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.092494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.095428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.095477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.095518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.095566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.095842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.095859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.095914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.095962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.322 [2024-07-15 23:04:25.096003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.096043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.096313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.096330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.096344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.096358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.098010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.098079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.098120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.098165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.098436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.098452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.098500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.322 [2024-07-15 23:04:25.098553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.098597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.098638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.098903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.098920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.098939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.098953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.103541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.103592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.103635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.103677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.104012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.104030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.322 [2024-07-15 23:04:25.104084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.104125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.104166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.104206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.104544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.104560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.104575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.104589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.106159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.106205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.106252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.106297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.106569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.322 [2024-07-15 23:04:25.106585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.106639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.106681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.106733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.106775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.107055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.107072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.107086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.107101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.111137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.111194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.111236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.111277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.322 [2024-07-15 23:04:25.111758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.111776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.111830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.111873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.111915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.111963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.322 [2024-07-15 23:04:25.112396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.112413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.112427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.112441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.113979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.114028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.114069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.323 [2024-07-15 23:04:25.114110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.114378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.114395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.114450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.114495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.114541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.114582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.114942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.114959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.114973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.114988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.119670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.119721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.323 [2024-07-15 23:04:25.119764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.119806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.120261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.120290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.120342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.120385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.120426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.120468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.120935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.120953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.120968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.120983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.122896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.323 [2024-07-15 23:04:25.122948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.122996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.123048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.123316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.123332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.123379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.123421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.123464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.123526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.123797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.123818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.123832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.323 [2024-07-15 23:04:25.123846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.323 [2024-07-15 23:04:25.127494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:40.587 [2024-07-15 23:04:25.288391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.587 [2024-07-15 23:04:25.288780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.587 [2024-07-15 23:04:25.289540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.587 [2024-07-15 23:04:25.289815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.587 [2024-07-15 23:04:25.289832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.587 [2024-07-15 23:04:25.289846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.587 [2024-07-15 23:04:25.289860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.587 [2024-07-15 23:04:25.296318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.587 [2024-07-15 23:04:25.297962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.587 [2024-07-15 23:04:25.299586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.587 [2024-07-15 23:04:25.300596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.587 [2024-07-15 23:04:25.300986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.587 [2024-07-15 23:04:25.301003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.587 [2024-07-15 23:04:25.301405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.587 [2024-07-15 23:04:25.301794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.587 [2024-07-15 23:04:25.302191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.587 [2024-07-15 23:04:25.302579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.587 [2024-07-15 23:04:25.302857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.302874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.302888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.302903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.308799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.310441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.312074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.313726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.314134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.588 [2024-07-15 23:04:25.314154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.314558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.314954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.315350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.315743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.316095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.316112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.316126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.316141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.321948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.323634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.325505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.327361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.588 [2024-07-15 23:04:25.327822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.327840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.328273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.328672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.329066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.329457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.329881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.329898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.329913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.329934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.336267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.338146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.339803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.588 [2024-07-15 23:04:25.341430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.341763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.341781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.342195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.342591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.342987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.343377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.343831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.343854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.343870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.343886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.350942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.352374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.588 [2024-07-15 23:04:25.354005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.355633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.355912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.355935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.356340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.356725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.357117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.357502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.357964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.357982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.357997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.358012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.364037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.588 [2024-07-15 23:04:25.365349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.366982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.368610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.368888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.368906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.369495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.369885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.370278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.370667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.371097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.371115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.371131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.371148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.588 [2024-07-15 23:04:25.375794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.377335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.379206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.380866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.381149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.381166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.382447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.382838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.383233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.383622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.384113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.384131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.384147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.588 [2024-07-15 23:04:25.384163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.388570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.390374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.391641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.393259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.393538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.393554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.395431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.395831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.396226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.396614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.397089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.588 [2024-07-15 23:04:25.397107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.589 [2024-07-15 23:04:25.397121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.397136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.402420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.403605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.404900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.406527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.406806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.406823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.408446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.409292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.409678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.410070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.410458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.589 [2024-07-15 23:04:25.410475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.410491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.410506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.415475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.416001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.417858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.419417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.419697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.419713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.421361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.422953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.423342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.423732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.589 [2024-07-15 23:04:25.424187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.424205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.424221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.424236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.429319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.430396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.431433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.433059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.433343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.433363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.435227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.435655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.436047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.589 [2024-07-15 23:04:25.436431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.436840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.436857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.436872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.436886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.442217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.443103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.444481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.444880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.445408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.445427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.445826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.589 [2024-07-15 23:04:25.446223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.589 [2024-07-15 23:04:25.446611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [... same *ERROR* message repeated continuously; last occurrence 2024-07-15 23:04:25.574388; repeats omitted ...]
00:35:40.855 [2024-07-15 23:04:25.574440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.574486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.574528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.574995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.575015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.575069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.575114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.575158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.575200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.575598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.575614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.575628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.575642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.855 [2024-07-15 23:04:25.581361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.581410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.581452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.581496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.581768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.581784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.581841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.581883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.581923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.581969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.582239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.582256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.582271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.855 [2024-07-15 23:04:25.582286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.586680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.586731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.586773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.586814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.587303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.587320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.587371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.855 [2024-07-15 23:04:25.587414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.587456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.587498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.587935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.587953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.856 [2024-07-15 23:04:25.587968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.587983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.591649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.591698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.591740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.591780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.592053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.592074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.592133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.592175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.592215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.592256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.592672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.856 [2024-07-15 23:04:25.592689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.592704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.592718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.597881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.597936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.597982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.598023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.598293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.598310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.598368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.598410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.598454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.598496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.856 [2024-07-15 23:04:25.598950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.598968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.598983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.598999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.601565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.601622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.601664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.601705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.601984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.602001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.602051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.602112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.602157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.856 [2024-07-15 23:04:25.602198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.602467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.602483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.602498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.602512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.607425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.607475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.607517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.607558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.607827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.607843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.607902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.607950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.856 [2024-07-15 23:04:25.607992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.608034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.608438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.608455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.608470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.608484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.611303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.611352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.611394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.611435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.611704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.611722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.611779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.856 [2024-07-15 23:04:25.611821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.611862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.611903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.612189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.612206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.612221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.612235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.616830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.856 [2024-07-15 23:04:25.616881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.616934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.616976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.617379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.617396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.857 [2024-07-15 23:04:25.617459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.617507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.617552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.617595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.617871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.617887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.617901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.617916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.621679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.621731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.621773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.621813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.622090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.857 [2024-07-15 23:04:25.622112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.622170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.622219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.622264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.622304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.622573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.622590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.622605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.622623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.628115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.628166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.628208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.628249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.857 [2024-07-15 23:04:25.628520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.628536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.628596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.628637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.628678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.628718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.628999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.629016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.629032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.629046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.632073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.632123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.632165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.857 [2024-07-15 23:04:25.632218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.632523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.632539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.632594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.632636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.632677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.632718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.632997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.633014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.633028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.633042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.637659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:40.857 [2024-07-15 23:04:25.637711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:40.857 [2024-07-15 23:04:25.637760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:41.122 [... identical "*ERROR*: Failed to get src_mbufs!" message from accel_dpdk_cryptodev.c:468 repeated continuously through 2024-07-15 23:04:25.813960; repeats omitted ...]
00:35:41.122 [2024-07-15 23:04:25.815310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.815584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.815604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.815619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.815633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.820156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.821207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.822491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.824248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.824524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.824541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.826151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.827145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.122 [2024-07-15 23:04:25.828775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.830060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.830332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.830349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.830364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.830378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.837049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.838321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.838714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.839109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.839384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.839403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.840687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.122 [2024-07-15 23:04:25.842321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.843945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.845634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.846024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.846042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.846056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.846075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.851237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.851636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.852032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.852422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.852712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.852729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.122 [2024-07-15 23:04:25.854008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.855714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.857580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.859436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.859942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.859961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.859976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.859990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.866858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.867269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.867661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.869134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.869455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.122 [2024-07-15 23:04:25.869472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.869876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.870277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.872124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.873679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.873965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.873982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.873996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.874010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.880247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.881575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.881977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.882366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.122 [2024-07-15 23:04:25.882787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.882805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.883213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.883605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.885192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.886469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.886744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.886762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.886776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.886790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.893630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.895396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.896137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.122 [2024-07-15 23:04:25.897658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.898174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.898192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.898590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.899917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.900849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.901246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.901687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.901706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.901720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.901734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.908411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.910191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.122 [2024-07-15 23:04:25.911802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.913439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.913796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.913815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.914232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.122 [2024-07-15 23:04:25.914622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.915019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.915408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.915844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.915861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.915877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.915894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.922991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.123 [2024-07-15 23:04:25.924387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.926015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.927654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.927950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.927970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.928760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.930228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.930619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.931012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.931285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.931302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.931316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.931331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.123 [2024-07-15 23:04:25.934782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.936418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.938052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.939283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.939630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.939646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.940934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.942742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.944594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.946281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.946672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.946690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.946705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.123 [2024-07-15 23:04:25.946720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.950083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.951585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.953450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.955149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.955425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.955441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.956710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.958081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.959360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.960985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.961259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.961276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.123 [2024-07-15 23:04:25.961290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.961305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.963414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.963873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.965684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.966080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.966519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.966539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.967674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.968959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.970627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.972504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.972782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.123 [2024-07-15 23:04:25.972799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.972813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.972827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.976418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.976817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.977211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.977602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.978055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.978073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.978468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.979334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.980716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.123 [2024-07-15 23:04:25.982588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.123 [2024-07-15 23:04:25.982866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! [identical *ERROR* line repeated through 2024-07-15 23:04:26.119493]
00:35:41.389 [2024-07-15 23:04:26.119536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.389 [2024-07-15 23:04:26.119955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.389 [2024-07-15 23:04:26.119972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.389 [2024-07-15 23:04:26.120034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.389 [2024-07-15 23:04:26.120089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.389 [2024-07-15 23:04:26.120139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.389 [2024-07-15 23:04:26.120180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.389 [2024-07-15 23:04:26.120453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.389 [2024-07-15 23:04:26.120470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.389 [2024-07-15 23:04:26.120484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.389 [2024-07-15 23:04:26.120498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.389 [2024-07-15 23:04:26.122653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.389 [2024-07-15 23:04:26.122698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.389 [2024-07-15 23:04:26.122743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.389 [2024-07-15 23:04:26.122783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.389 [2024-07-15 23:04:26.123233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.389 [2024-07-15 23:04:26.123251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.389 [2024-07-15 23:04:26.123305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.389 [2024-07-15 23:04:26.123352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.389 [2024-07-15 23:04:26.123398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.389 [2024-07-15 23:04:26.123441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.389 [2024-07-15 23:04:26.123834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.123851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.123865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.123880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.126307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.390 [2024-07-15 23:04:26.126354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.126397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.126439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.126876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.126893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.126952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.126995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.127038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.127083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.127522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.127539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.127554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.127569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.390 [2024-07-15 23:04:26.129629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.129684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.129730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.129772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.130259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.130276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.130327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.130370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.130413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.130455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.130893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.130910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.130924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.390 [2024-07-15 23:04:26.130944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.133074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.133140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.133194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.133236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.133680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.133696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.133769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.133827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.133870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.133914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.134302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.134319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.390 [2024-07-15 23:04:26.134334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.134348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.137153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.137200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.137258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.137300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.137722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.137740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.137801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.137850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.137907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.137954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.138338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.390 [2024-07-15 23:04:26.138355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.138373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.138388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.140587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.140633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.140674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.140715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.140989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.141007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.141065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.141107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.141147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.141188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.390 [2024-07-15 23:04:26.141638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.141655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.141670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.141686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.143955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.144000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.144043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.144085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.144524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.144541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.144592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.144636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.144678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.390 [2024-07-15 23:04:26.144721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.145164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.145181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.145196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.145212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.147660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.147716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.147756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.147797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.148073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.148090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.148138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.148187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.390 [2024-07-15 23:04:26.148228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.148269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.148739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.148757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.148772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.148787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.150905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.150958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.151004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.151047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.151470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.151486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.151549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.390 [2024-07-15 23:04:26.151613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.151671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.151713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.152145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.152162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.152177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.152191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.154597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.154643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.154684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.154747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.155230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.390 [2024-07-15 23:04:26.155247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.391 [2024-07-15 23:04:26.155297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.391 [2024-07-15 23:04:26.155340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.391 [2024-07-15 23:04:26.155383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.391 [2024-07-15 23:04:26.155424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.391 [2024-07-15 23:04:26.155862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.391 [2024-07-15 23:04:26.155880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.391 [2024-07-15 23:04:26.155895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.391 [2024-07-15 23:04:26.155910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.391 [2024-07-15 23:04:26.157715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.391 [2024-07-15 23:04:26.157760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.391 [2024-07-15 23:04:26.157801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.391 [2024-07-15 23:04:26.157852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.391 [2024-07-15 23:04:26.158340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.391 [2024-07-15 23:04:26.158357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.391 [2024-07-15 23:04:26.158412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.391 [2024-07-15 23:04:26.158454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.391 [2024-07-15 23:04:26.158494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.391 [2024-07-15 23:04:26.158534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.391 [2024-07-15 23:04:26.158803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.391 [2024-07-15 23:04:26.158819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.391 [2024-07-15 23:04:26.158834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.391 [2024-07-15 23:04:26.158848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.391 [2024-07-15 23:04:26.161052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.391 [2024-07-15 23:04:26.161107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.391 [2024-07-15 23:04:26.161148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.391 [2024-07-15 23:04:26.161190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.391 [2024-07-15 23:04:26.161630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:41.393 [... same "Failed to get src_mbufs!" error repeated through 2024-07-15 23:04:26.277192 ...]
00:35:41.393 [2024-07-15 23:04:26.277209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.393 [2024-07-15 23:04:26.277223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.393 [2024-07-15 23:04:26.277237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.393 [2024-07-15 23:04:26.280559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.393 [2024-07-15 23:04:26.282387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.393 [2024-07-15 23:04:26.282996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.393 [2024-07-15 23:04:26.284646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.393 [2024-07-15 23:04:26.285149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.393 [2024-07-15 23:04:26.285167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.393 [2024-07-15 23:04:26.285564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.393 [2024-07-15 23:04:26.286712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.393 [2024-07-15 23:04:26.287813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.393 [2024-07-15 23:04:26.288207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.393 [2024-07-15 23:04:26.288658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.393 [2024-07-15 23:04:26.288677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.393 [2024-07-15 23:04:26.288692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.393 [2024-07-15 23:04:26.288706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.291951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.292433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.294248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.295825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.296104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.296121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.297764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.299326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.300210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.654 [2024-07-15 23:04:26.301586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.302074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.302091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.302106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.302122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.304633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.306502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.308030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.309640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.309916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.309937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.311623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.312557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.654 [2024-07-15 23:04:26.313869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.315732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.316011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.316028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.316043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.316057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.318050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.318441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.319997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.320698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.321145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.321163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.321730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.654 [2024-07-15 23:04:26.323425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.325125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.326748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.327028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.327045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.327059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.327074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.330374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.332014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.333027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.334453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.334761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.334778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.654 [2024-07-15 23:04:26.335194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.335584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.337446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.337839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.338281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.338301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.338316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.338331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.341542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.342934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.344174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.345460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.345732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.654 [2024-07-15 23:04:26.345748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.346158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.347979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.348431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.348822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.349183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.349200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.654 [2024-07-15 23:04:26.349215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.349229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.352556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.354240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.354635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.356112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.655 [2024-07-15 23:04:26.356440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.356456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.357339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.358716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.359112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.359504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.359775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.359791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.359805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.359820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.363053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.364200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.365781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.655 [2024-07-15 23:04:26.367407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.367682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.367698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.369336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.369929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.371794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.372222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.372675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.372693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.372709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.372724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.376233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.377818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.655 [2024-07-15 23:04:26.379434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.381064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.381338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.381354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.381857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.383722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.385260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.386869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.387150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.387171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.387185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.387199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.389469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.655 [2024-07-15 23:04:26.390731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.391725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.392118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.392568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.392585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.394457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.395864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.397487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.399102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.399373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.399390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.399405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.399422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.655 [2024-07-15 23:04:26.402637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.404065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.405073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.406330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.406800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.406818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.407222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.408822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.409487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.409878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.410273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.410291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.655 [2024-07-15 23:04:26.410305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.656 [2024-07-15 23:04:26.410320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.413620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.414465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.415874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.417741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.418021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.418038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.419676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.420827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.422115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.423090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.423553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.423571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.656 [2024-07-15 23:04:26.423587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.423602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.426600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.427963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.429828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.431686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.431964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.431981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.433149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.434612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.435892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.437513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.437785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.656 [2024-07-15 23:04:26.437802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.437816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.437831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.440038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.440605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.442305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.442696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.443146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.443164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.444428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.445715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.447338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:41.656 [2024-07-15 23:04:26.449085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:41.656 [2024-07-15 23:04:26.449359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:41.656 [2024-07-15 23:04:26.449375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:41.656 [2024-07-15 23:04:26.449389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:41.656 [2024-07-15 23:04:26.449403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:43.034 
00:35:43.034 Latency(us)
00:35:43.034 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:35:43.034 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:35:43.034 Verification LBA range: start 0x0 length 0x100
00:35:43.034 crypto_ram : 6.10 41.96 2.62 0.00 0.00 2961967.19 322779.05 2450932.42
00:35:43.034 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:35:43.034 Verification LBA range: start 0x100 length 0x100
00:35:43.034 crypto_ram : 6.24 35.72 2.23 0.00 0.00 3213984.59 182361.04 2815654.51
00:35:43.034 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:35:43.034 Verification LBA range: start 0x0 length 0x100
00:35:43.034 crypto_ram1 : 6.10 41.95 2.62 0.00 0.00 2864446.78 322779.05 2261276.94
00:35:43.034 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:35:43.034 Verification LBA range: start 0x100 length 0x100
00:35:43.034 crypto_ram1 : 6.31 40.41 2.53 0.00 0.00 2838249.40 148624.25 2567643.49
00:35:43.034 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:35:43.034 Verification LBA range: start 0x0 length 0x100
00:35:43.034 crypto_ram2 : 5.61 263.33 16.46 0.00 0.00 434572.74 54708.31 605438.66
00:35:43.034 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:35:43.034 Verification LBA range: start 0x100 length 0x100
00:35:43.034 crypto_ram2 : 5.89 230.99 14.44 0.00 0.00 470361.59 15956.59 630969.21
00:35:43.034 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:35:43.034 Verification LBA range: start 0x0 length 0x100
00:35:43.034 crypto_ram3 : 5.69 271.19 16.95 0.00 0.00 409047.13 10542.75 470491.49
00:35:43.034 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:35:43.034 Verification LBA range: start 0x100 length 0x100
00:35:43.034 crypto_ram3 : 5.39 204.45 12.78 0.00 0.00 593953.73 43538.70 1174405.12
00:35:43.034 ===================================================================================================================
00:35:43.034 Total : 1129.98 70.62 0.00 0.00 849679.73 10542.75 2815654.51
00:35:43.293 
00:35:43.293 real 0m9.557s
00:35:43.293 user 0m18.097s
00:35:43.293 sys 0m0.491s
00:35:43.293 23:04:28 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:35:43.293 23:04:28 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:35:43.293 ************************************
00:35:43.293 END TEST bdev_verify_big_io
00:35:43.293 ************************************
00:35:43.552 23:04:28 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:35:43.552 23:04:28 blockdev_crypto_qat -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:35:43.552 23:04:28 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:35:43.552 23:04:28 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:35:43.552 23:04:28 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:35:43.552 ************************************
00:35:43.552 START TEST
bdev_write_zeroes 00:35:43.552 ************************************ 00:35:43.552 23:04:28 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:43.552 [2024-07-15 23:04:28.335031] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:35:43.552 [2024-07-15 23:04:28.335096] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2910596 ] 00:35:43.810 [2024-07-15 23:04:28.465649] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:43.810 [2024-07-15 23:04:28.564553] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:43.810 [2024-07-15 23:04:28.585851] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:35:43.810 [2024-07-15 23:04:28.593879] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:35:43.810 [2024-07-15 23:04:28.601899] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:35:43.810 [2024-07-15 23:04:28.707877] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:35:46.342 [2024-07-15 23:04:30.921571] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:35:46.342 [2024-07-15 23:04:30.921640] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:46.342 [2024-07-15 23:04:30.921656] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:46.342 [2024-07-15 23:04:30.929589] 
vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:35:46.342 [2024-07-15 23:04:30.929608] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:46.342 [2024-07-15 23:04:30.929620] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:46.342 [2024-07-15 23:04:30.937610] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:35:46.342 [2024-07-15 23:04:30.937628] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:35:46.342 [2024-07-15 23:04:30.937639] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:46.342 [2024-07-15 23:04:30.945643] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:35:46.342 [2024-07-15 23:04:30.945661] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:35:46.342 [2024-07-15 23:04:30.945672] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:46.342 Running I/O for 1 seconds... 
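(Aside, not part of the log: the MiB/s column in the bdevperf tables in this run is just IOPS times the fixed IO size, so the tables can be sanity-checked with a one-line conversion. A minimal sketch, using rows from the bdev_verify_big_io table above:)

```python
def mib_per_s(iops: float, io_size_bytes: int) -> float:
    # MiB/s = IOPS * IO size in bytes / 2^20
    return iops * io_size_bytes / (1 << 20)

# Rows from the verify run above used 65536-byte IOs
print(round(mib_per_s(41.96, 65536), 2))   # 2.62, matches the crypto_ram row
print(round(mib_per_s(263.33, 65536), 2))  # 16.46, matches the crypto_ram2 row
```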
00:35:47.279 
00:35:47.279 Latency(us)
00:35:47.279 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:35:47.279 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:35:47.279 crypto_ram : 1.03 2003.86 7.83 0.00 0.00 63410.06 5641.79 76591.64
00:35:47.279 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:35:47.279 crypto_ram1 : 1.03 2009.39 7.85 0.00 0.00 62875.05 5613.30 71120.81
00:35:47.280 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:35:47.280 crypto_ram2 : 1.02 15426.34 60.26 0.00 0.00 8173.14 2478.97 10770.70
00:35:47.280 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:35:47.280 crypto_ram3 : 1.02 15458.47 60.38 0.00 0.00 8129.77 2464.72 8491.19
00:35:47.280 ===================================================================================================================
00:35:47.280 Total : 34898.06 136.32 0.00 0.00 14502.65 2464.72 76591.64
00:35:47.848 
00:35:47.848 real 0m4.187s
00:35:47.848 user 0m3.743s
00:35:47.848 sys 0m0.398s
00:35:47.848 23:04:32 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:35:47.848 23:04:32 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:35:47.848 ************************************
00:35:47.848 END TEST bdev_write_zeroes
00:35:47.848 ************************************
00:35:47.848 23:04:32 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:35:47.848 23:04:32 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:35:47.848 23:04:32 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:35:47.848 23:04:32
blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:47.848 23:04:32 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:47.848 ************************************ 00:35:47.848 START TEST bdev_json_nonenclosed 00:35:47.848 ************************************ 00:35:47.848 23:04:32 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:47.848 [2024-07-15 23:04:32.600654] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:35:47.848 [2024-07-15 23:04:32.600716] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2911135 ] 00:35:47.848 [2024-07-15 23:04:32.727532] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:48.108 [2024-07-15 23:04:32.825204] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:48.108 [2024-07-15 23:04:32.825275] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:35:48.108 [2024-07-15 23:04:32.825297] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:35:48.108 [2024-07-15 23:04:32.825309] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:35:48.108 00:35:48.108 real 0m0.390s 00:35:48.108 user 0m0.243s 00:35:48.108 sys 0m0.144s 00:35:48.108 23:04:32 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:35:48.108 23:04:32 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:48.108 23:04:32 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:35:48.108 ************************************ 00:35:48.108 END TEST bdev_json_nonenclosed 00:35:48.108 ************************************ 00:35:48.108 23:04:32 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234 00:35:48.108 23:04:32 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # true 00:35:48.108 23:04:32 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:48.108 23:04:32 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:35:48.108 23:04:32 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:48.108 23:04:32 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:48.367 ************************************ 00:35:48.367 START TEST bdev_json_nonarray 00:35:48.367 ************************************ 00:35:48.367 23:04:33 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:48.367 
[2024-07-15 23:04:33.079387] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:35:48.367 [2024-07-15 23:04:33.079458] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2911313 ] 00:35:48.367 [2024-07-15 23:04:33.211816] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:48.626 [2024-07-15 23:04:33.316360] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:48.626 [2024-07-15 23:04:33.316444] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:35:48.626 [2024-07-15 23:04:33.316465] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:35:48.626 [2024-07-15 23:04:33.316478] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:35:48.626 00:35:48.626 real 0m0.411s 00:35:48.626 user 0m0.235s 00:35:48.626 sys 0m0.172s 00:35:48.627 23:04:33 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:35:48.627 23:04:33 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:48.627 23:04:33 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:35:48.627 ************************************ 00:35:48.627 END TEST bdev_json_nonarray 00:35:48.627 ************************************ 00:35:48.627 23:04:33 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234 00:35:48.627 23:04:33 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # true 00:35:48.627 23:04:33 blockdev_crypto_qat -- bdev/blockdev.sh@787 -- # [[ crypto_qat == bdev ]] 00:35:48.627 23:04:33 blockdev_crypto_qat -- bdev/blockdev.sh@794 -- # [[ crypto_qat == gpt ]] 00:35:48.627 23:04:33 blockdev_crypto_qat -- bdev/blockdev.sh@798 -- # [[ 
crypto_qat == crypto_sw ]] 00:35:48.627 23:04:33 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:35:48.627 23:04:33 blockdev_crypto_qat -- bdev/blockdev.sh@811 -- # cleanup 00:35:48.627 23:04:33 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:35:48.627 23:04:33 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:35:48.627 23:04:33 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]] 00:35:48.627 23:04:33 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]] 00:35:48.627 23:04:33 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]] 00:35:48.627 23:04:33 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]] 00:35:48.627 00:35:48.627 real 1m12.871s 00:35:48.627 user 2m42.402s 00:35:48.627 sys 0m9.260s 00:35:48.627 23:04:33 blockdev_crypto_qat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:48.627 23:04:33 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:48.627 ************************************ 00:35:48.627 END TEST blockdev_crypto_qat 00:35:48.627 ************************************ 00:35:48.627 23:04:33 -- common/autotest_common.sh@1142 -- # return 0 00:35:48.627 23:04:33 -- spdk/autotest.sh@360 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:35:48.627 23:04:33 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:35:48.627 23:04:33 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:48.627 23:04:33 -- common/autotest_common.sh@10 -- # set +x 00:35:48.885 ************************************ 00:35:48.885 START TEST chaining 00:35:48.885 ************************************ 00:35:48.885 23:04:33 chaining -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 
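(Aside, not part of the log: the two negative tests above exercise exactly two structural checks on the bdevperf JSON config: the document must be enclosed in {}, and 'subsystems' must be an array. A standalone analogue of those checks, for illustration only, not SPDK's json_config.c:)

```python
import json
from typing import Optional

def validate_spdk_config(text: str) -> Optional[str]:
    """Mimic the two json_config failures seen above (json_config.c:608
    and json_config.c:614); returns the error string, or None if OK."""
    try:
        cfg = json.loads(text)
    except json.JSONDecodeError:
        cfg = None
    if not isinstance(cfg, dict):
        # nonenclosed.json case: top level is not a JSON object
        return "Invalid JSON configuration: not enclosed in {}."
    if not isinstance(cfg.get("subsystems"), list):
        # nonarray.json case: 'subsystems' exists but is not an array
        return "Invalid JSON configuration: 'subsystems' should be an array."
    return None

print(validate_spdk_config('[1, 2]'))              # not enclosed in {}
print(validate_spdk_config('{"subsystems": {}}'))  # 'subsystems' not an array
print(validate_spdk_config('{"subsystems": []}'))  # None (accepted)
```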
00:35:48.885 * Looking for test storage... 00:35:48.885 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:35:48.885 23:04:33 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:35:48.885 23:04:33 chaining -- nvmf/common.sh@7 -- # uname -s 00:35:48.885 23:04:33 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:35:48.885 23:04:33 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:35:48.885 23:04:33 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:35:48.885 23:04:33 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:35:48.885 23:04:33 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:35:48.885 23:04:33 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:35:48.885 23:04:33 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:35:48.885 23:04:33 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:35:48.885 23:04:33 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:35:48.885 23:04:33 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:35:48.885 23:04:33 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:35:48.885 23:04:33 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:35:48.885 23:04:33 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:35:48.885 23:04:33 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:35:48.885 23:04:33 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:35:48.885 23:04:33 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:35:48.885 23:04:33 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:35:48.885 23:04:33 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:35:48.885 
23:04:33 chaining -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:35:48.885 23:04:33 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:35:48.885 23:04:33 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:48.885 23:04:33 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:48.885 23:04:33 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:48.885 23:04:33 chaining -- paths/export.sh@5 -- # export PATH 00:35:48.886 23:04:33 chaining -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:48.886 23:04:33 chaining -- nvmf/common.sh@47 -- # : 0 00:35:48.886 23:04:33 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:35:48.886 23:04:33 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:35:48.886 23:04:33 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:35:48.886 23:04:33 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:35:48.886 23:04:33 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:35:48.886 23:04:33 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:35:48.886 23:04:33 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:35:48.886 23:04:33 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:35:48.886 23:04:33 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:35:48.886 23:04:33 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:35:48.886 23:04:33 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:35:48.886 23:04:33 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:35:48.886 23:04:33 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:35:48.886 23:04:33 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:35:48.886 23:04:33 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:35:48.886 23:04:33 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:35:48.886 23:04:33 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:35:48.886 23:04:33 chaining -- nvmf/common.sh@410 -- # local 
-g is_hw=no 00:35:48.886 23:04:33 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:35:48.886 23:04:33 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:48.886 23:04:33 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:48.886 23:04:33 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:48.886 23:04:33 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:35:48.886 23:04:33 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:35:48.886 23:04:33 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:35:48.886 23:04:33 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@296 -- # e810=() 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@297 -- # x722=() 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@298 -- # mlx=() 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@302 -- # 
e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@336 -- # return 1 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:35:57.007 WARNING: No supported devices were found, fallback requested for tcp test 00:35:57.007 23:04:41 chaining -- 
nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:35:57.007 Cannot find device "nvmf_tgt_br" 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@155 -- # true 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:35:57.007 Cannot find device "nvmf_tgt_br2" 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@156 -- # true 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:35:57.007 Cannot find device "nvmf_tgt_br" 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@158 -- # 
true 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:35:57.007 Cannot find device "nvmf_tgt_br2" 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@159 -- # true 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:35:57.007 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@162 -- # true 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:35:57.007 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@163 -- # true 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:35:57.007 23:04:41 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:35:57.266 23:04:41 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:35:57.266 23:04:41 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:35:57.266 23:04:41 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:35:57.266 23:04:41 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:35:57.266 23:04:41 chaining -- nvmf/common.sh@183 -- # ip link set 
nvmf_init_if up
00:35:57.266 23:04:41 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up
00:35:57.266 23:04:41 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up
00:35:57.266 23:04:41 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up
00:35:57.266 23:04:41 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up
00:35:57.266 23:04:42 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up
00:35:57.266 23:04:42 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up
00:35:57.266 23:04:42 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge
00:35:57.266 23:04:42 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up
00:35:57.266 23:04:42 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br
00:35:57.266 23:04:42 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br
00:35:57.545 23:04:42 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br
00:35:57.545 23:04:42 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT
00:35:57.545 23:04:42 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT
00:35:57.545 23:04:42 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2
00:35:57.545 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data.
00:35:57.545 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.098 ms
00:35:57.545 
00:35:57.545 --- 10.0.0.2 ping statistics ---
00:35:57.545 1 packets transmitted, 1 received, 0% packet loss, time 0ms
00:35:57.545 rtt min/avg/max/mdev = 0.098/0.098/0.098/0.000 ms
00:35:57.545 23:04:42 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3
00:35:57.545 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data.
00:35:57.545 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.075 ms
00:35:57.545 
00:35:57.545 --- 10.0.0.3 ping statistics ---
00:35:57.545 1 packets transmitted, 1 received, 0% packet loss, time 0ms
00:35:57.545 rtt min/avg/max/mdev = 0.075/0.075/0.075/0.000 ms
00:35:57.545 23:04:42 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1
00:35:57.545 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data.
00:35:57.545 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.039 ms
00:35:57.545 
00:35:57.545 --- 10.0.0.1 ping statistics ---
00:35:57.545 1 packets transmitted, 1 received, 0% packet loss, time 0ms
00:35:57.545 rtt min/avg/max/mdev = 0.039/0.039/0.039/0.000 ms
00:35:57.545 23:04:42 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}")
00:35:57.545 23:04:42 chaining -- nvmf/common.sh@433 -- # return 0
00:35:57.545 23:04:42 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']'
00:35:57.545 23:04:42 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp'
00:35:57.545 23:04:42 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]]
00:35:57.545 23:04:42 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]]
00:35:57.545 23:04:42 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o'
00:35:57.545 23:04:42 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']'
00:35:57.545 23:04:42 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp
00:35:57.805 23:04:42 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2
00:35:57.805 23:04:42 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt
00:35:57.805 23:04:42 chaining -- common/autotest_common.sh@722 -- # xtrace_disable
00:35:57.805 23:04:42 chaining -- common/autotest_common.sh@10 -- # set +x
00:35:57.805 23:04:42 chaining -- nvmf/common.sh@481 -- # nvmfpid=2915013
00:35:57.805 23:04:42 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0
-e 0xFFFF -m 0x2 00:35:57.805 23:04:42 chaining -- nvmf/common.sh@482 -- # waitforlisten 2915013 00:35:57.805 23:04:42 chaining -- common/autotest_common.sh@829 -- # '[' -z 2915013 ']' 00:35:57.805 23:04:42 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:57.805 23:04:42 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:57.805 23:04:42 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:57.805 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:57.805 23:04:42 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:57.805 23:04:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:57.805 [2024-07-15 23:04:42.548159] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:35:57.805 [2024-07-15 23:04:42.548232] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:35:57.805 [2024-07-15 23:04:42.690121] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:58.100 [2024-07-15 23:04:42.811891] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:35:58.100 [2024-07-15 23:04:42.811955] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:35:58.100 [2024-07-15 23:04:42.811975] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:35:58.100 [2024-07-15 23:04:42.811992] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:35:58.100 [2024-07-15 23:04:42.812005] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:35:58.100 [2024-07-15 23:04:42.812040] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:58.667 23:04:43 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:58.667 23:04:43 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:58.667 23:04:43 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:35:58.667 23:04:43 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:35:58.667 23:04:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:58.667 23:04:43 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:35:58.667 23:04:43 chaining -- bdev/chaining.sh@69 -- # mktemp 00:35:58.667 23:04:43 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.V5M4PtrmcU 00:35:58.667 23:04:43 chaining -- bdev/chaining.sh@69 -- # mktemp 00:35:58.667 23:04:43 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.MXKcb2FbLy 00:35:58.667 23:04:43 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:35:58.667 23:04:43 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:35:58.667 23:04:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:58.667 23:04:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:58.667 malloc0 00:35:58.667 true 00:35:58.925 true 00:35:58.925 [2024-07-15 23:04:43.580376] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:58.925 crypto0 00:35:58.925 [2024-07-15 23:04:43.588406] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:35:58.925 crypto1 00:35:58.925 [2024-07-15 23:04:43.596559] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:35:58.925 [2024-07-15 23:04:43.612824] tcp.c: 981:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:35:58.925 23:04:43 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@85 -- # 
update_stats 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:58.925 23:04:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:58.925 23:04:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:58.925 23:04:43 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:58.925 23:04:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:58.925 23:04:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:58.925 23:04:43 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:35:58.925 23:04:43 chaining -- 
bdev/chaining.sh@53 -- # get_stat executed decrypt 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:58.925 23:04:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:58.925 23:04:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:58.925 23:04:43 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:58.925 23:04:43 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:58.925 23:04:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:58.925 23:04:43 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@88 -- # 
dd if=/dev/urandom of=/tmp/tmp.V5M4PtrmcU bs=1K count=64 00:35:58.925 64+0 records in 00:35:58.925 64+0 records out 00:35:58.925 65536 bytes (66 kB, 64 KiB) copied, 0.00105868 s, 61.9 MB/s 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.V5M4PtrmcU --ob Nvme0n1 --bs 65536 --count 1 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@25 -- # local config 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:58.925 23:04:43 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:58.925 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:59.191 23:04:43 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:59.191 "subsystems": [ 00:35:59.191 { 00:35:59.191 "subsystem": "bdev", 00:35:59.191 "config": [ 00:35:59.191 { 00:35:59.191 "method": "bdev_nvme_attach_controller", 00:35:59.191 "params": { 00:35:59.191 "trtype": "tcp", 00:35:59.191 "adrfam": "IPv4", 00:35:59.191 "name": "Nvme0", 00:35:59.191 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:59.191 "traddr": "10.0.0.2", 00:35:59.191 "trsvcid": "4420" 00:35:59.191 } 00:35:59.191 }, 00:35:59.191 { 00:35:59.191 "method": "bdev_set_options", 00:35:59.191 "params": { 00:35:59.191 "bdev_auto_examine": false 00:35:59.191 } 00:35:59.191 } 00:35:59.191 ] 00:35:59.191 } 00:35:59.191 ] 00:35:59.191 }' 00:35:59.191 23:04:43 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.V5M4PtrmcU --ob Nvme0n1 --bs 65536 --count 1 00:35:59.191 23:04:43 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:59.191 "subsystems": [ 00:35:59.191 { 00:35:59.191 "subsystem": "bdev", 00:35:59.191 "config": [ 00:35:59.191 { 00:35:59.191 "method": "bdev_nvme_attach_controller", 00:35:59.191 "params": { 
00:35:59.191 "trtype": "tcp", 00:35:59.191 "adrfam": "IPv4", 00:35:59.191 "name": "Nvme0", 00:35:59.191 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:59.191 "traddr": "10.0.0.2", 00:35:59.191 "trsvcid": "4420" 00:35:59.191 } 00:35:59.191 }, 00:35:59.191 { 00:35:59.191 "method": "bdev_set_options", 00:35:59.191 "params": { 00:35:59.191 "bdev_auto_examine": false 00:35:59.191 } 00:35:59.191 } 00:35:59.191 ] 00:35:59.191 } 00:35:59.191 ] 00:35:59.191 }' 00:35:59.191 [2024-07-15 23:04:43.927192] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:35:59.191 [2024-07-15 23:04:43.927258] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2915235 ] 00:35:59.191 [2024-07-15 23:04:44.058202] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:59.455 [2024-07-15 23:04:44.158325] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:59.714  Copying: 64/64 [kB] (average 12 MBps) 00:35:59.714 00:35:59.714 23:04:44 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:35:59.714 23:04:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:59.714 23:04:44 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:59.714 23:04:44 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:59.714 23:04:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:59.714 23:04:44 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:59.714 23:04:44 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:59.714 23:04:44 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:59.714 23:04:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:59.714 23:04:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:59.714 23:04:44 chaining -- common/autotest_common.sh@587 -- 
# [[ 0 == 0 ]] 00:35:59.714 23:04:44 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:35:59.714 23:04:44 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:35:59.714 23:04:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:59.714 23:04:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:59.714 23:04:44 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:59.972 23:04:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:59.972 23:04:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:59.972 23:04:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:59.972 23:04:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:59.972 23:04:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:59.972 23:04:44 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:59.972 23:04:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:59.972 23:04:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:59.972 23:04:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@96 -- # update_stats 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:59.972 23:04:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:59.972 23:04:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:59.972 
23:04:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:59.972 23:04:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:59.972 23:04:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:59.972 23:04:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:59.972 23:04:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:59.972 23:04:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:59.972 23:04:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:00.231 23:04:44 
chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:00.231 23:04:44 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:36:00.231 23:04:44 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:36:00.231 23:04:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:00.231 23:04:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:00.231 23:04:44 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:00.231 23:04:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:00.231 23:04:44 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:00.231 23:04:44 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:00.231 23:04:44 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:00.231 23:04:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:00.231 23:04:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:00.231 23:04:44 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:00.231 23:04:44 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:36:00.231 23:04:44 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.MXKcb2FbLy --ib Nvme0n1 --bs 65536 --count 1 00:36:00.231 23:04:44 chaining -- bdev/chaining.sh@25 -- # local config 00:36:00.231 23:04:44 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:36:00.231 23:04:44 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:36:00.231 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:36:00.231 23:04:45 chaining -- bdev/chaining.sh@31 -- # config='{ 00:36:00.231 "subsystems": [ 00:36:00.231 { 00:36:00.231 "subsystem": "bdev", 00:36:00.231 "config": [ 00:36:00.231 { 00:36:00.231 "method": "bdev_nvme_attach_controller", 00:36:00.231 
"params": { 00:36:00.231 "trtype": "tcp", 00:36:00.231 "adrfam": "IPv4", 00:36:00.231 "name": "Nvme0", 00:36:00.231 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:00.231 "traddr": "10.0.0.2", 00:36:00.231 "trsvcid": "4420" 00:36:00.231 } 00:36:00.231 }, 00:36:00.231 { 00:36:00.231 "method": "bdev_set_options", 00:36:00.231 "params": { 00:36:00.231 "bdev_auto_examine": false 00:36:00.231 } 00:36:00.231 } 00:36:00.231 ] 00:36:00.231 } 00:36:00.231 ] 00:36:00.231 }' 00:36:00.231 23:04:45 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.MXKcb2FbLy --ib Nvme0n1 --bs 65536 --count 1 00:36:00.231 23:04:45 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:36:00.231 "subsystems": [ 00:36:00.231 { 00:36:00.231 "subsystem": "bdev", 00:36:00.231 "config": [ 00:36:00.231 { 00:36:00.231 "method": "bdev_nvme_attach_controller", 00:36:00.231 "params": { 00:36:00.231 "trtype": "tcp", 00:36:00.231 "adrfam": "IPv4", 00:36:00.231 "name": "Nvme0", 00:36:00.231 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:00.231 "traddr": "10.0.0.2", 00:36:00.231 "trsvcid": "4420" 00:36:00.231 } 00:36:00.231 }, 00:36:00.231 { 00:36:00.231 "method": "bdev_set_options", 00:36:00.231 "params": { 00:36:00.231 "bdev_auto_examine": false 00:36:00.231 } 00:36:00.231 } 00:36:00.231 ] 00:36:00.231 } 00:36:00.231 ] 00:36:00.231 }' 00:36:00.231 [2024-07-15 23:04:45.064974] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
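The `jq` filter at chaining.sh@32 is what turns the `gen_nvme.sh` output into the config echoed above: `.config[.config | length]` addresses the first index past the end of the array, and `|=` assigns the new method object there, i.e. it appends `bdev_set_options` with `bdev_auto_examine: false`. A runnable sketch, using a minimal stand-in JSON for the `gen_nvme.sh` output:

```shell
# Minimal stand-in for gen_nvme.sh --mode=remote --json-with-subsystems
# output: one bdev subsystem with a single attach-controller entry.
base='{"subsystems":[{"subsystem":"bdev","config":[
  {"method":"bdev_nvme_attach_controller",
   "params":{"trtype":"tcp","adrfam":"IPv4","name":"Nvme0",
             "subnqn":"nqn.2016-06.io.spdk:cnode0",
             "traddr":"10.0.0.2","trsvcid":"4420"}}]}]}'

# Same filter as chaining.sh@32: assigning at index (length) appends.
config=$(echo "$base" | jq '.subsystems[0].config[.subsystems[0].config | length] |=
  {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}')

# The appended entry disables auto-examine, so spdk_dd only opens the
# bdevs named explicitly in the config it reads from /dev/fd/62.
echo "$config" | jq -r '.subsystems[0].config[1].method'   # bdev_set_options
echo "$config" | jq '.subsystems[0].config | length'       # 2
```

Feeding the result to `spdk_dd -c /dev/fd/62` (via `echo "$config"`) avoids writing a temporary config file, as the traces above show.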
00:36:00.231 [2024-07-15 23:04:45.065045] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2915442 ] 00:36:00.490 [2024-07-15 23:04:45.196606] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:00.490 [2024-07-15 23:04:45.300401] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:01.009  Copying: 64/64 [kB] (average 15 MBps) 00:36:01.009 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:01.009 23:04:45 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:01.009 23:04:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:01.009 23:04:45 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@40 -- # [[ -z 
encrypt ]] 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:01.009 23:04:45 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:01.009 23:04:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:01.009 23:04:45 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:01.009 23:04:45 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:01.009 23:04:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:01.009 23:04:45 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@40 -- # 
[[ -z copy ]] 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:01.009 23:04:45 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:01.009 23:04:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:01.009 23:04:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:01.268 23:04:45 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:01.268 23:04:45 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:36:01.268 23:04:45 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.V5M4PtrmcU /tmp/tmp.MXKcb2FbLy 00:36:01.268 23:04:45 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:36:01.268 23:04:45 chaining -- bdev/chaining.sh@25 -- # local config 00:36:01.268 23:04:45 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:36:01.268 23:04:45 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:36:01.268 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:36:01.268 23:04:46 chaining -- bdev/chaining.sh@31 -- # config='{ 00:36:01.268 "subsystems": [ 00:36:01.268 { 00:36:01.268 "subsystem": "bdev", 00:36:01.268 "config": [ 00:36:01.268 { 00:36:01.268 "method": "bdev_nvme_attach_controller", 00:36:01.268 "params": { 00:36:01.268 "trtype": "tcp", 00:36:01.268 "adrfam": "IPv4", 00:36:01.268 "name": "Nvme0", 00:36:01.268 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:01.268 "traddr": "10.0.0.2", 00:36:01.268 "trsvcid": "4420" 00:36:01.268 } 00:36:01.268 }, 00:36:01.268 { 00:36:01.268 "method": "bdev_set_options", 00:36:01.268 "params": { 00:36:01.268 "bdev_auto_examine": false 00:36:01.268 } 00:36:01.268 } 00:36:01.268 ] 00:36:01.268 } 00:36:01.268 ] 00:36:01.268 }' 00:36:01.268 
23:04:46 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:36:01.268 23:04:46 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:36:01.268 "subsystems": [ 00:36:01.268 { 00:36:01.268 "subsystem": "bdev", 00:36:01.268 "config": [ 00:36:01.268 { 00:36:01.268 "method": "bdev_nvme_attach_controller", 00:36:01.268 "params": { 00:36:01.268 "trtype": "tcp", 00:36:01.268 "adrfam": "IPv4", 00:36:01.268 "name": "Nvme0", 00:36:01.268 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:01.268 "traddr": "10.0.0.2", 00:36:01.268 "trsvcid": "4420" 00:36:01.268 } 00:36:01.268 }, 00:36:01.268 { 00:36:01.268 "method": "bdev_set_options", 00:36:01.268 "params": { 00:36:01.268 "bdev_auto_examine": false 00:36:01.268 } 00:36:01.268 } 00:36:01.268 ] 00:36:01.268 } 00:36:01.268 ] 00:36:01.268 }' 00:36:01.268 [2024-07-15 23:04:46.069489] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
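The repeated `get_stat`/`update_stats` pattern in the traces reads `rpc_cmd accel_get_stats` and pulls either the top-level `sequence_executed` counter or one opcode's `executed` count with `jq`. The sketch below reproduces both queries against a canned stats document whose shape matches the log's jq filters (the numbers are illustrative, standing in for the live RPC):

```shell
# Canned accel_get_stats output: a top-level sequence counter plus a
# per-opcode operations array, matching the fields the log queries.
stats='{"sequence_executed": 14,
        "operations": [
          {"opcode": "encrypt", "executed": 2},
          {"opcode": "decrypt", "executed": 14},
          {"opcode": "copy",    "executed": 4}]}'

# chaining.sh@41: no opcode given, read the sequence counter directly.
seq_done=$(echo "$stats" | jq -r .sequence_executed)

# chaining.sh@44: select one opcode's entry and take its executed count.
dec_done=$(echo "$stats" | jq -r \
  '.operations[] | select(.opcode == "decrypt").executed')

# The test then compares fresh counters against the cached stats[] array,
# e.g. chaining.sh@102's check: (( 14 == stats[decrypt_executed] + 2 )).
echo "$seq_done $dec_done"   # 14 14
```

Caching the counters in `update_stats` before each `spdk_dd` run is what lets the assertions verify deltas (one more sequence, two more encrypts) rather than absolute values.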
00:36:01.268 [2024-07-15 23:04:46.069560] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2915639 ] 00:36:01.526 [2024-07-15 23:04:46.201665] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:01.526 [2024-07-15 23:04:46.308254] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:02.044  Copying: 64/64 [kB] (average 10 MBps) 00:36:02.044 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@106 -- # update_stats 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:02.044 23:04:46 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:02.044 23:04:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:02.044 23:04:46 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 
00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:02.044 23:04:46 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:02.044 23:04:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:02.044 23:04:46 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:02.044 23:04:46 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:02.044 23:04:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:02.044 23:04:46 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:02.044 
23:04:46 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:02.044 23:04:46 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:02.044 23:04:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:02.044 23:04:46 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.V5M4PtrmcU --ob Nvme0n1 --bs 4096 --count 16 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@25 -- # local config 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:36:02.044 23:04:46 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:36:02.044 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:36:02.303 23:04:46 chaining -- bdev/chaining.sh@31 -- # config='{ 00:36:02.303 "subsystems": [ 00:36:02.303 { 00:36:02.303 "subsystem": "bdev", 00:36:02.303 "config": [ 00:36:02.303 { 00:36:02.303 "method": "bdev_nvme_attach_controller", 00:36:02.303 "params": { 00:36:02.303 "trtype": "tcp", 00:36:02.303 "adrfam": "IPv4", 00:36:02.303 "name": "Nvme0", 00:36:02.303 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:02.303 "traddr": "10.0.0.2", 00:36:02.303 "trsvcid": "4420" 00:36:02.303 } 00:36:02.303 }, 00:36:02.303 { 00:36:02.303 "method": "bdev_set_options", 00:36:02.303 "params": { 00:36:02.303 "bdev_auto_examine": false 00:36:02.303 } 00:36:02.303 } 00:36:02.303 ] 00:36:02.303 } 00:36:02.303 ] 00:36:02.303 }' 00:36:02.303 23:04:46 chaining -- bdev/chaining.sh@33 -- # echo '{ 
00:36:02.303 "subsystems": [ 00:36:02.303 { 00:36:02.303 "subsystem": "bdev", 00:36:02.303 "config": [ 00:36:02.303 { 00:36:02.303 "method": "bdev_nvme_attach_controller", 00:36:02.303 "params": { 00:36:02.303 "trtype": "tcp", 00:36:02.303 "adrfam": "IPv4", 00:36:02.303 "name": "Nvme0", 00:36:02.303 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:02.303 "traddr": "10.0.0.2", 00:36:02.303 "trsvcid": "4420" 00:36:02.303 } 00:36:02.303 }, 00:36:02.303 { 00:36:02.303 "method": "bdev_set_options", 00:36:02.303 "params": { 00:36:02.303 "bdev_auto_examine": false 00:36:02.303 } 00:36:02.303 } 00:36:02.303 ] 00:36:02.303 } 00:36:02.303 ] 00:36:02.303 }' 00:36:02.303 23:04:46 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.V5M4PtrmcU --ob Nvme0n1 --bs 4096 --count 16 00:36:02.303 [2024-07-15 23:04:47.047357] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:36:02.303 [2024-07-15 23:04:47.047431] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2915784 ] 00:36:02.303 [2024-07-15 23:04:47.174805] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:02.562 [2024-07-15 23:04:47.281907] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:02.821  Copying: 64/64 [kB] (average 9142 kBps) 00:36:02.821 00:36:02.821 23:04:47 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:36:02.821 23:04:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:02.821 23:04:47 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:02.821 23:04:47 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:02.821 23:04:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:02.821 23:04:47 chaining -- bdev/chaining.sh@40 -- # [[ -z 
'' ]] 00:36:02.821 23:04:47 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:02.821 23:04:47 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:02.821 23:04:47 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:02.821 23:04:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:03.081 23:04:47 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:03.081 23:04:47 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:03.081 23:04:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:03.081 23:04:47 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 
00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:03.081 23:04:47 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:03.081 23:04:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:03.081 23:04:47 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] )) 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:03.081 23:04:47 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:03.081 23:04:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:03.081 23:04:47 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@114 -- # update_stats 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 
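The `get_stat`/`update_stats` helpers traced above read SPDK's accel framework counters over JSON-RPC (`accel_get_stats`) and extract one field with jq. A minimal stand-alone sketch of that pattern, with `rpc_cmd` mocked to return canned stats JSON — the counter values and the mock are illustrative only, not part of the SPDK scripts:

```shell
#!/usr/bin/env bash
# Sketch of the get_stat pattern from bdev/chaining.sh@37-44.
# rpc_cmd is mocked here; the real test calls SPDK's rpc.py client.
set -euo pipefail

rpc_cmd() {  # mock: canned accel_get_stats output (illustrative numbers)
  echo '{"sequence_executed": 31,
         "operations": [
           {"opcode": "encrypt", "executed": 36},
           {"opcode": "decrypt", "executed": 14},
           {"opcode": "copy",    "executed": 4}
         ]}'
}

declare -A stats

get_stat() {  # usage: get_stat <event> [opcode]
  local event=$1 opcode=${2:-}
  if [[ -z "$opcode" ]]; then
    # top-level counter, e.g. sequence_executed
    rpc_cmd accel_get_stats | jq -r ".$event"
  else
    # per-opcode counter, e.g. executed for "encrypt"
    rpc_cmd accel_get_stats |
      jq -r ".operations[] | select(.opcode == \"$opcode\").executed"
  fi
}

stats[sequence_executed]=$(get_stat sequence_executed)
stats[encrypt_executed]=$(get_stat executed encrypt)
echo "seq=${stats[sequence_executed]} enc=${stats[encrypt_executed]}"
```

The test then asserts deltas against the snapshot, e.g. `(( new_seq == stats[sequence_executed] + 16 ))` after a 16-block copy, which is exactly the `chaining.sh@110`-style checks visible in the trace.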
00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:03.081 23:04:47 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:03.081 23:04:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:03.081 23:04:47 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:03.081 23:04:47 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:03.081 23:04:47 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:03.081 23:04:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:03.081 23:04:47 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:03.340 23:04:48 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:36:03.340 23:04:48 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:36:03.340 23:04:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:03.340 23:04:48 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:03.340 23:04:48 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:03.340 23:04:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:03.340 23:04:48 chaining -- 
bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:03.340 23:04:48 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:03.340 23:04:48 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:03.340 23:04:48 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:03.340 23:04:48 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:03.340 23:04:48 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:03.340 23:04:48 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:36:03.340 23:04:48 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:36:03.340 23:04:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:03.340 23:04:48 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:03.340 23:04:48 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:03.340 23:04:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:03.340 23:04:48 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:03.340 23:04:48 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:03.340 23:04:48 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:03.340 23:04:48 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:03.340 23:04:48 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:03.341 23:04:48 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:03.341 23:04:48 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:36:03.341 23:04:48 chaining -- bdev/chaining.sh@117 -- # : 00:36:03.341 23:04:48 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.MXKcb2FbLy --ib Nvme0n1 --bs 4096 --count 16 00:36:03.341 23:04:48 chaining -- bdev/chaining.sh@25 -- # local config 00:36:03.341 23:04:48 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems 
--trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:36:03.341 23:04:48 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:36:03.341 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:36:03.341 23:04:48 chaining -- bdev/chaining.sh@31 -- # config='{ 00:36:03.341 "subsystems": [ 00:36:03.341 { 00:36:03.341 "subsystem": "bdev", 00:36:03.341 "config": [ 00:36:03.341 { 00:36:03.341 "method": "bdev_nvme_attach_controller", 00:36:03.341 "params": { 00:36:03.341 "trtype": "tcp", 00:36:03.341 "adrfam": "IPv4", 00:36:03.341 "name": "Nvme0", 00:36:03.341 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:03.341 "traddr": "10.0.0.2", 00:36:03.341 "trsvcid": "4420" 00:36:03.341 } 00:36:03.341 }, 00:36:03.341 { 00:36:03.341 "method": "bdev_set_options", 00:36:03.341 "params": { 00:36:03.341 "bdev_auto_examine": false 00:36:03.341 } 00:36:03.341 } 00:36:03.341 ] 00:36:03.341 } 00:36:03.341 ] 00:36:03.341 }' 00:36:03.341 23:04:48 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.MXKcb2FbLy --ib Nvme0n1 --bs 4096 --count 16 00:36:03.341 23:04:48 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:36:03.341 "subsystems": [ 00:36:03.341 { 00:36:03.341 "subsystem": "bdev", 00:36:03.341 "config": [ 00:36:03.341 { 00:36:03.341 "method": "bdev_nvme_attach_controller", 00:36:03.341 "params": { 00:36:03.341 "trtype": "tcp", 00:36:03.341 "adrfam": "IPv4", 00:36:03.341 "name": "Nvme0", 00:36:03.341 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:36:03.341 "traddr": "10.0.0.2", 00:36:03.341 "trsvcid": "4420" 00:36:03.341 } 00:36:03.341 }, 00:36:03.341 { 00:36:03.341 "method": "bdev_set_options", 00:36:03.341 "params": { 00:36:03.341 "bdev_auto_examine": false 00:36:03.341 } 00:36:03.341 } 00:36:03.341 ] 00:36:03.341 } 00:36:03.341 ] 00:36:03.341 }' 00:36:03.341 [2024-07-15 23:04:48.225027] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 
initialization... 00:36:03.341 [2024-07-15 23:04:48.225097] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2915897 ] 00:36:03.601 [2024-07-15 23:04:48.356791] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:03.601 [2024-07-15 23:04:48.458810] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:04.120  Copying: 64/64 [kB] (average 1361 kBps) 00:36:04.120 00:36:04.120 23:04:48 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:36:04.120 23:04:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:04.120 23:04:48 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:04.120 23:04:48 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:04.120 23:04:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:04.120 23:04:48 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:04.120 23:04:48 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:36:04.120 23:04:48 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:04.120 23:04:48 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:04.120 23:04:48 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:04.120 23:04:48 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:04.120 23:04:48 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:36:04.120 23:04:48 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:36:04.120 23:04:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:04.120 23:04:48 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:04.120 23:04:48 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:04.120 23:04:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:04.120 23:04:48 chaining -- 
bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:04.120 23:04:48 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:04.120 23:04:48 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:04.120 23:04:48 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:04.120 23:04:48 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:04.120 23:04:49 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:04.380 23:04:49 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:36:04.380 23:04:49 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:36:04.380 23:04:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:04.380 23:04:49 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:04.380 23:04:49 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:04.380 23:04:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:04.380 23:04:49 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:04.380 23:04:49 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:04.380 23:04:49 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:04.380 23:04:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:04.380 23:04:49 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:04.380 23:04:49 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:04.380 23:04:49 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:36:04.380 23:04:49 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:36:04.380 23:04:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:04.380 23:04:49 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:04.380 23:04:49 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:36:04.380 23:04:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:36:04.380 23:04:49 
chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:36:04.380 23:04:49 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:36:04.381 23:04:49 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:36:04.381 23:04:49 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:04.381 23:04:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:04.381 23:04:49 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:04.381 23:04:49 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:36:04.381 23:04:49 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.V5M4PtrmcU /tmp/tmp.MXKcb2FbLy 00:36:04.381 23:04:49 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:36:04.381 23:04:49 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:36:04.381 23:04:49 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.V5M4PtrmcU /tmp/tmp.MXKcb2FbLy 00:36:04.381 23:04:49 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:36:04.381 23:04:49 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:36:04.381 23:04:49 chaining -- nvmf/common.sh@117 -- # sync 00:36:04.381 23:04:49 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:36:04.381 23:04:49 chaining -- nvmf/common.sh@120 -- # set +e 00:36:04.381 23:04:49 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:36:04.381 23:04:49 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:36:04.381 rmmod nvme_tcp 00:36:04.381 rmmod nvme_fabrics 00:36:04.381 rmmod nvme_keyring 00:36:04.381 23:04:49 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:36:04.381 23:04:49 chaining -- nvmf/common.sh@124 -- # set -e 00:36:04.381 23:04:49 chaining -- nvmf/common.sh@125 -- # return 0 00:36:04.381 23:04:49 chaining -- nvmf/common.sh@489 -- # '[' -n 2915013 ']' 00:36:04.381 23:04:49 chaining -- nvmf/common.sh@490 -- # killprocess 2915013 00:36:04.381 23:04:49 chaining -- common/autotest_common.sh@948 -- # '[' -z 
2915013 ']' 00:36:04.381 23:04:49 chaining -- common/autotest_common.sh@952 -- # kill -0 2915013 00:36:04.381 23:04:49 chaining -- common/autotest_common.sh@953 -- # uname 00:36:04.381 23:04:49 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:04.381 23:04:49 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2915013 00:36:04.381 23:04:49 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:36:04.381 23:04:49 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:36:04.381 23:04:49 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2915013' 00:36:04.381 killing process with pid 2915013 00:36:04.381 23:04:49 chaining -- common/autotest_common.sh@967 -- # kill 2915013 00:36:04.381 23:04:49 chaining -- common/autotest_common.sh@972 -- # wait 2915013 00:36:04.950 23:04:49 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:36:04.950 23:04:49 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:36:04.950 23:04:49 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:36:04.950 23:04:49 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:36:04.950 23:04:49 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:36:04.950 23:04:49 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:36:04.950 23:04:49 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:36:04.950 23:04:49 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:36:04.950 23:04:49 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:36:04.950 23:04:49 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:36:04.950 23:04:49 chaining -- bdev/chaining.sh@132 -- # bperfpid=2916133 00:36:04.950 23:04:49 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 
--wait-for-rpc -z 00:36:04.950 23:04:49 chaining -- bdev/chaining.sh@134 -- # waitforlisten 2916133 00:36:04.950 23:04:49 chaining -- common/autotest_common.sh@829 -- # '[' -z 2916133 ']' 00:36:04.950 23:04:49 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:36:04.950 23:04:49 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:36:04.950 23:04:49 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:36:04.950 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:36:04.950 23:04:49 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:36:04.950 23:04:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:04.950 [2024-07-15 23:04:49.672228] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:36:04.950 [2024-07-15 23:04:49.672302] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2916133 ] 00:36:04.950 [2024-07-15 23:04:49.802691] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:05.209 [2024-07-15 23:04:49.905109] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:05.778 23:04:50 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:36:05.778 23:04:50 chaining -- common/autotest_common.sh@862 -- # return 0 00:36:05.778 23:04:50 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:36:05.778 23:04:50 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:05.778 23:04:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:06.037 malloc0 00:36:06.037 true 00:36:06.037 true 00:36:06.037 [2024-07-15 23:04:50.746278] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 
00:36:06.037 crypto0 00:36:06.037 [2024-07-15 23:04:50.754303] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:36:06.037 crypto1 00:36:06.037 23:04:50 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:06.037 23:04:50 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:36:06.037 Running I/O for 5 seconds... 00:36:11.309 00:36:11.309 Latency(us) 00:36:11.309 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:11.309 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:36:11.309 Verification LBA range: start 0x0 length 0x2000 00:36:11.309 crypto1 : 5.01 11404.85 44.55 0.00 0.00 22379.82 1246.61 14246.96 00:36:11.309 =================================================================================================================== 00:36:11.309 Total : 11404.85 44.55 0.00 0.00 22379.82 1246.61 14246.96 00:36:11.309 0 00:36:11.309 23:04:55 chaining -- bdev/chaining.sh@146 -- # killprocess 2916133 00:36:11.309 23:04:55 chaining -- common/autotest_common.sh@948 -- # '[' -z 2916133 ']' 00:36:11.309 23:04:55 chaining -- common/autotest_common.sh@952 -- # kill -0 2916133 00:36:11.309 23:04:55 chaining -- common/autotest_common.sh@953 -- # uname 00:36:11.309 23:04:55 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:11.309 23:04:55 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2916133 00:36:11.309 23:04:55 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:36:11.309 23:04:55 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:36:11.309 23:04:55 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2916133' 00:36:11.309 killing process with pid 2916133 00:36:11.309 23:04:55 chaining -- common/autotest_common.sh@967 -- # kill 2916133 00:36:11.309 Received shutdown signal, test time 
was about 5.000000 seconds 00:36:11.309 00:36:11.309 Latency(us) 00:36:11.309 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:11.309 =================================================================================================================== 00:36:11.309 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:36:11.309 23:04:55 chaining -- common/autotest_common.sh@972 -- # wait 2916133 00:36:11.309 23:04:56 chaining -- bdev/chaining.sh@152 -- # bperfpid=2916963 00:36:11.309 23:04:56 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:36:11.309 23:04:56 chaining -- bdev/chaining.sh@154 -- # waitforlisten 2916963 00:36:11.309 23:04:56 chaining -- common/autotest_common.sh@829 -- # '[' -z 2916963 ']' 00:36:11.309 23:04:56 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:36:11.309 23:04:56 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:36:11.309 23:04:56 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:36:11.309 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:36:11.309 23:04:56 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:36:11.309 23:04:56 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:11.569 [2024-07-15 23:04:56.255792] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:36:11.569 [2024-07-15 23:04:56.255863] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2916963 ] 00:36:11.569 [2024-07-15 23:04:56.382854] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:11.827 [2024-07-15 23:04:56.489820] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:12.396 23:04:57 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:36:12.396 23:04:57 chaining -- common/autotest_common.sh@862 -- # return 0 00:36:12.396 23:04:57 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:36:12.396 23:04:57 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:12.396 23:04:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:12.396 malloc0 00:36:12.656 true 00:36:12.656 true 00:36:12.656 [2024-07-15 23:04:57.317012] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:36:12.656 [2024-07-15 23:04:57.317059] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:36:12.656 [2024-07-15 23:04:57.317081] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x91e730 00:36:12.656 [2024-07-15 23:04:57.317093] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:36:12.656 [2024-07-15 23:04:57.318177] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:36:12.656 [2024-07-15 23:04:57.318204] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:36:12.656 pt0 00:36:12.656 [2024-07-15 23:04:57.325041] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:36:12.656 crypto0 00:36:12.656 [2024-07-15 23:04:57.333062] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:36:12.656 crypto1 00:36:12.656 23:04:57 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:12.656 23:04:57 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:36:12.656 Running I/O for 5 seconds... 00:36:17.941 00:36:17.941 Latency(us) 00:36:17.941 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:17.941 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:36:17.941 Verification LBA range: start 0x0 length 0x2000 00:36:17.941 crypto1 : 5.02 8921.86 34.85 0.00 0.00 28615.65 6496.61 17666.23 00:36:17.941 =================================================================================================================== 00:36:17.941 Total : 8921.86 34.85 0.00 0.00 28615.65 6496.61 17666.23 00:36:17.941 0 00:36:17.941 23:05:02 chaining -- bdev/chaining.sh@167 -- # killprocess 2916963 00:36:17.941 23:05:02 chaining -- common/autotest_common.sh@948 -- # '[' -z 2916963 ']' 00:36:17.941 23:05:02 chaining -- common/autotest_common.sh@952 -- # kill -0 2916963 00:36:17.941 23:05:02 chaining -- common/autotest_common.sh@953 -- # uname 00:36:17.941 23:05:02 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:17.941 23:05:02 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2916963 00:36:17.941 23:05:02 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:36:17.941 23:05:02 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:36:17.941 23:05:02 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2916963' 00:36:17.941 killing process with pid 2916963 00:36:17.941 23:05:02 chaining -- common/autotest_common.sh@967 -- # kill 2916963 00:36:17.941 Received shutdown signal, test time was about 5.000000 seconds 00:36:17.941 00:36:17.941 Latency(us) 00:36:17.941 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:17.941 
=================================================================================================================== 00:36:17.941 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:36:17.941 23:05:02 chaining -- common/autotest_common.sh@972 -- # wait 2916963 00:36:17.941 23:05:02 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:36:17.941 23:05:02 chaining -- bdev/chaining.sh@170 -- # killprocess 2916963 00:36:17.941 23:05:02 chaining -- common/autotest_common.sh@948 -- # '[' -z 2916963 ']' 00:36:17.941 23:05:02 chaining -- common/autotest_common.sh@952 -- # kill -0 2916963 00:36:17.941 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 952: kill: (2916963) - No such process 00:36:17.941 23:05:02 chaining -- common/autotest_common.sh@975 -- # echo 'Process with pid 2916963 is not found' 00:36:17.941 Process with pid 2916963 is not found 00:36:17.941 23:05:02 chaining -- bdev/chaining.sh@171 -- # wait 2916963 00:36:17.941 23:05:02 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:36:17.941 23:05:02 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:36:17.941 23:05:02 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:36:17.941 23:05:02 chaining 
-- common/autotest_common.sh@10 -- # set +x 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@296 -- # e810=() 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@297 -- # x722=() 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@298 -- # mlx=() 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:36:17.941 23:05:02 
chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@336 -- # return 1 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:36:17.941 WARNING: No supported devices were found, fallback requested for tcp test 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:36:17.941 23:05:02 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:36:17.941 23:05:02 chaining -- 
nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:36:17.942 23:05:02 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:36:17.942 23:05:02 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:36:17.942 23:05:02 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:36:17.942 23:05:02 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:36:17.942 23:05:02 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:36:17.942 23:05:02 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:36:17.942 23:05:02 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:36:18.200 Cannot find device "nvmf_tgt_br" 00:36:18.200 23:05:02 chaining -- nvmf/common.sh@155 -- # true 00:36:18.200 23:05:02 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:36:18.200 Cannot find device "nvmf_tgt_br2" 00:36:18.200 23:05:02 chaining -- nvmf/common.sh@156 -- # true 00:36:18.200 23:05:02 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:36:18.200 23:05:02 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:36:18.200 Cannot find device "nvmf_tgt_br" 00:36:18.200 23:05:02 chaining -- nvmf/common.sh@158 -- # true 00:36:18.200 23:05:02 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:36:18.200 Cannot find device "nvmf_tgt_br2" 00:36:18.200 23:05:02 chaining -- nvmf/common.sh@159 -- # true 00:36:18.200 23:05:02 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:36:18.200 23:05:02 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:36:18.200 23:05:02 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:36:18.200 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:36:18.200 23:05:02 chaining -- nvmf/common.sh@162 -- # true 00:36:18.200 23:05:02 
chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:36:18.200 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:36:18.200 23:05:02 chaining -- nvmf/common.sh@163 -- # true 00:36:18.200 23:05:02 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:36:18.200 23:05:02 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:36:18.200 23:05:03 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:36:18.200 23:05:03 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:36:18.200 23:05:03 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:36:18.200 23:05:03 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:36:18.200 23:05:03 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:36:18.200 23:05:03 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:36:18.200 23:05:03 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:36:18.200 23:05:03 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:36:18.200 23:05:03 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:36:18.200 23:05:03 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:36:18.200 23:05:03 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:36:18.200 23:05:03 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:36:18.461 23:05:03 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:36:18.461 23:05:03 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:36:18.461 23:05:03 chaining -- nvmf/common.sh@192 -- # ip link 
add nvmf_br type bridge 00:36:18.461 23:05:03 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:36:18.461 23:05:03 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:36:18.461 23:05:03 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:36:18.765 23:05:03 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:36:18.765 23:05:03 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:36:18.765 23:05:03 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:36:18.765 23:05:03 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:36:18.765 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:36:18.765 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.114 ms 00:36:18.765 00:36:18.765 --- 10.0.0.2 ping statistics --- 00:36:18.765 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:36:18.765 rtt min/avg/max/mdev = 0.114/0.114/0.114/0.000 ms 00:36:18.765 23:05:03 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:36:18.765 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:36:18.765 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.079 ms 00:36:18.765 00:36:18.765 --- 10.0.0.3 ping statistics --- 00:36:18.765 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:36:18.765 rtt min/avg/max/mdev = 0.079/0.079/0.079/0.000 ms 00:36:18.765 23:05:03 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:36:18.765 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:36:18.765 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.041 ms 00:36:18.765 00:36:18.765 --- 10.0.0.1 ping statistics --- 00:36:18.765 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:36:18.765 rtt min/avg/max/mdev = 0.041/0.041/0.041/0.000 ms 00:36:18.765 23:05:03 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:36:18.765 23:05:03 chaining -- nvmf/common.sh@433 -- # return 0 00:36:18.765 23:05:03 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:36:18.765 23:05:03 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:36:18.765 23:05:03 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:36:18.765 23:05:03 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:36:18.765 23:05:03 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:36:18.765 23:05:03 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:36:18.765 23:05:03 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:36:18.765 23:05:03 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:36:18.765 23:05:03 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:36:18.765 23:05:03 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:36:18.765 23:05:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:18.765 23:05:03 chaining -- nvmf/common.sh@481 -- # nvmfpid=2918133 00:36:18.765 23:05:03 chaining -- nvmf/common.sh@482 -- # waitforlisten 2918133 00:36:18.765 23:05:03 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:36:18.765 23:05:03 chaining -- common/autotest_common.sh@829 -- # '[' -z 2918133 ']' 00:36:18.765 23:05:03 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:36:18.765 23:05:03 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:36:18.765 23:05:03 
chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:36:18.765 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:36:18.765 23:05:03 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:36:18.765 23:05:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:18.765 [2024-07-15 23:05:03.648378] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:36:18.765 [2024-07-15 23:05:03.648451] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:36:19.024 [2024-07-15 23:05:03.791823] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:19.024 [2024-07-15 23:05:03.912747] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:36:19.024 [2024-07-15 23:05:03.912817] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:36:19.024 [2024-07-15 23:05:03.912836] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:36:19.024 [2024-07-15 23:05:03.912853] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:36:19.024 [2024-07-15 23:05:03.912867] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:36:19.024 [2024-07-15 23:05:03.912902] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:36:19.960 23:05:04 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:36:19.960 23:05:04 chaining -- common/autotest_common.sh@862 -- # return 0 00:36:19.960 23:05:04 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:36:19.960 23:05:04 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:36:19.960 23:05:04 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:19.960 23:05:04 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:36:19.960 23:05:04 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:36:19.960 23:05:04 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:19.960 23:05:04 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:19.960 malloc0 00:36:19.960 [2024-07-15 23:05:04.647305] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:36:19.960 [2024-07-15 23:05:04.663547] tcp.c: 981:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:36:19.960 23:05:04 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:19.961 23:05:04 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:36:19.961 23:05:04 chaining -- bdev/chaining.sh@189 -- # bperfpid=2918300 00:36:19.961 23:05:04 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:36:19.961 23:05:04 chaining -- bdev/chaining.sh@191 -- # waitforlisten 2918300 /var/tmp/bperf.sock 00:36:19.961 23:05:04 chaining -- common/autotest_common.sh@829 -- # '[' -z 2918300 ']' 00:36:19.961 23:05:04 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:36:19.961 23:05:04 chaining -- common/autotest_common.sh@834 -- # 
local max_retries=100 00:36:19.961 23:05:04 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:36:19.961 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:36:19.961 23:05:04 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:36:19.961 23:05:04 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:19.961 [2024-07-15 23:05:04.736750] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 00:36:19.961 [2024-07-15 23:05:04.736816] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2918300 ] 00:36:19.961 [2024-07-15 23:05:04.866985] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:20.218 [2024-07-15 23:05:04.969149] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:20.809 23:05:05 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:36:20.809 23:05:05 chaining -- common/autotest_common.sh@862 -- # return 0 00:36:20.809 23:05:05 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:36:20.809 23:05:05 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:36:21.374 [2024-07-15 23:05:06.036728] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:36:21.374 nvme0n1 00:36:21.374 true 00:36:21.374 crypto0 00:36:21.374 23:05:06 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:36:21.374 Running I/O for 5 seconds... 
00:36:26.640 00:36:26.640 Latency(us) 00:36:26.640 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:26.640 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:36:26.640 Verification LBA range: start 0x0 length 0x2000 00:36:26.640 crypto0 : 5.03 6866.55 26.82 0.00 0.00 37152.62 3903.67 27240.18 00:36:26.640 =================================================================================================================== 00:36:26.640 Total : 6866.55 26.82 0.00 0.00 37152.62 3903.67 27240.18 00:36:26.640 0 00:36:26.640 23:05:11 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:36:26.640 23:05:11 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:36:26.640 23:05:11 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:26.640 23:05:11 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:26.640 23:05:11 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:26.640 23:05:11 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:26.640 23:05:11 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:26.640 23:05:11 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:36:26.640 23:05:11 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:26.640 23:05:11 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:26.640 23:05:11 chaining -- bdev/chaining.sh@205 -- # sequence=69034 00:36:26.640 23:05:11 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:36:26.640 23:05:11 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:36:26.640 23:05:11 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:26.640 23:05:11 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:26.640 23:05:11 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:26.640 23:05:11 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:26.641 23:05:11 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:26.641 23:05:11 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:26.641 23:05:11 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:26.641 23:05:11 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:26.897 23:05:11 chaining -- bdev/chaining.sh@206 -- # encrypt=34517 00:36:26.897 23:05:11 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:36:26.897 23:05:11 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:36:26.897 23:05:11 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:26.897 23:05:11 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:26.897 23:05:11 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:26.897 23:05:11 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:26.897 23:05:11 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:26.897 23:05:11 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:26.897 23:05:11 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:26.897 23:05:11 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:27.154 23:05:11 chaining -- bdev/chaining.sh@207 -- # decrypt=34517 00:36:27.155 23:05:11 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:36:27.155 23:05:11 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:36:27.155 23:05:11 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:27.155 23:05:12 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:27.155 23:05:12 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:36:27.155 23:05:12 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:27.155 23:05:12 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:36:27.155 23:05:12 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:27.155 23:05:12 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:36:27.155 23:05:12 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:27.412 23:05:12 chaining -- bdev/chaining.sh@208 -- # crc32c=69034 00:36:27.412 23:05:12 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:36:27.412 23:05:12 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:36:27.412 23:05:12 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:36:27.412 23:05:12 chaining -- bdev/chaining.sh@214 -- # killprocess 2918300 00:36:27.412 23:05:12 chaining -- common/autotest_common.sh@948 -- # '[' -z 2918300 ']' 00:36:27.412 23:05:12 chaining -- common/autotest_common.sh@952 -- # kill -0 2918300 00:36:27.412 23:05:12 chaining -- common/autotest_common.sh@953 -- # uname 00:36:27.412 23:05:12 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:27.412 23:05:12 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2918300 00:36:27.670 23:05:12 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:36:27.670 23:05:12 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:36:27.670 23:05:12 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2918300' 00:36:27.670 killing process with pid 2918300 00:36:27.670 23:05:12 chaining -- common/autotest_common.sh@967 -- # kill 2918300 00:36:27.670 Received shutdown signal, test time was about 5.000000 seconds 00:36:27.670 00:36:27.670 Latency(us) 00:36:27.670 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:27.670 
=================================================================================================================== 00:36:27.670 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:36:27.670 23:05:12 chaining -- common/autotest_common.sh@972 -- # wait 2918300 00:36:27.670 23:05:12 chaining -- bdev/chaining.sh@219 -- # bperfpid=2919359 00:36:27.670 23:05:12 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:36:27.670 23:05:12 chaining -- bdev/chaining.sh@221 -- # waitforlisten 2919359 /var/tmp/bperf.sock 00:36:27.670 23:05:12 chaining -- common/autotest_common.sh@829 -- # '[' -z 2919359 ']' 00:36:27.670 23:05:12 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:36:27.670 23:05:12 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:36:27.670 23:05:12 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:36:27.670 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:36:27.670 23:05:12 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:36:27.670 23:05:12 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:27.928 [2024-07-15 23:05:12.609963] Starting SPDK v24.09-pre git sha1 4903ec649 / DPDK 24.03.0 initialization... 
00:36:27.928 [2024-07-15 23:05:12.610032] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2919359 ] 00:36:27.928 [2024-07-15 23:05:12.738498] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:28.187 [2024-07-15 23:05:12.844840] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:28.752 23:05:13 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:36:28.752 23:05:13 chaining -- common/autotest_common.sh@862 -- # return 0 00:36:28.752 23:05:13 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:36:28.752 23:05:13 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:36:29.318 [2024-07-15 23:05:13.937778] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:36:29.318 nvme0n1 00:36:29.318 true 00:36:29.318 crypto0 00:36:29.318 23:05:13 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:36:29.318 Running I/O for 5 seconds... 
00:36:34.592 00:36:34.592 Latency(us) 00:36:34.592 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:34.592 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:36:34.592 Verification LBA range: start 0x0 length 0x200 00:36:34.592 crypto0 : 5.01 1636.76 102.30 0.00 0.00 19173.30 1018.66 20629.59 00:36:34.592 =================================================================================================================== 00:36:34.592 Total : 1636.76 102.30 0.00 0.00 19173.30 1018.66 20629.59 00:36:34.592 0 00:36:34.592 23:05:19 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:36:34.592 23:05:19 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:36:34.592 23:05:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:34.592 23:05:19 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:36:34.592 23:05:19 chaining -- bdev/chaining.sh@39 -- # opcode= 00:36:34.592 23:05:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:34.592 23:05:19 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:36:34.592 23:05:19 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:36:34.592 23:05:19 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:36:34.592 23:05:19 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:34.592 23:05:19 chaining -- bdev/chaining.sh@233 -- # sequence=16390 00:36:34.592 23:05:19 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:36:34.592 23:05:19 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:36:34.592 23:05:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:34.592 23:05:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:34.592 23:05:19 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:36:34.592 23:05:19 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:34.592 23:05:19 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:36:34.592 23:05:19 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:34.592 23:05:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:36:34.592 23:05:19 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:34.851 23:05:19 chaining -- bdev/chaining.sh@234 -- # encrypt=8195 00:36:34.851 23:05:19 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:36:34.851 23:05:19 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:36:34.851 23:05:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:34.851 23:05:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:34.851 23:05:19 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:36:34.851 23:05:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:34.851 23:05:19 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:36:34.851 23:05:19 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:34.851 23:05:19 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:34.851 23:05:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:36:35.110 23:05:19 chaining -- bdev/chaining.sh@235 -- # decrypt=8195 00:36:35.110 23:05:19 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:36:35.110 23:05:19 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:36:35.110 23:05:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:36:35.110 23:05:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:36:35.110 23:05:19 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:36:35.110 23:05:19 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:36:35.110 23:05:19 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:36:35.110 23:05:19 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:36:35.110 23:05:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:36:35.110 23:05:19 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:36:35.369 23:05:20 chaining -- bdev/chaining.sh@236 -- # crc32c=16390 00:36:35.369 23:05:20 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:36:35.369 23:05:20 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:36:35.369 23:05:20 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:36:35.369 23:05:20 chaining -- bdev/chaining.sh@242 -- # killprocess 2919359 00:36:35.369 23:05:20 chaining -- common/autotest_common.sh@948 -- # '[' -z 2919359 ']' 00:36:35.369 23:05:20 chaining -- common/autotest_common.sh@952 -- # kill -0 2919359 00:36:35.369 23:05:20 chaining -- common/autotest_common.sh@953 -- # uname 00:36:35.369 23:05:20 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:35.369 23:05:20 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2919359 00:36:35.369 23:05:20 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:36:35.369 23:05:20 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:36:35.369 23:05:20 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2919359' 00:36:35.369 killing process with pid 2919359 00:36:35.369 23:05:20 chaining -- common/autotest_common.sh@967 -- # kill 2919359 00:36:35.369 Received shutdown signal, test time was about 5.000000 seconds 00:36:35.369 00:36:35.369 Latency(us) 00:36:35.369 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:35.370 
=================================================================================================================== 00:36:35.370 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:36:35.370 23:05:20 chaining -- common/autotest_common.sh@972 -- # wait 2919359 00:36:35.628 23:05:20 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:36:35.628 23:05:20 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:36:35.628 23:05:20 chaining -- nvmf/common.sh@117 -- # sync 00:36:35.628 23:05:20 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:36:35.628 23:05:20 chaining -- nvmf/common.sh@120 -- # set +e 00:36:35.628 23:05:20 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:36:35.628 23:05:20 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:36:35.628 rmmod nvme_tcp 00:36:35.628 rmmod nvme_fabrics 00:36:35.628 rmmod nvme_keyring 00:36:35.628 23:05:20 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:36:35.628 23:05:20 chaining -- nvmf/common.sh@124 -- # set -e 00:36:35.628 23:05:20 chaining -- nvmf/common.sh@125 -- # return 0 00:36:35.628 23:05:20 chaining -- nvmf/common.sh@489 -- # '[' -n 2918133 ']' 00:36:35.628 23:05:20 chaining -- nvmf/common.sh@490 -- # killprocess 2918133 00:36:35.628 23:05:20 chaining -- common/autotest_common.sh@948 -- # '[' -z 2918133 ']' 00:36:35.628 23:05:20 chaining -- common/autotest_common.sh@952 -- # kill -0 2918133 00:36:35.628 23:05:20 chaining -- common/autotest_common.sh@953 -- # uname 00:36:35.628 23:05:20 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:35.628 23:05:20 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2918133 00:36:35.886 23:05:20 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:36:35.886 23:05:20 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:36:35.886 23:05:20 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2918133' 00:36:35.886 killing process with pid 
2918133 00:36:35.886 23:05:20 chaining -- common/autotest_common.sh@967 -- # kill 2918133 00:36:35.886 23:05:20 chaining -- common/autotest_common.sh@972 -- # wait 2918133 00:36:36.145 23:05:20 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:36:36.145 23:05:20 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:36:36.145 23:05:20 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:36:36.145 23:05:20 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:36:36.145 23:05:20 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:36:36.145 23:05:20 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:36:36.145 23:05:20 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:36:36.145 23:05:20 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:36:36.145 23:05:20 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:36:36.145 23:05:20 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:36:36.145 00:36:36.145 real 0m47.376s 00:36:36.145 user 1m0.071s 00:36:36.145 sys 0m14.426s 00:36:36.145 23:05:20 chaining -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:36.145 23:05:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:36:36.145 ************************************ 00:36:36.145 END TEST chaining 00:36:36.145 ************************************ 00:36:36.145 23:05:20 -- common/autotest_common.sh@1142 -- # return 0 00:36:36.145 23:05:20 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:36:36.145 23:05:20 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:36:36.145 23:05:20 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:36:36.145 23:05:20 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:36:36.145 23:05:20 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT 00:36:36.145 23:05:20 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup 00:36:36.145 23:05:20 -- common/autotest_common.sh@722 -- # xtrace_disable 00:36:36.145 
23:05:20 -- common/autotest_common.sh@10 -- # set +x 00:36:36.145 23:05:20 -- spdk/autotest.sh@383 -- # autotest_cleanup 00:36:36.145 23:05:20 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:36:36.145 23:05:20 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:36:36.145 23:05:20 -- common/autotest_common.sh@10 -- # set +x 00:36:41.416 INFO: APP EXITING 00:36:41.416 INFO: killing all VMs 00:36:41.416 INFO: killing vhost app 00:36:41.416 INFO: EXIT DONE 00:36:44.734 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:36:44.734 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:36:44.734 Waiting for block devices as requested 00:36:44.734 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:36:44.734 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:36:44.993 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:36:44.993 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:36:44.993 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:36:45.251 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:36:45.251 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:36:45.251 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:36:45.510 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:36:45.510 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:36:45.510 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:36:45.770 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:36:45.770 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:36:45.770 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:36:46.029 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:36:46.029 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:36:46.029 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:36:50.221 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:36:50.221 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:36:50.221 Cleaning 00:36:50.221 Removing: /var/run/dpdk/spdk0/config 00:36:50.221 Removing: 
/var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:36:50.221 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:36:50.221 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:36:50.221 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:36:50.221 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:36:50.221 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:36:50.221 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:36:50.221 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:36:50.221 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:36:50.221 Removing: /var/run/dpdk/spdk0/hugepage_info 00:36:50.221 Removing: /dev/shm/nvmf_trace.0 00:36:50.221 Removing: /dev/shm/spdk_tgt_trace.pid2656311 00:36:50.221 Removing: /var/run/dpdk/spdk0 00:36:50.221 Removing: /var/run/dpdk/spdk_pid2655370 00:36:50.221 Removing: /var/run/dpdk/spdk_pid2656311 00:36:50.221 Removing: /var/run/dpdk/spdk_pid2656846 00:36:50.221 Removing: /var/run/dpdk/spdk_pid2657582 00:36:50.221 Removing: /var/run/dpdk/spdk_pid2657769 00:36:50.221 Removing: /var/run/dpdk/spdk_pid2658557 00:36:50.221 Removing: /var/run/dpdk/spdk_pid2658708 00:36:50.221 Removing: /var/run/dpdk/spdk_pid2658996 00:36:50.221 Removing: /var/run/dpdk/spdk_pid2661649 00:36:50.221 Removing: /var/run/dpdk/spdk_pid2663612 00:36:50.221 Removing: /var/run/dpdk/spdk_pid2663846 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2664148 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2664492 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2664731 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2664927 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2665122 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2665388 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2666094 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2668813 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2669048 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2669404 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2669627 00:36:50.479 Removing: 
/var/run/dpdk/spdk_pid2669658 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2669884 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2670077 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2670281 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2670489 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2670813 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2671025 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2671229 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2671423 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2671624 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2671845 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2672159 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2672380 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2672575 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2672774 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2672970 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2673267 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2673524 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2673725 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2673930 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2674121 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2674338 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2674680 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2674993 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2675251 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2675615 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2675986 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2676208 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2676560 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2676930 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2676999 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2677420 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2677888 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2678223 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2678301 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2682426 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2684123 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2685813 
00:36:50.479 Removing: /var/run/dpdk/spdk_pid2686631 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2687779 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2688147 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2688170 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2688201 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2692583 00:36:50.479 Removing: /var/run/dpdk/spdk_pid2693042 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2694083 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2694298 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2699800 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2701947 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2702800 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2707005 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2709059 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2710130 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2714377 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2717313 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2718295 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2728013 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2730228 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2731378 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2741275 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2743994 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2744978 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2754851 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2758121 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2759135 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2771371 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2773993 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2775122 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2786710 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2789185 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2790292 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2802023 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2806164 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2807224 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2808371 00:36:50.738 Removing: 
/var/run/dpdk/spdk_pid2811444 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2816589 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2819211 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2824407 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2827671 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2833194 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2835859 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2842027 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2844290 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2851615 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2853878 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2860423 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2862915 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2867073 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2867427 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2867781 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2868147 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2868710 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2869482 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2870203 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2870641 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2872407 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2874052 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2876127 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2877438 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2879198 00:36:50.738 Removing: /var/run/dpdk/spdk_pid2880802 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2882443 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2883734 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2884406 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2884780 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2886960 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2888817 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2890661 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2891720 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2892819 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2893413 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2893521 
00:36:50.997 Removing: /var/run/dpdk/spdk_pid2893594 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2893957 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2893984 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2895217 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2896725 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2898225 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2899234 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2900327 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2900528 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2900674 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2900740 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2901679 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2902211 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2902588 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2904721 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2906525 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2908301 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2909364 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2910596 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2911135 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2911313 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2915235 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2915442 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2915639 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2915784 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2915897 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2916133 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2916963 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2918300 00:36:50.997 Removing: /var/run/dpdk/spdk_pid2919359 00:36:50.997 Clean 00:36:51.257 23:05:35 -- common/autotest_common.sh@1451 -- # return 0 00:36:51.257 23:05:35 -- spdk/autotest.sh@384 -- # timing_exit post_cleanup 00:36:51.257 23:05:35 -- common/autotest_common.sh@728 -- # xtrace_disable 00:36:51.257 23:05:35 -- common/autotest_common.sh@10 -- # set +x 00:36:51.257 23:05:35 -- spdk/autotest.sh@386 -- # timing_exit autotest 00:36:51.257 23:05:35 -- 
common/autotest_common.sh@728 -- # xtrace_disable 00:36:51.257 23:05:35 -- common/autotest_common.sh@10 -- # set +x 00:36:51.257 23:05:36 -- spdk/autotest.sh@387 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:36:51.257 23:05:36 -- spdk/autotest.sh@389 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]] 00:36:51.257 23:05:36 -- spdk/autotest.sh@389 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log 00:36:51.257 23:05:36 -- spdk/autotest.sh@391 -- # hash lcov 00:36:51.257 23:05:36 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:36:51.257 23:05:36 -- spdk/autotest.sh@393 -- # hostname 00:36:51.257 23:05:36 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-50 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info 00:36:51.516 geninfo: WARNING: invalid characters removed from testname! 
00:37:23.600 23:06:03 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:37:23.600 23:06:07 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:37:24.978 23:06:09 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:37:27.513 23:06:12 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:37:30.048 23:06:14 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:37:33.335 23:06:17 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:37:35.863 23:06:20 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:37:35.863 23:06:20 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:37:35.863 23:06:20 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:37:35.863 23:06:20 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:37:35.863 23:06:20 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:37:35.863 23:06:20 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:35.863 23:06:20 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:35.863 23:06:20 -- paths/export.sh@4 -- $ 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:35.863 23:06:20 -- paths/export.sh@5 -- $ export PATH 00:37:35.863 23:06:20 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:35.863 23:06:20 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:37:35.863 23:06:20 -- common/autobuild_common.sh@444 -- $ date +%s 00:37:35.863 23:06:20 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1721077580.XXXXXX 00:37:35.863 23:06:20 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1721077580.ykBKmi 00:37:35.863 23:06:20 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]] 00:37:35.863 23:06:20 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']' 00:37:35.863 23:06:20 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:37:35.863 23:06:20 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:37:35.863 23:06:20 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude 
/var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:37:35.863 23:06:20 -- common/autobuild_common.sh@460 -- $ get_config_params 00:37:35.863 23:06:20 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:37:35.863 23:06:20 -- common/autotest_common.sh@10 -- $ set +x 00:37:35.863 23:06:20 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:37:35.863 23:06:20 -- common/autobuild_common.sh@462 -- $ start_monitor_resources 00:37:35.863 23:06:20 -- pm/common@17 -- $ local monitor 00:37:35.863 23:06:20 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:35.863 23:06:20 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:35.863 23:06:20 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:35.863 23:06:20 -- pm/common@21 -- $ date +%s 00:37:35.863 23:06:20 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:35.863 23:06:20 -- pm/common@21 -- $ date +%s 00:37:35.863 23:06:20 -- pm/common@25 -- $ sleep 1 00:37:35.863 23:06:20 -- pm/common@21 -- $ date +%s 00:37:35.863 23:06:20 -- pm/common@21 -- $ date +%s 00:37:35.863 23:06:20 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721077580 00:37:35.863 23:06:20 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721077580 00:37:35.863 23:06:20 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l 
-p monitor.autopackage.sh.1721077580 00:37:35.863 23:06:20 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721077580 00:37:35.863 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721077580_collect-vmstat.pm.log 00:37:35.863 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721077580_collect-cpu-load.pm.log 00:37:35.863 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721077580_collect-cpu-temp.pm.log 00:37:35.863 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721077580_collect-bmc-pm.bmc.pm.log 00:37:36.430 23:06:21 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT 00:37:36.430 23:06:21 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j72 00:37:36.430 23:06:21 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:37:36.430 23:06:21 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:37:36.430 23:06:21 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:37:36.430 23:06:21 -- spdk/autopackage.sh@19 -- $ timing_finish 00:37:36.430 23:06:21 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:37:36.430 23:06:21 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:37:36.430 23:06:21 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:37:36.688 23:06:21 -- spdk/autopackage.sh@20 -- $ exit 0 00:37:36.688 23:06:21 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 00:37:36.688 23:06:21 -- pm/common@29 -- $ signal_monitor_resources TERM 
00:37:36.688 23:06:21 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:37:36.688 23:06:21 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:36.688 23:06:21 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:37:36.688 23:06:21 -- pm/common@44 -- $ pid=2930673 00:37:36.688 23:06:21 -- pm/common@50 -- $ kill -TERM 2930673 00:37:36.688 23:06:21 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:36.688 23:06:21 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:37:36.688 23:06:21 -- pm/common@44 -- $ pid=2930674 00:37:36.688 23:06:21 -- pm/common@50 -- $ kill -TERM 2930674 00:37:36.688 23:06:21 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:36.688 23:06:21 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:37:36.688 23:06:21 -- pm/common@44 -- $ pid=2930676 00:37:36.688 23:06:21 -- pm/common@50 -- $ kill -TERM 2930676 00:37:36.688 23:06:21 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:37:36.688 23:06:21 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:37:36.688 23:06:21 -- pm/common@44 -- $ pid=2930704 00:37:36.688 23:06:21 -- pm/common@50 -- $ sudo -E kill -TERM 2930704 00:37:36.688 + [[ -n 2533995 ]] 00:37:36.688 + sudo kill 2533995 00:37:36.696 [Pipeline] } 00:37:36.709 [Pipeline] // stage 00:37:36.713 [Pipeline] } 00:37:36.727 [Pipeline] // timeout 00:37:36.731 [Pipeline] } 00:37:36.742 [Pipeline] // catchError 00:37:36.746 [Pipeline] } 00:37:36.763 [Pipeline] // wrap 00:37:36.767 [Pipeline] } 00:37:36.778 [Pipeline] // catchError 00:37:36.786 [Pipeline] stage 00:37:36.788 [Pipeline] { (Epilogue) 00:37:36.803 [Pipeline] catchError 00:37:36.804 [Pipeline] { 00:37:36.819 [Pipeline] echo 00:37:36.821 Cleanup processes 
00:37:36.827 [Pipeline] sh 00:37:37.109 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:37:37.109 2930780 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache 00:37:37.109 2930996 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:37:37.124 [Pipeline] sh 00:37:37.408 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:37:37.408 ++ grep -v 'sudo pgrep' 00:37:37.408 ++ awk '{print $1}' 00:37:37.408 + sudo kill -9 2930780 00:37:37.421 [Pipeline] sh 00:37:37.702 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:37:52.600 [Pipeline] sh 00:37:52.944 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:37:52.944 Artifacts sizes are good 00:37:52.961 [Pipeline] archiveArtifacts 00:37:52.969 Archiving artifacts 00:37:53.269 [Pipeline] sh 00:37:53.553 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest 00:37:53.568 [Pipeline] cleanWs 00:37:53.578 [WS-CLEANUP] Deleting project workspace... 00:37:53.578 [WS-CLEANUP] Deferred wipeout is used... 00:37:53.585 [WS-CLEANUP] done 00:37:53.587 [Pipeline] } 00:37:53.605 [Pipeline] // catchError 00:37:53.616 [Pipeline] sh 00:37:53.898 + logger -p user.info -t JENKINS-CI 00:37:53.907 [Pipeline] } 00:37:53.923 [Pipeline] // stage 00:37:53.928 [Pipeline] } 00:37:53.943 [Pipeline] // node 00:37:53.947 [Pipeline] End of Pipeline 00:37:54.085 Finished: SUCCESS